Hi Bravo,
I believe there will be a growing trend towards robotics and that Brainchip will play a part.
If we do not play a part in the growing robotics industry (defense and civilian/industrial), then there is a huge problem with our tech.
Megachips:
" This section describes the construction of the proposed framework shown in Fig. 2. We utilized a desktop PC equipped with a GPU (Nvidia RTX3090) for updating the policies and an
Akida Neural Processor SoC as a neurochip [9, 12]. The robot was controlled by the policies implemented in the neurochip.
SNNs were implemented to the neurochip by a conversion executed by the MetaTF of Akida that converts the software [9, 12]. Samples were collected by the SNN policies in both the simulation tasks and the real-robot tasks since the target task is neurochip-driven robot control."
The above is from a paper written jointly by the Nara Institute of Science and Technology and Megachips, published on 23rd August 2024. Both Nara and Megachips are domiciled in Japan.
Given that Megachips began demonstrating robots in its showroom in September 2025, it would be impossible for them to have rejected AKIDA after August 2024, sourced a new chip, tested and trialled it, and demonstrated it just 13 months later.
The paper demonstrates the success with AKIDA through three types of evidence:
(1) the robot could be controlled entirely by the Akida neurochip,
(2) the SNN policies ran correctly under Akida's hardware constraints, and
(3) the system worked in both simulation and real-robot tasks.
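For anyone curious what that MetaTF conversion step actually involves, here is a rough Python sketch of the flow the paper describes: train a policy in Keras on the GPU host, quantize it, convert it with cnn2snn, and map it onto the Akida chip that drives the robot. The toy network, bit widths and input shapes below are my own placeholder assumptions, not details from the paper, and exact function names and arguments vary between MetaTF releases.

import numpy as np
from tensorflow import keras
import cnn2snn   # BrainChip MetaTF conversion package
import akida     # Akida runtime / device access

# Toy stand-in for the trained policy network (observations -> action values).
policy = keras.Sequential([
    keras.layers.Input(shape=(24,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(4),
])

# 1) Quantize the trained float model (4-bit weights/activations here is an
#    assumption; the paper does not state the bit widths used).
quantized = cnn2snn.quantize(policy,
                             input_weight_quantization=8,
                             weight_quantization=4,
                             activ_quantization=4)

# 2) Convert the quantized Keras model into an Akida (SNN) model.
akida_policy = cnn2snn.convert(quantized)

# 3) Map onto a physical Akida device if one is attached; otherwise the model
#    runs in the software simulator bundled with the akida package.
devices = akida.devices()
if devices:
    akida_policy.map(devices[0])

# 4) Run the policy on an observation. On the robot this is the inference
#    step executing on the neurochip itself. Akida inputs are typically
#    uint8 and 4-D; adjust the shape to whatever convert() produced.
obs = np.zeros((1, 1, 1, 24), dtype=np.uint8)
action_values = akida_policy.forward(obs)
print(action_values.shape)

The point of the sketch is simply that the trained policy ends up executing on the Akida silicon itself, which is exactly what evidence point (1) above is claiming.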
Brainchip is also connected to Robotics via Chelpis-Mirle
BrainChip teams with Chelpis and Mirle to deliver next-generation security solutions powered by low-power, adaptive edge AI capabilities.
brainchip.com
" The M.2 card is based on a design from BrainChip along with an agreement to purchase a significant number of AKD1000 chips for qualification and deployment. Upon completion of this phase,
Chelpis is planning to increase its commitment with additional orders for the AKD1000. "
Arquimea
Brainchip's Episode 40 interview with Arquimea reveals that water safety was just the start. They are using AKIDA for robotics (defense and security as well) - it's quite clear from the interview. They were one of the first of our clients to run Gen2 on AKIDA Cloud. No doubt they will be interested in getting their hands on the AKIDA 2500 chip when it becomes available.
A lot of robotics talk, especially towards the end.
From the 22.5 minute mark:
22.50 Several milestones for Arquimea and robotics (with neuromorphic):
- Combine several different sensors using neuromorphic compute.
- Improve performance by benchmarking current tech and surpassing it with a new paradigm.
- From the above, the most important is to significantly increase the agility and efficiency of the system.
- Thinking about a mobile robot with multimodal solutions integrated with sensors, potentially developing their own sensors.
- Demonstrate it to prove the agility and efficiency.
Steve Brightfield Interview:
The below is the text from an interview with Steve Brightfield in late 2025. He talks about the great results Raytheon/AKIDA/US AFRL are achieving with radar, how it's applicable to robotics, and that they are talking to robotics companies.
"And I would imagine research is ongoing. What's around the corner or over the horizon?
One of the things that was interesting is we got a contract with Air Force research libraries to work on radar using these algorithms, right? And the results actually surprised us and they surprised the contracting agency and now we're expanding that.
And we think that we can, you know, add capabilities to radar that weren't there before.
Like, for example, radar can detect things, right? But it can't tell you what it is. Well, we can classify objects now with radar in addition to detecting them.
We can improve the tracking and the latency of these radars. But we can also make them a lot smaller, right? So it's that size weight and power.
Can I put a radar in a robot? So when it's hand has got a radar signal in it and it can basically navigate, you can paint the scene without a camera. You can use it like a camera to paint the scene and recognize and grasp things that a drone. You can fly it inside tunnels or buildings indoors. You can map out where you're going. We see this shrinking of the conventional radar technologies to really go into anything moving because it's all whether it works in the dark. And if it can replicate some of the things in vision, then, you know, you don't have to worry about rain and fog and all the issues that visual, you know, control of robots.
Yeah. And are you working with robotic companies or is this still in the research room? It's still in the research. We're working with companies that are creating components or solutions that go to the robotics companies. We are in active conversations with robotic companies today. And they're in evaluation of this, right? But what we decided was to create reference platforms that demonstrate these more holy rather than having a, you know, here's the algorithm go figured out. We'll build a little prototype. So we're doing reference designs and radar. We're also going to do this in these wearables."
My bold above.
What Steve said " But what we decided was to create reference platforms that demonstrate these more holy rather than having a, you know, here's the algorithm go figured out." is interesting.
It basically means that robotics companies don't want to be handed a raw algorithm and told to "figure it out themselves."
Instead, those companies want complete, working examples (hardware + software + demo applications) that show how Akida can be used in real products. Obviously it's in the evaluation stage, but we are doing this.
He is explaining a strategic shift in how Brainchip supports robotics and likely other industries.
Steve is talking about supplying the component makers that build robots. The supply is 'the works', not just a chip and an algorithm.