"2. BrainChip — BrainChip is a neuromorphic computing company that has developed the Akida chip, which supports both SNNs and convolutional neural networks (CNNs). The Akida chip is designed for edge AI applications and features a highly efficient architecture that allows it to perform tasks like embedded vision and audio processing while consuming minimal power."

Are we not an IP company? Those companies produce chips. Maybe that's why we're not on the list!
Couldn't find this 2024 conference paper posted here before, but maybe my search didn't capture it.
Anyway, it was a positive presentation at an IEEE conference, and below are a couple of snips. As they advise: "This is an accepted manuscript version of a paper before final publisher editing and formatting. Archived with thanks to IEEE."
HERE
As the authors acknowledge:
"This work has been partially supported by King Abdullah University of Science and Technology CRG program under grant number: URF/1/4704-01-01.
We would like also to thank Edge Impulse and Brainchip companies for providing us with the software tools and hardware platform used during this work."
One of the authors, M. E. Fouda, caught my eye, given his affiliation "3" (Rain Neuromorphics), maybe his employer... hmmmm.
D. A. Silva¹, A. Shymyrbay¹, K. Smagulova¹, A. Elsheikh², M. E. Fouda³,† and A. M. Eltawil¹
¹ Department of ECE, CEMSE Division, King Abdullah University of Science and Technology, Thuwal 23955, Saudi Arabia
² Department of Mathematics and Engineering Physics, Faculty of Engineering, Cairo University, Giza 12613, Egypt
³ Rain Neuromorphics, Inc., San Francisco, CA, 94110, USA
† Email: foudam@uci.edu
End-to-End Edge Neuromorphic Object Detection System
Abstract—Neuromorphic accelerators are emerging as a potential solution to the growing power demands of Artificial Intelligence (AI) applications. Spiking Neural Networks (SNNs), which are bio-inspired architectures, are being considered as a way to address this issue. Neuromorphic cameras, which operate on a similar principle, have also been developed, offering low power consumption, microsecond latency, and robustness in various lighting conditions. This work presents a full neuromorphic system for Computer Vision, from the camera to the processing hardware, with a focus on object detection. The system was evaluated on a compiled real-world dataset and a new synthetic dataset generated from existing videos, and it demonstrated good performance in both cases. The system was able to make accurate predictions while consuming 66 mW, with a sparsity of 83% and a time response of 138 ms.
VI. CONCLUSION AND FUTURE WORK
This work showed a low-power and real-time latency full spiking neuromorphic system for object detection based on IniVation's DVXplorer Lite event-based camera and Brainchip's Akida AKD1000 spiking platform. The system was evaluated on three different datasets, comprising real-world and synthetic samples. The final mapped model achieved mAPs of 28.58 for the GEN1 dataset, equivalent to 54% of a more complex state-of-the-art model and 89% of the performance detection from the best-reported result for the single-class dataset PEDRo, having 17x fewer parameters. A power consumption of 66 mW and a latency of 138.88 ms were reported, being suitable for real-time edge applications.
For future works, different models are expected to be adapted to the Akida platform, from which more recent releases of the YOLO family can be implemented. Moreover, it is expected to evaluate those models in real-world scenarios instead of recordings, as well as to acquire more data to evaluate this setup under different challenging situations.
I fail to see why MB would choose Akida for a successful POC with the EQXX and then move on to test other NPs afterwards. We know they had Loihi in their hands before Akida was chosen for the EQXX anyway. It would just be illogical for them to take backward steps, particularly as the space industry and others have independently pointed out in their testing that Akida is the first choice, simply because it's the only truly commercially available and fully supported NP currently on the market. AIMO.

A fellow forum user who in recent months repeatedly referred to his brief LinkedIn exchange with Mercedes-Benz Chief Software Officer Magnus Östberg (and thereby willingly revealed his identity to all of us here on TSE, which in turn means I'm not guilty of spilling a secret with this post that should have been kept private) asked Mercedes-Benz a question in the comment section underneath the company's latest LinkedIn post on neuromorphic computing. This time, however, he decided not to share the carmaker's reply with all of us here on TSE. You gotta wonder why.
Could it possibly have to do with the fact that MB's reply refutes the hypothesis he had been advancing for months, namely that Mercedes-Benz, who have been heavily promoting their future SDV (software-defined vehicle) approach that gives them the option of OTA (over-the-air) updates, would "more than likely" have used Akida 2.0/TENNs simulation software in the upcoming MB.OS release as an interim solution during ongoing development, until the not-yet-existing Akida 2.0 silicon became available at a later stage? The underlying reason being competitive pressure to be first to market…
The way I see it, the January 29 reply by MB clearly puts this speculation to bed:
Neuromorphic Computing | Mercedes-Benz AG | 21 comments
"The intelligent vehicle functionalities of the future call for pioneering new algorithms and hardware. That's why Mercedes-Benz is researching artificial neural networks that could radically increase the speed and energy efficiency of complex AI computations. This could revolutionise..." (www.linkedin.com)
Does that sound as if an MB.OS “Akida inside” reveal at the upcoming world premiere of the CLA were on the cards?
Setting aside the questions
a) about any requirements for testing and certification of car parts making up the infotainment system (being used to German/EU bureaucracy, I find it hard to believe there wouldn't be any at all; maybe someone knowledgeable about automotive regulations in Germany and the EU could comment on this) and
b) whether any new MB model containing our tech could roll off the production line despite no prior IP licence deal having been signed (or at least an Akida 1.0 chip sales deal; there has never been a joint development announcement either, which could possibly somehow circumvent the necessity of an upfront payment showing up in our financials)…
… various MB statements in recent months have diminished the likelihood of neuromorphic tech making its debut in any soon-to-be-released Mercedes-Benz models (cf. Dominik Blum's presentation at HKA Karlsruhe I shared in October: https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-439352 — HKA being the university of applied sciences they have since confirmed to be cooperating with on neuromorphic camera research; journalists quoting MB engineers after visiting their Future Technologies Lab; as well as relevant posts and comments on LinkedIn).
If NC were to enhance voice control and infotainment functions in their production vehicles much sooner than safety-critical (ADAS) ones, MB would surely have clarified this in their reply to the above LinkedIn question, which specifically referred to the soon-to-be-released CLA, the first model to come with the next-generation MB.OS and its new AI-powered MBUX Virtual Assistant (developed in collaboration with Google).
Instead, they literally wrote:
“(…) To fully leverage the potential of neuromorphic processes, specialised hardware architectures that efficiently mimic biologically inspired systems are required (…) we’re currently looking into Neuromorphic Computing as part of a research project. Depending on the further development progress, integration could become possible within a timeframe of 5 to 10 years.”
They are evidently exploring full-scale integration to maximise the benefits of energy efficiency, latency and privacy. The voice-control implementation of Akida in the Vision EQXX was their initial proof of concept to demonstrate the feasibility of NC in general (cf. the podcast with Steven Peters, MB's Head of AI Research from 2016 to 2022: https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-407798). Whether they'll eventually partner with us or a competitor (provided they are happy with their research project's results) remains to be seen.
So I certainly do not expect the soon-to-be-revealed CLA 2025 with the all-new MB.OS to have "Akida inside", although I'd be more than happy to be proven wrong, as we'd all love to see the BRN share price soar on this news…
Time - and the financials - will ultimately tell.
Hopefully, researchers in Jörg Conradt's Neuro Computing Systems lab, which moved from Munich (TUM) to Stockholm (KTH), will give Akida another chance one of these days (after the not overly glorious assessment by two KTH Master's students in their degree project Neuromorphic Medical Image Analysis at the Edge, which was shared here before: https://www.diva-portal.org/smash/get/diva2:1779206/FULLTEXT01.pdf), trusting the positive feedback of two more advanced researchers Jörg Conradt knows well, who have (or soon will have) first-hand experience with AKD1000:
When he was still at TUM, Jörg Conradt was the PhD supervisor of Cristian Axenie (now head of the SPICES lab at TH Nürnberg, whose team came runner-up in the 2023 tinyML Pedestrian Detection Hackathon utilising Akida) and co-authored a number of papers with him. Now in Stockholm, he is the PhD supervisor of Jens Egholm Pedersen, one of the co-organisers of the topic area Neuromorphic systems for space applications at the upcoming Telluride Neuromorphic Workshop, which will provide participants with neuromorphic hardware, including Akida. (I'd venture a guess that the name Jens on the slide refers to him.)
The headline says top 10 edge AI chips. The article continues…

Are we not an IP company? Those companies produce chips. Maybe that's why we're not on the list!
Hi jtardif, I fail to see why MB would choose Akida for a successful POC with the EQXX and then move on to test other NPs afterwards. We know they had Loihi in their hands before Akida was chosen for the EQXX anyway. It would just be illogical for them to take backward steps, particularly as the space industry and others have independently pointed out in their testing that Akida is the first choice, simply because it's the only truly commercially available and fully supported NP currently on the market. AIMO.
Tony Lewis posted that DeepSeek is a boon for us.
He made a lot of points.
The post was shared here earlier.
I can't remember the details, but it was very positive.