A video going along with that paper was uploaded to YouTube yesterday:
Both paper and video relate to another paper and video published by the same Uni Tübingen authors earlier this year. At a cursory glance, at least the videos (posted about six months apart) appear to be VERY similar:
https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-416900
View attachment 70372
View attachment 70373
Now compare the slides to those in the video uploaded October 3:
View attachment 70368
View attachment 70369
View attachment 70370
In fact, when I tried to cursorily compare the new paper to the March 15 paper that @Fullmoonfever had linked at the time (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-416313), I discovered that the link he had posted then now connects directly to this new paper, published on September 16. It thus seems to be an updated version of the previous paper.
I did notice the addition of another co-author, though: Sebastian Otte, who used to be a PhD student and postdoc at Uni Tübingen (2013-2023) and became Professor at Uni Lübeck’s Institute for Robotics and Cognitive Systems just over a year ago, where he heads the Adaptive AI research group.
To put into perspective the finding that our competitors’ neuromorphic offerings fared worse than Akida in the benchmarking tests:
In all fairness, it should be highlighted that Akida’s advantage was at least partly due to the fact that the AKD1000 is available as a PCIe board, whereas SynSense’s DynapCNN was connected to the PC via USB and, as the excerpt Gazzafish already posted shows, the researchers did not have direct access to a Loihi 2 edge device at all, only to a virtual machine provided by Intel via their Neuromorphic Research Cloud. The benchmark would obviously yield more comparable results if all the hardware under test had a similar form factor:
“Our results show that the better a neuromorphic edge device is connected to the main compute unit, e.g., as a PCIe card, the better the overall run-time.”
Anyway, Akida undoubtedly impressed the researchers, and as a result they are considering further experiments: “(…) future work could involve evaluating the system with an additional Akida PCIe card.”
View attachment 70374
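The interface effect the researchers describe can be illustrated with a back-of-the-envelope model. The sketch below is purely illustrative: the function and all bandwidth/latency figures are my own rough assumptions, not numbers from the paper, but they show how a slow host link can dwarf the inference time itself:

```python
# Illustrative sketch (assumed figures, not measured data): why the host link
# can dominate end-to-end latency when benchmarking neuromorphic edge devices.

def end_to_end_ms(payload_bytes, link_latency_ms, link_gbps, inference_ms):
    """Transfer the input to the device, run inference, transfer the result back."""
    # link_gbps * 1e6 converts Gbit/s into bits per millisecond.
    transfer_ms = 2 * (link_latency_ms + payload_bytes * 8 / (link_gbps * 1e6))
    return transfer_ms + inference_ms

frame = 100_000  # ~100 kB input frame, hypothetical

pcie = end_to_end_ms(frame, link_latency_ms=0.01, link_gbps=8.0, inference_ms=1.0)
usb = end_to_end_ms(frame, link_latency_ms=0.5, link_gbps=0.48, inference_ms=1.0)

print(f"PCIe-attached: {pcie:.2f} ms, USB-attached: {usb:.2f} ms")
# → PCIe-attached: 1.22 ms, USB-attached: 5.33 ms
```

With these assumed figures, the USB-attached device spends most of its wall-clock time moving data rather than computing, which matches the researchers’ observation that a PCIe-attached accelerator yields a better overall run-time.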
In an earlier post (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-426404), I had already mentioned that the paper’s first author, Andreas Ziegler, who is doing a PhD in robotics and computer vision at Uni Tübingen, has meanwhile completed his internship at Sony AI in Switzerland (that - as we know - partially funded the paper’s research):
View attachment 70375
Fun fact: one of his co-authors, Karl Vetter, is no longer with Uni Tübingen’s Cognitive Systems Lab, but has since moved to France, where he has been working as a research engineer for…
Neurobus for the past three months!
It’s a small world, isn’t it?!
View attachment 70376
View attachment 70377
Maybe he wasn’t talking about deals but deals

And another week goes by without a contract announcement.
2 months left for Sean to live up to his AGM statement of having deals done by the end of the year.
Tick, tock.
Hi @Bravo,
the statement by Mercedes-Benz that Sally Ward-Foxton quoted in said September 2022 article was plucked straight from the 3 January 2022 VISION EQXX press release (https://media.mbusa.com/releases/re...range-and-efficiency-to-an-entirely-new-level), so it is now almost three years old.
View attachment 71758
However, that press release is nowhere to be found on the official webpage dedicated to the VISION EQXX (https://group.mercedes-benz.com/innovation/product-innovation/technology/vision-eqxx.html) - surprisingly, there is no reference to neuromorphic computing at all on that page.
This equally holds true for the German version of the VISION EQXX webpage: https://group.mercedes-benz.com/innovation/produktinnovation/technologie/vision-eqxx.html
In fact, there hasn’t been any reference whatsoever to neuromorphic computing on that webpage since April 4, 2022, as I was able to establish thanks to the Wayback Machine (https://web.archive.org - a cool internet archive that I stumbled upon the other day, which allows you to “Explore more than 916 billion web pages saved over time”).
Going back to the German version of that webpage, I can see that on 1 April 2022, MB still mentioned “Elemente der Benutzeroberfläche unterstützen die nahtlose Interaktion zwischen Fahrer und Fahrzeug. Unter anderem durch Künstliche Intelligenz (KI), die die Funktionsweise des menschlichen Gehirns nachahmt.” (“Elements of the user interface support seamless interaction between driver and vehicle. This includes Artificial Intelligence (AI) which mimics the way the human brain works.”)
The webpage’s content was soon after replaced with a new text dated 4 April 2022 that no longer referred to brain-inspired/neuromorphic AI whatsoever. It has since been updated with links to articles about the VISION EQXX’s second and third long-distance road trips of over 1,000 km in June 2022 (Stuttgart to Silverstone) and March 2024 (Riyadh to Dubai).
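For anyone who wants to reproduce this kind of check, the Wayback Machine also exposes a public availability API. A minimal sketch (the endpoint and JSON field names follow the archive.org API documentation; the helper function name is my own):

```python
# Sketch: find the Wayback Machine snapshot closest to a given date using
# the documented availability API at archive.org.
import urllib.parse

def wayback_query(url: str, timestamp: str) -> str:
    """Build an availability-API request URL (timestamp format: YYYYMMDD)."""
    qs = urllib.parse.urlencode({"url": url, "timestamp": timestamp})
    return "https://archive.org/wayback/available?" + qs

query = wayback_query(
    "https://group.mercedes-benz.com/innovation/produktinnovation/technologie/vision-eqxx.html",
    "20220401",
)
# Uncomment to fetch the closest snapshot (network access required):
# import json, urllib.request
# with urllib.request.urlopen(query) as resp:
#     print(json.load(resp)["archived_snapshots"]["closest"]["url"])
```

The returned JSON contains an `archived_snapshots.closest` entry whose `url` points at the archived copy nearest to the requested date.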
It is anyone’s guess why MB decided to no longer mention that the keyword spotting in their VISION EQXX concept car had been exceptionally energy-efficient due to it having been implemented on a neuromorphic chip (let alone on which one specifically), although they obviously continue to take great interest in this disruptive tech.
Did they possibly come to realise that it would take much longer to implement neuromorphic technology at scale than originally envisioned, whether for technical and/or legal reasons (automotive-grade ISO certification etc.)?
Did they at the time possibly not foresee the growing number of competitors in the neuromorphic space besides BrainChip and Intel that could equally be of interest to them and which they would now first like to explore in depth before making any far-reaching decisions?
I also happened to notice that the reference to the VISION EQXX on https://brainchip.com/markets has been deleted. Thanks to the Wayback Machine, we can tell that this must have happened sometime between mid-July and August 25.
The question is: Why was that reference (consisting of a picture of the MB concept car that we know utilised Akida as well as the relevant press release excerpt) taken down from the BrainChip website?
Doesn’t this strike you as odd, despite the Mercedes logo still being displayed on our landing page under “YOU’RE IN GOOD COMPANY”?
View attachment 71814
View attachment 71815
And how about the other points I raised in previous posts, such as
- the word potential in “positioning for a potential project collaboration with Mercedes” showing up in a 2023 BrainChip summer intern’s CV, which - as I already argued in January - suggested to me that Mercedes must have been weighing their options and were evaluating more than one neuromorphic processor last year? (Well, I feel vindicated, since they certainly were, as evidenced by MB’s recent announcement of research collaborations with Intel and other consortium members of the NAOMI4Radar project (based on Loihi 2) on the one hand, and with the University of Waterloo on the other, where the research will be led by Chris Eliasmith, co-founder and CTO of Applied Brain Research. That company recently released the TSP1, “a single-chip solution for time series inference applications like real-time speech recognition (including keyword spotting), realistic text-to-speech synthesis, natural language control interfaces and other advanced sensor fusion applications.”) https://www.appliedbrainresearch.co...lution-for-full-vocabulary-speech-recognition
- the June 2024 MB job listing for a “Working student position in the field of Machine Learning & Neuromorphic Computing from August 2024” that mentioned “working with new chip technologies” and “deployment on neuromorphic chips”?
- the below comment by Magnus Östberg (“We are looking at all suitable solutions!”) after a BRN shareholder had expressed his hope that MB would be implementing BrainChip technology into their vehicles soon?
View attachment 71816
- the fact that we are not the only neuromorphic tech company that has the Mercedes-Benz logo prominently displayed on their website or on public presentation slides?
View attachment 71817
View attachment 71818
View attachment 71824
- the fact that earlier this year Innatera’s CEO Sumeet Kumar got likes on LinkedIn from two of the MB neuromorphic engineers - Gerrit Ecke and Alexander Janisch - after suggesting MB should also talk to Innatera regarding neuromorphic computing?
View attachment 71822
- the reference to NMC (neuromorphic computing) being considered a “möglicher Lösungsweg” (possible/potential solution) in the recent presentation at Hochschule Karlsruhe by MB engineer Dominik Blum
(https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-439352) and the table with several competing neuromorphic hardware offerings in one of his presentation slides titled “Neuromorphic computing is a young field of research … with a lot of open questions, e.g.:”
And then there is also a recent Master’s thesis sort of connected to MB’s neuromorphic research (more on that later, as posts have a limit of 10 upload images) that strengthens my belief that MB are still weighing their options…
Lots of points raised that cannot simply be glossed over, and that suggest to me Mercedes-Benz is nowhere near implementing neuromorphic technology at scale in series-production cars.
Interested to hear your or anyone else’s thoughts on those points.
Probably been posted before. Explains PICO quite well.
Akida Pico: The Game-Changing Ultra-Low Power AI Co-Processor
BrainChip, a pioneer in ultra-low-power, brain-inspired AI technology, has once again pushed the boundaries of what is possible in AI co-processing with…
www.geeky-gadgets.com
Those who can ‘speak and understand tech’ can correct me, but it appears that for mobile phones and wearables ARM is just a popular choice rather than essential.
RISC-V is a less common alternative, but being open source it is free. On the other hand, ARM is optimized for mobiles etc.
How would PICO/AKIDA work with RISC-V?
Is Qualcomm talking RISC-V purely as a threat to ARM??? Or would they really contemplate a RISC-V/PICO combo???
Sort of explains the ARM/Qualcomm dispute pretty well.
Arm is giving Qualcomm the wake-up call it needs
Qualcomm needs Arm more than Arm needs Qualcomm.
www.androidcentral.com
There has been a fair bit of media chat about Qualcomm using RISC-V of late.
A combination of RISC-V and PICO could prove to be a powerful and cost effective solution. ARM however does have a reliable and strong eco support system already in place.
Will we see a move away from ARM for mobiles and Wearables?
May well be but then Qualcomm will need to cave in and bow to ARM's demands. It will be interesting to see how this pans out?

I don't think Qualcomm's customers would welcome a technology/architecture change so easily. All products would have to be re-developed, adapted, implemented and retested*… we're at the same point with Akida. Guess why no product is out there and we've got just research/evaluation projects? The exceptions are the Beacon from ANT61 and the Cupcake from Unigen, but these are just single, small-unit solutions.
If Qualcomm were able to cut out ARM and switch to RISC-V, that would be a great enabler for BrainChip. But I guarantee Qualcomm won't do that in the next few years.
*RISC-V and ARM use different Instruction Set Architectures (ISAs).
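The footnote's point about differing ISAs is also why existing binaries can't simply be carried over: every compiled artifact is tagged with exactly one target architecture. A small sketch (not tied to any BrainChip or Qualcomm tooling; the mock header builder is invented for illustration, though the `e_machine` values are the real ones from the ELF specification):

```python
# Sketch: the ELF header's e_machine field ties a compiled binary to one ISA,
# which is why moving a product line from ARM to RISC-V means rebuilding and
# retesting everything rather than reusing existing binaries.
import struct

EM_AARCH64 = 183  # 64-bit ARM, per the ELF specification
EM_RISCV = 243    # RISC-V, per the ELF specification

def elf_machine(header: bytes) -> int:
    """Return the e_machine field from the start of an ELF file."""
    assert header[:4] == b"\x7fELF", "not an ELF file"
    # e_ident occupies bytes 0-15; e_type (u16) is at 16, e_machine (u16) at 18.
    return struct.unpack_from("<H", header, 18)[0]

def fake_elf_header(machine: int) -> bytes:
    """Minimal 20-byte mock of a little-endian ELF header, illustration only."""
    return b"\x7fELF" + b"\x02\x01\x01" + b"\x00" * 9 + struct.pack("<HH", 2, machine)

arm_bin = fake_elf_header(EM_AARCH64)
riscv_bin = fake_elf_header(EM_RISCV)
# The two mock "binaries" differ only in their ISA tag, yet neither
# platform's loader would accept the other's:
print(elf_machine(arm_bin) == elf_machine(riscv_bin))  # → False
```

Real migrations are of course harder still (toolchains, drivers, certification), but the header check alone shows why a wholesale ARM-to-RISC-V switch cannot happen overnight.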
May well be but then Qualcomm will need to cave in and bow to ARM's demands. It will be interesting to see how this pans out?
Is Qualcomm bluffing? RISC-V is free and ARM is trying to screw Qualcomm.
See link to article outlining Qualcomm's plans titled
"Qualcomm VP discusses its 'next' chip for Wear OS watches"
" Qualcomm and Google are "working on it" right now, and I'm assuming 2025 is the target to have RISC-V software optimized."
Qualcomm VP discusses its 'next' chip for Wear OS watches
The next Snapdragon Wear chipset should be a "feature-focused" SoC with custom RISC-V or Oryon cores, AI tech, and a PC-like approach in 2025.
www.androidcentral.com
A quote from the article in my previous post.
My bold:
" In 2023, Qualcomm and Google announced that they're co-developing an open-source RISC-V Snapdragon Wear platform that moves away from Arm cores for more efficient, custom-built CPUs. It sounded promising, but a year later, Qualcomm spent its Summit focused on its Snapdragon 8 Elite with custom Oryon cores, barely mentioning wearables.
Yep, Qualcomm's strategy would have to be similar to BRN's, who talk to their clients' customers. Tony (chairman) outlined this strategy briefly at the AGM.

They're not bluffing, and I would do the same: kicking out the "man in the middle" reduces production costs (no more licence fees) and consumer product costs.
But Qualcomm needs to convince customers to go the same way. In my opinion this takes time (1-3 years). Customers who already play around with alternatives may adopt a new processor architecture within 1-2 years. One major point is reducing costs. The bitter part is the new development, tests etc.
The good part is, Akida plays well with both RISC-V and ARM.
So why are you here on this forum? With all of the answers you have given above. What is it that you think BrainChip can bring to the table BrainShit?
Great name by the way??!!