BRN Discussion Ongoing

FiveBucks

Regular
And another week goes by without a contract announcement.

2 months left for Sean to live up to his AGM statement of having deals done by the end of the year.

Tick, tock.
 
  • Like
  • Sad
  • Thinking
Reactions: 14 users

Frangipani

Top 20
A video going along with that paper was uploaded to YouTube yesterday:



Both paper and video relate to another paper and video published by the same Uni Tübingen authors earlier this year. At a cursory glance, at least the videos (posted about six months apart) appear to be VERY similar:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-416900


View attachment 70372
View attachment 70373


Now compare the slides to those in the video uploaded October 3:

View attachment 70368


View attachment 70369

View attachment 70370

In fact, when I tried a cursory comparison of the new paper with the March 15 paper that @Fullmoonfever had linked at the time (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-416313), I discovered that the link he posted back then now resolves directly to this new paper, published on September 16 - so it appears to be an updated version of the previous paper.

I did notice the addition of another co-author, though: Sebastian Otte, who used to be a PhD student and postdoc at Uni Tübingen (2013-2023) and became Professor at Uni Lübeck’s Institute for Robotics and Cognitive Systems just over a year ago, where he heads the Adaptive AI research group.

View attachment 70378



To put into perspective the result that our competitors’ neuromorphic offerings fared worse than Akida in the benchmarking tests:
In all fairness, it should be highlighted that Akida’s lead was at least partly due to the fact that the AKD1000 is available as a PCIe board, whereas SynSense’s DynapCNN was connected to the PC via USB and - as the excerpt Gazzafish already posted shows - the researchers did not have direct access to a Loihi 2 edge device, only to a virtual machine provided by Intel via their Neuromorphic Research Cloud. The benchmarking would obviously yield more comparable results if all the hardware under test had a similar form factor:

“Our results show that the better a neuromorphic edge device is connected to the main compute unit, e.g., as a PCIe card, the better the overall run-time.”


Anyway, Akida undoubtedly impressed the researchers, and as a result they are considering further experiments: “(…) future work could involve evaluating the system with an additional Akida PCIe card.”
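To get a feel for how much the interconnect alone can skew such a comparison, here is a toy latency model. All figures are illustrative assumptions on my part (rough order-of-magnitude values for a PCIe 3.0 x1 link vs. USB 2.0), not measurements from the paper:

```python
# Toy model of per-frame host<->accelerator round trips.
# All figures below are illustrative assumptions (rough order-of-magnitude
# values for PCIe 3.0 x1 vs. USB 2.0), NOT measurements from the paper.

def round_trip_ms(payload_bytes: int, overhead_ms: float,
                  bandwidth_mb_s: float, inference_ms: float) -> float:
    """Latency = fixed per-transfer overhead + payload / bandwidth + inference."""
    transfer_ms = payload_bytes / (bandwidth_mb_s * 1e6) * 1e3
    return overhead_ms + transfer_ms + inference_ms

frame = 100 * 1024  # assumed 100 KiB of event data per frame

pcie = round_trip_ms(frame, overhead_ms=0.01, bandwidth_mb_s=985.0, inference_ms=1.0)
usb  = round_trip_ms(frame, overhead_ms=0.50, bandwidth_mb_s=35.0,  inference_ms=1.0)

print(f"PCIe: {pcie:.2f} ms/frame, USB: {usb:.2f} ms/frame")
```

Even with identical inference times, the USB-attached device loses several milliseconds per frame to transfer overhead alone, which is the kind of effect the researchers are flagging.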


View attachment 70374


In an earlier post (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-426404), I had already mentioned that the paper’s first author, Andreas Ziegler, who is doing a PhD in robotics and computer vision at Uni Tübingen, has meanwhile completed his internship at Sony AI in Switzerland (that - as we know - partially funded the paper’s research):

View attachment 70375


Fun fact: One of his co-authors, Karl Vetter, however, is no longer with Uni Tübingen’s Cognitive Systems Lab, but has since moved to France, where he has been working as a research engineer for…

🥁 🥁 🥁 Neurobus for the past three months!
It’s a small world, isn’t it?! 😉

View attachment 70376
View attachment 70377


Three days ago, first author Andreas Ziegler gave a talk on the recent table tennis robot research conducted at Uni Tübingen 👆🏻 during the Neuromorphic Vision Hackathon at ZHAW (Zürcher Hochschule für Angewandte Wissenschaften / Zurich University of Applied Sciences). Robotics and neuromorphic computing expert Yulia Sandamirskaya (ex Intel Labs) heads the Research Centre “Cognitive Computing in Life Sciences” at ZHAW’s Wädenswil campus.

56D0AC50-F2AB-4F1E-9676-49F80A3BB562.jpeg



While the content of his presentation is not new for those of you who already read the paper or saw the video, I thought the way he presented it was quite cool, with all the embedded videos! Have a look yourselves:


Anyway, more exposure for Akida and those favourable benchmarking results (even though it is unclear how much influence the hardware’s form factor had, see my post above).



Here are some of the presentation slides:

D97A42A0-D3DF-43BF-AF67-702C59211B0B.jpeg


79EE8ADC-0203-489E-909B-9E1F7F644F51.jpeg

8288B92E-4FC0-4AD9-AD66-931543D93CDA.jpeg



A5E76D5C-17FF-4C68-A1DD-D2335E8D73BA.jpeg


21A7DD72-185F-4EE5-8F74-BFA1E8E92ABB.jpeg


C2D2DF1C-4131-4D19-B448-EF6F0F244563.jpeg


E308335D-3A07-44BE-BB8C-0224EEE8E101.jpeg


In Andreas Ziegler’s updated CV (https://andreasaziegler.github.io/), we can now see who his supervisors were during his internship at Sony AI (which funded this research): Raphaela Kreiser and Naoya Takahashi:

0DF9B55B-E44D-4587-BAED-8B9F65678FEA.jpeg



B01F373E-1189-4592-B100-7A09796B48C3.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 23 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Screenshot 2024-10-26 at 8.56.51 am.png
 
  • Like
  • Love
  • Thinking
Reactions: 31 users


manny100

Top 20
Probably been posted before. Explains PICO quite well.
Those who can 'speak and understand tech' can correct me, but it appears that for mobile phones and wearables ARM is just a popular choice rather than essential.
RISC-V is a less common alternative but, being open source, is free. On the other hand, ARM is optimized for mobiles etc.
How would PICO/AKIDA work with RISC-V?
Is Qualcomm talking RISC-V purely as a threat to ARM? Or would they really contemplate a RISC-V/PICO combo?
 
  • Like
  • Love
  • Thinking
Reactions: 15 users

manny100

Top 20
Sort of explains the ARM/Qualcomm dispute pretty well.
There has been a fair bit of media chat of late about Qualcomm using RISC-V.
A combination of RISC-V and PICO could prove to be a powerful and cost-effective solution. ARM, however, does have a reliable and strong support ecosystem already in place.
Will we see a move away from ARM for mobiles and wearables?
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 15 users

Guzzi62

Regular
AI is already in phones:

The AI that's already in your phone​

We’re slowly getting used to Artificial Intelligence doing uncannily human things - chatting with us, creating pictures and videos. But so far, all of this AI has used a lot of computing power.
In the last year or so, we’ve seen a new type of computer chip made specifically for AI - and for your mobile phone. Tech reporter Spencer Kelly has been testing some of the latest AI features available to us.

 
  • Like
  • Fire
  • Wow
Reactions: 6 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
In the past we have partnered with the likes of MagicEye on 3D object detection. I wonder if this announcement below involving our partner, Tata Elxsi, will open up a pathway to working with Foresight Autonomous Holdings on their 3D perception solutions?




Foresight and Tata Elxsi Sign Collaboration Agreement​

October 25, 2024 08:21 ET | Source: Foresight Autonomous Holdings Ltd.


  • The parties will collaborate to accelerate development of solutions for semi-autonomous and autonomous vehicles using Foresight’s stereoscopic technology and Tata Elxsi’s integration solutions for marketing in the Indian automotive industry
Ness Ziona, Israel, Oct. 25, 2024 (GLOBE NEWSWIRE) -- Foresight Autonomous Holdings Ltd. (Nasdaq and TASE: FRSX) (“Foresight” or the “Company”), an innovator in automotive vision systems, announced today the signing of a multi-phase collaboration agreement with Tata Elxsi Limited (“Tata Elxsi”), a leading global tier-one supplier of design and technology services, providing solutions across various industries, including the automotive, broadcast, communications, healthcare, and transportation industries.
The initial phase will include the development and commercialization of advanced solutions for advanced driver assistance systems (ADAS). These will be integrated into passenger vehicles, heavy machinery and agricultural vehicles manufactured by Tata Motors. Building on the success of the ADAS implementations, the parties will continue to develop and commercialize advanced services for semi and fully autonomous features to be integrated into various applications within the automotive industry.
Tata Elxsi will introduce and promote Foresight’s 3D perception solutions to its diverse customer base, starting with the Indian automotive industry, and subsequently targeting global automotive vehicle manufacturers. Furthermore, during the first half of 2025, Tata Elxsi plans to promote Foresight’s solutions to its existing customers in the heavy machinery and agriculture sectors.
Foresight’s 3D perception solutions are based on stereoscopic technology, using both visible-light and thermal infrared cameras, and proprietary algorithms to detect all objects and create high resolution 3D point clouds.
“We are excited to collaborate with Tata Elxsi to bring our advanced 3D perception technology to the Indian automotive industry. We believe that this collaboration will help us expand our footprint in the emerging Indian market, including in autonomous passenger vehicles, heavy machinery and agricultural equipment, thereby leading to safer and more efficient transportation options across India,” said Oren Bar-On, Chief Executive Officer of Foresight Asia.
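For context on the “stereoscopic technology … high resolution 3D point clouds” part: stereo depth comes from plain triangulation, Z = f·B/d, where f is the focal length in pixels, B the camera baseline and d the per-pixel disparity. A minimal sketch with made-up camera parameters (not Foresight’s actual specs):

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Triangulate depth in metres from stereo disparity: Z = f * B / d."""
    d = np.asarray(disparity_px, dtype=float)
    depth = focal_px * baseline_m / np.maximum(d, 1e-9)  # clamp to avoid /0
    return np.where(d > 0, depth, np.inf)  # zero disparity -> infinitely far

# Assumed camera parameters: 700 px focal length, 30 cm baseline.
depth = disparity_to_depth([70.0, 7.0, 0.0], focal_px=700.0, baseline_m=0.30)
print(depth)  # roughly [3., 30., inf] metres
```

Applying this per pixel over a whole disparity map is what yields the 3D point cloud; the thermal cameras just provide a second image pair that still works in darkness.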

 
  • Like
  • Love
  • Fire
Reactions: 20 users
And another week goes by without a contract announcement.

2 months left for Sean to live up to his AGM statement of having deals done by the end of the year.

Tick, tock.
Maybe he wasn’t talking about deals but deals

1729921998369.gif
 
  • Haha
  • Like
Reactions: 9 users

BrainShit

Regular
Hi @Bravo,

The statement by Mercedes-Benz that Sally Ward-Foxton quoted in said September 2022 article was plucked straight from the 3 January 2022 VISION EQXX press release (https://media.mbusa.com/releases/re...range-and-efficiency-to-an-entirely-new-level), so it is now almost three years old.

View attachment 71758


However, that press release is nowhere to be found on the official webpage dedicated to the VISION EQXX (https://group.mercedes-benz.com/innovation/product-innovation/technology/vision-eqxx.html) - surprisingly, there is no reference to neuromorphic computing at all on that page.
This equally holds true for the German version of the VISION EQXX webpage: https://group.mercedes-benz.com/innovation/produktinnovation/technologie/vision-eqxx.html

In fact, there hasn’t been any reference whatsoever to neuromorphic computing on that webpage since April 4, 2022, as I was able to establish thanks to the Wayback Machine (https://web.archive.org - a cool internet archive I stumbled upon the other day, which lets you “Explore more than 916 billion web pages saved over time”).

I can go back to the German version of that webpage and see that on 1 April 2022, MB still mentioned: “Elemente der Benutzeroberfläche unterstützen die nahtlose Interaktion zwischen Fahrer und Fahrzeug. Unter anderem durch Künstliche Intelligenz (KI), die die Funktionsweise des menschlichen Gehirns nachahmt.” (“Elements of the user interface support seamless interaction between driver and vehicle. This includes Artificial Intelligence (AI), which mimics the way the human brain works.”) The webpage’s content was soon after replaced with a new text dated 4 April 2022 that no longer referred to brain-inspired/neuromorphic AI at all. It has since been updated with links to articles about the VISION EQXX’s second and third long-distance road trips of over 1,000 km, in June 2022 (Stuttgart to Silverstone) and March 2024 (Riyadh to Dubai).


It is anyone’s guess why MB decided to no longer mention that the keyword spotting in their VISION EQXX concept car had been exceptionally energy-efficient due to it having been implemented on a neuromorphic chip (let alone on which one specifically), although they obviously continue to take great interest in this disruptive tech.

Did they possibly come to realise that it would take much longer to implement neuromorphic technology at scale than originally envisioned? Either from a technical perspective and/or from a legal one (automotive grade ISO certification etc)?

Did they at the time possibly not foresee the growing number of competitors in the neuromorphic space besides BrainChip and Intel that could equally be of interest to them and which they would now first like to explore in depth before making any far-reaching decisions?

I also happened to notice that the reference to the VISION EQXX on https://brainchip.com/markets has been deleted. Thanks to the Wayback Machine, we can tell that this must have happened sometime between mid-July and August 25.
The question is: Why was that reference (consisting of a picture of the MB concept car that we know utilised Akida as well as the relevant press release excerpt) taken down from the BrainChip website? 🤔

Doesn’t this strike you as odd, despite the Mercedes logo still being displayed on our landing page under “YOU’RE IN GOOD COMPANY”?

View attachment 71814

View attachment 71815



And how about the other points I raised in previous posts, such as

  • the word potential in “positioning for a potential project collaboration with Mercedes” showing up in a 2023 BrainChip summer intern’s CV, which - as I already argued in January - suggested to me that Mercedes must have been weighing their options and were evaluating more than one neuromorphic processor last year? (Well, I feel vindicated, since they certainly were, as evidenced by MB’s recent announcement regarding research collaborations with both Intel and other consortium members of the NAOMI4Radar project (based on Loihi 2) on the one hand and with the University of Waterloo on the other hand, where the research will be led by Chris Eliasmith, who is co-founder and CTO of Applied Brain Research, a company that recently released their TSP1, which is “a single-chip solution for time series inference applications like real-time speech recognition (including keyword spotting), realistic text-to-speech synthesis, natural language control interfaces and other advanced sensor fusion applications.”) https://www.appliedbrainresearch.co...lution-for-full-vocabulary-speech-recognition

  • the June 2024 MB job listing for a “Working student position in the field of Machine Learning & Neuromorphic Computing from August 2024” that mentioned “working with new chip technologies” and “deployment on neuromorphic chips”?

  • the below comment by Magnus Östberg (“We are looking at all suitable solutions!”) after a BRN shareholder had expressed his hope that MB would be implementing BrainChip technology into their vehicles soon?
  • View attachment 71816


  • the fact that we are not the only neuromorphic tech company that has the Mercedes-Benz logo prominently displayed on their website or on public presentation slides?

View attachment 71817

View attachment 71818


  • the fact that earlier this year Innatera’s CEO Sumeet Kumar got likes on LinkedIn from two of the MB neuromorphic engineers - Gerrit Ecke and Alexander Janisch - after suggesting MB should also talk to Innatera regarding neuromorphic computing?
View attachment 71824

  • the reference to NMC (neuromorphic computing) being considered a “möglicher Lösungsweg” (possible/potential solution) in the recent presentation at Hochschule Karlsruhe by MB engineer Dominik Blum
    (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-439352) and the table with several competing neuromorphic hardware offerings in one of his presentation slides titled “Neuromorphic computing is a young field of research … with a lot of open questions, e.g.:”
View attachment 71822

And then there is also a recent Master’s thesis sort of connected to MB’s neuromorphic research (more on that later, as posts have a limit of 10 upload images) that strengthens my belief that MB are still weighing their options…


Lots of points raised that cannot simply be glossed over and that suggest to me Mercedes-Benz is nowhere near implementing neuromorphic technology at scale in series-production cars.

Interested to hear your or anyone else’s thoughts on those points.


Fully agreed... Mercedes-Benz is nowhere near implementing neuromorphic technology at scale in series-production cars.
 
  • Like
Reactions: 1 users

BrainShit

Regular
Probably been posted before. Explains PICO quite well.
Those who can 'speak and understand tech' can correct me, but it appears that for mobile phones and wearables ARM is just a popular choice rather than essential.
RISC-V is a less common alternative but, being open source, is free. On the other hand, ARM is optimized for mobiles etc.
How would PICO/AKIDA work with RISC-V?
Is Qualcomm talking RISC-V purely as a threat to ARM? Or would they really contemplate a RISC-V/PICO combo?

Sort of explains the ARM/Qualcomm dispute pretty well.
There has been a fair bit of media chat of late about Qualcomm using RISC-V.
A combination of RISC-V and PICO could prove to be a powerful and cost-effective solution. ARM, however, does have a reliable and strong support ecosystem already in place.
Will we see a move away from ARM for mobiles and wearables?

I don't think Qualcomm's customers would welcome a technology/architecture change that easily. All products would have to be re-developed, adapted, implemented and retested*... We're at the same point with Akida - guess why no product is out there yet and we only have research/evaluation projects? Except the Beacon from ANT61 and the Cupcake from Unigen... but those are single, small-unit solutions.

If Qualcomm were able to cut ARM out and switch to RISC-V, that would be a great enabler for BrainChip. But I guarantee Qualcomm won't do that in the short term.

*RISC-V and ARM use different Instruction Set Architectures (ISAs).
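To illustrate that footnote: the same logical operation is encoded as completely different bit patterns on each ISA, which is why binaries and much of the low-level tooling must be rebuilt rather than reused. A small sketch packing one RISC-V R-type instruction from its bit fields (field layout per the RISC-V base spec; the AArch64 constant is my hand-assembled encoding of the equivalent instruction):

```python
def rv32_rtype(funct7, rs2, rs1, funct3, rd, opcode):
    """Pack a RISC-V R-type instruction word from its bit fields."""
    return (funct7 << 25) | (rs2 << 20) | (rs1 << 15) | (funct3 << 12) | (rd << 7) | opcode

# "add x1, x2, x3" on RISC-V: OP opcode 0b0110011, funct3 = funct7 = 0
riscv_add = rv32_rtype(0b0000000, rs2=3, rs1=2, funct3=0b000, rd=1, opcode=0b0110011)

# The equivalent "ADD X1, X2, X3" (shifted-register form) on AArch64
arm64_add = 0x8B030041

print(hex(riscv_add), hex(arm64_add))  # same operation, entirely different bit patterns
```

Nothing in one encoding maps mechanically onto the other, so every shipped binary, driver and test suite has to be rebuilt and revalidated when the host ISA changes.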
 
  • Like
  • Thinking
Reactions: 7 users

manny100

Top 20
I don't think Qualcomm's customers would welcome a technology/architecture change that easily. All products would have to be re-developed, adapted, implemented and retested*... We're at the same point with Akida - guess why no product is out there yet and we only have research/evaluation projects? Except the Beacon from ANT61 and the Cupcake from Unigen... but those are single, small-unit solutions.

If Qualcomm were able to cut ARM out and switch to RISC-V, that would be a great enabler for BrainChip. But I guarantee Qualcomm won't do that in the short term.

*RISC-V and ARM use different Instruction Set Architectures (ISAs).
That may well be, but then Qualcomm will need to cave in and bow to ARM's demands. It will be interesting to see how this pans out.
Is Qualcomm bluffing? RISC-V is free, and ARM is trying to screw Qualcomm.
See link to article outlining Qualcomm's plans titled

"Qualcomm VP discusses its 'next' chip for Wear OS watches"​

" Qualcomm and Google are "working on it" right now, and I'm assuming 2025 is the target to have RISC-V software optimized."
 
  • Like
  • Fire
  • Love
Reactions: 8 users

manny100

Top 20
That may well be, but then Qualcomm will need to cave in and bow to ARM's demands. It will be interesting to see how this pans out.
Is Qualcomm bluffing? RISC-V is free, and ARM is trying to screw Qualcomm.
See link to article outlining Qualcomm's plans titled

"Qualcomm VP discusses its 'next' chip for Wear OS watches"​

" Qualcomm and Google are "working on it" right now, and I'm assuming 2025 is the target to have RISC-V software optimized."
A quote from the article in my previous post.
My bold:
"In 2023, Qualcomm and Google announced that they're co-developing an open-source RISC-V Snapdragon Wear platform that moves away from Arm cores for more efficient, custom-built CPUs. It sounded promising, but a year later, Qualcomm spent its Summit focused on its Snapdragon 8 Elite with custom Oryon cores, barely mentioning wearables."
 
  • Like
  • Fire
Reactions: 3 users

BrainShit

Regular
The brochure of Akida Pico.
 

Attachments

  • BC_Akida-Pico-Brochure.pdf
    272.6 KB · Views: 154
  • Fire
  • Like
Reactions: 8 users

BrainShit

Regular
A quote from the article in my previous post.
My bold:
"In 2023, Qualcomm and Google announced that they're co-developing an open-source RISC-V Snapdragon Wear platform that moves away from Arm cores for more efficient, custom-built CPUs. It sounded promising, but a year later, Qualcomm spent its Summit focused on its Snapdragon 8 Elite with custom Oryon cores, barely mentioning wearables."

They're not bluffing, and I would do the same - kick out the "man in the middle" to reduce both production costs (get rid of licence fees) and consumer product costs.

But Qualcomm has to convince its customers to go the same way. In my opinion this takes time (1-3 years). Customers who already play around with alternatives may adopt a new processor architecture within 1-2 years. One major point is reducing costs; the bitter part is the new development, tests etc.

The good part is that Akida plays well with both RISC-V and ARM.
 
  • Like
  • Fire
  • Love
Reactions: 10 users

manny100

Top 20
They're not bluffing, and I would do the same - kick out the "man in the middle" to reduce both production costs (get rid of licence fees) and consumer product costs.

But Qualcomm has to convince its customers to go the same way. In my opinion this takes time (1-3 years). Customers who already play around with alternatives may adopt a new processor architecture within 1-2 years. One major point is reducing costs; the bitter part is the new development, tests etc.

The good part is that Akida plays well with both RISC-V and ARM.
Yep, Qualcomm's strategy would have to be similar to BRN's, who talk to their clients' customers. Tony (chairman) briefly outlined this strategy at the AGM.
 
  • Like
Reactions: 2 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
I don't think Qualcomm's customers would welcome a technology/architecture change that easily. All products would have to be re-developed, adapted, implemented and retested*... We're at the same point with Akida - guess why no product is out there yet and we only have research/evaluation projects? Except the Beacon from ANT61 and the Cupcake from Unigen... but those are single, small-unit solutions.

If Qualcomm were able to cut ARM out and switch to RISC-V, that would be a great enabler for BrainChip. But I guarantee Qualcomm won't do that in the short term.

*RISC-V and ARM use different Instruction Set Architectures (ISAs).

So why are you here on this forum, with all of the answers you have given above? What is it that you think BrainChip can bring to the table, BrainShit?

Great name, by the way!
 
Last edited:
  • Like
  • Haha
  • Love
Reactions: 16 users

IloveLamp

Top 20
  • Like
  • Love
  • Fire
Reactions: 24 users

BrainShit

Regular
So why are you here on this forum, with all of the answers you have given above? What is it that you think BrainChip can bring to the table, BrainShit?

Great name, by the way!

There are some advantages that BrainChip can provide. The question is whether they're good enough, and the benefit big enough, that a customer will spend money, time and effort on them.

I'm at the gym right now, so short on discussion... you know, biceps & brain.

I love the name... it wins either way 😁
 
  • Thinking
Reactions: 1 users