BRN Discussion Ongoing

7für7

Top 20
I'm 100% in agreement with you @Tothemoon24!

I'm quite confident our technology will be included in the “Hey Mercedes” voice control system in the electric vehicles anticipated to enter production in 2026, tying in with the first models expected to feature the new Mercedes MB.OS.

Some people think that it would be impossible for our technology to be included because we haven't achieved the relevant safety standards such as ISO 26262. However, when you listen to the video that you linked to, Magnus Östberg talks about the autonomous driving system being mission-critical, with the brakes and drive-train requiring the highest level of safety standards.

Importantly, at the 2.30 minute mark Magnus states "Then we have areas which are less critical. For example we have voice assistant."

This demonstrates to me that there would be fewer hoops to jump through as a result, which is why I remain so confident.
I think similarly. Many people focus too much on safety-related topics that are crucial for autonomous driving. However, there are already functioning systems for that. Sure, Akida could optimize those, but it would take longer because it involves a high risk for the passengers.

I believe the topic of interior experience and communication with the car will become a significant focus. This not only enhances comfort but also makes the driving or travel experience more interesting when you can communicate more naturally with the vehicle or even have entire conversations with it—like K.I.T.T. It might sound funny, but that’s how it will be in the end.

On the other hand, the car could notify you of issues like, “I think it’s time to change my oil,” or “It seems like a tire has lost some air pressure,” or manage settings for climate control, reminders, preheating the interior, or even upcoming appointments. The possibilities are endless.

This area alone would be enough to make Akida highly useful.

I don’t think it will only be integrated in Mercedes. Sooner or later it will become standard in every car.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 10 users

Guzzi62

Regular
Found this on the other place.

Interesting take!

 
  • Like
  • Fire
Reactions: 2 users

Frangipani

Top 20
The latest Brains and Machines podcast just came out, interviewing Australian roboticist Rodney Brooks, who is an MIT Emeritus Professor and successful robotics entrepreneur.



The podcast’s next episode will FINALLY be about BrainChip! 🥳

 
  • Like
  • Fire
  • Love
Reactions: 68 users

Derby1990

Regular
Doing some Saturday dreaming. Can we get back here again next week? I'd like to see that.
Got RSI hitting that refresh button on that day, waiting to see that little number... 1



 
  • Like
  • Love
  • Fire
Reactions: 26 users

manny100

Top 20
I'm 100% in agreement with you @Tothemoon24! …
Agree. BRN recently confirmed that we still retain a commercial relationship with Mercedes and the other marquee clients listed under their 'Why Invest' page.
I think that Mercedes, as the owner/builder of its operating system, is responsible for obtaining ISO approvals for that operating system - that is what I have read. I do not believe we have to obtain ISO certification separately for our chips.
BRN just supplies the chips that form part of the system. The system itself is rigorously tested by Mercedes before it gets approval.
All AKIDA1000 chips are the same, whether they are for auto, space or health etc.
If, for example, Tata uses our chips in a hand-held medical instrument, they will have to get the finished product tested and approved by the health authorities. We will not need to get our chips separately tested and approved.
So IMO we are well in the race for all Mercedes chip requirements.
The choice of Frontgrade for space and the USAF for defense testing will make Mercedes look second-rate if they do not use us.
 
  • Like
  • Fire
  • Love
Reactions: 40 users
Doing some Saturday dreaming. Can we get back here again next week? I'd like to see that. …
Not without a significant IP deal, or a partnership with a FAANG-type company.

Otherwise, dreaming is all it is.

There was still big volume Friday, but quite a bit lower than the previous sessions.

If the whales were still there, they weren't as ravenous..
 
  • Like
Reactions: 8 users

hotty4040

Regular
Found this on the other place. Interesting take! …
Very interesting listen, Guzzi62. Enter the "neuromorphic" solution to these dilemmas, possibly/probably!!!???
Surprised this guy didn't mention or allude to this possibility in his meandering thoughts.
Nice pickup from the "dark side" IMO......


Hotty...
 
  • Like
Reactions: 5 users

Esq.111

Fascinatingly Intuitive.
The latest Brains and Machines podcast just came out … The podcast’s next episode will FINALLY be about BrainChip! 🥳
Afternoon Frangipani ,

Good find.

I got halfway through the interview, very interesting, then at about the 26 min mark my phone went on random scroll and would not stop. THERE MAY BE A BUG ATTACHED TO THIS FILE.

Thank you once again, and very much looking forward to their next episode.

Regards,
Esq.
 
  • Like
  • Wow
Reactions: 8 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Did someone say Cerence?

I just said Cerence.

Note to self: Don't get me started on Cerence! 😝


Some background info:

BrainChip's Akida technology, as utilized in the Mercedes-Benz Vision EQXX, employs neuromorphic computing to improve the efficiency of voice control systems.

Cerence has been a key provider of voice and AI-driven features in Mercedes-Benz's MBUX system although it wasn't directly incorporated into the Vision EQXX's voice assistant.

Mercedes-Benz has been collaborating with NVIDIA to integrate advanced computing platforms into their vehicles. Currently, Mercedes-Benz utilizes NVIDIA's DRIVE Orin system-on-a-chip (SoC) to power its autonomous systems.

The announcement below describes an expanded partnership between Cerence and NVIDIA, noting that the "integration involves leveraging Nvidia’s AI Enterprise platform, a comprehensive software solution for cloud-native AI applications, alongside Nvidia DRIVE AGX Orin for specific functionalities of CaLLM Edge." Cerence's CaLLM technology is designed to enhance in-car voice assistants by providing intelligent, natural, and personalized interactions between humans and their vehicles.

It would make sense IMO to integrate Cerence's CaLLM technology with BrainChip's Akida to combine the strengths of both systems: advanced AI-driven voice interactions alongside energy-efficient processing. This could lead to more responsive and efficient in-car voice assistants. Here's hoping this could be on the cards in the near future. All parties would have to be aware of one another.



Cerence Soars 120% on Expanded Nvidia Partnership

January 3, 2025 Shira Astmann

Nvidia

Cerence Inc. (CRNC) saw its stock price surge by over 120% on Friday following the announcement of an expanded partnership with Nvidia (NVDA). This collaboration aims to enhance the capabilities of Cerence’s CaLLM™ family of language models, specifically bolstering both its cloud-based Cerence Automotive Large Language Model (CaLLM) and the CaLLM Edge, which is an embedded small language model.
The integration involves leveraging Nvidia’s AI Enterprise platform, a comprehensive software solution for cloud-native AI applications, alongside Nvidia DRIVE AGX Orin for specific functionalities of CaLLM Edge. This partnership marks a significant step towards creating more sophisticated in-car AI assistants that can interact seamlessly through both cloud and embedded systems, requiring a blend of hardware, software, and user experience (UX) expertise.
Cerence stated that through close collaboration with Nvidia’s engineers, it has been able to significantly accelerate the development and deployment of its AI technologies. The use of Nvidia’s TensorRT-LLM and NeMo frameworks has been instrumental. TensorRT-LLM optimizes large language models for inference on Nvidia GPUs, while NeMo provides an end-to-end platform for building, customizing, and deploying AI models. This synergy has allowed Cerence to:
  • Boost the performance of in-vehicle assistants by utilizing Nvidia’s accelerated computing solutions and system-on-chips (SoCs). The result is a faster, more responsive interaction within vehicles, enhancing the driving experience.
  • Develop specialized guardrails for in-car AI using Nvidia NeMo Guardrails. This ensures that Cerence’s AI systems can handle the unique conversational and safety requirements of automotive environments, navigating the complexities of human interaction in a moving vehicle.
  • Implement an agentic architecture on the CaLLM Edge using Nvidia DRIVE AGX Orin. This approach not only optimizes performance but also paves the way for future advancements in vehicle user interfaces, offering a more personalized and intuitive interaction model.
This strategic alliance with Nvidia provides Cerence with the tools and infrastructure necessary to support its automotive partners in delivering cutting-edge user experiences. The focus is on creating systems that offer not just performance but also privacy, security, and resilience against malicious interactions, addressing key consumer and manufacturer concerns in the connected car era.
The market’s enthusiastic response to the announcement underscores the potential seen in Cerence’s ability to lead in the automotive AI space, leveraging Nvidia’s technology to push the boundaries of what’s possible in vehicle intelligence. This move positions Cerence to further solidify its role in shaping the future of in-car AI, where the emphasis is increasingly on seamless, secure, and user-friendly technology integration.

 
Last edited:
  • Like
  • Thinking
  • Love
Reactions: 38 users

TECH

Regular
Did someone say "Big Kev"....I'm excited! Remember how I and others have highlighted Accenture's patents throughout the year, including one podcast with an Accenture engineer (from memory)...didn't he say he loved Akida, or words to that effect? Hope their customers are tuned in.

At least 3 patents naming Akida in the artwork (from memory)....check out this little bit of positivity, not naming us as such, but presenting at CES 2025....CES is huge.



Listen to the video attached...what this guy says relates to our company and all the companies attending and presenting their wares at CES 2025.

Tech x
 
  • Like
  • Fire
  • Love
Reactions: 41 users

Diogenese

Top 20
Did someone say Cerence? …

Hi Bravo,

Cerence could certainly benefit from a sprinkle of Akida.

According to these patents, they use cloud-based software for keyword spotting:

US2022358924A1 METHODS AND APPARATUS FOR DETECTING A VOICE COMMAND 20130312 – 20220718

US11676600B2


We could get our mates at GMAC to ask "Do you want Akida with that?"
 
Last edited:
  • Like
  • Haha
  • Fire
Reactions: 24 users

Terroni2105

Founding Member
  • Like
Reactions: 6 users

Boab

I wish I could paint like Vincent
  • Haha
  • Like
Reactions: 6 users

Diogenese

Top 20
The Brainchip CES 2025 page features LLMs + RAG, Anomaly Detection, and Building ML Models with Edge Impulse.

Brainchip CES 2025

https://brainchip.com/ces-2025/

LLMs + RAG Demo

See how we’re advancing large language models (LLMs) with Retrieval-Augmented Generation (RAG) for smarter, real-time AI applications.

Anomaly Detection Demo

Explore our latest anomaly detection solution running on Raspberry Pi 5. This versatile demo targets multiple verticals, including Industrial IoT, manufacturing, healthcare (wearable devices), cybersecurity, fraud detection, and more.

Building ML Models with Edge Impulse

Explore hands-on demos with Edge Impulse, demonstrating how easy it is to build and deploy custom machine learning models directly on the Akida platform.
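For intuition only: the anomaly detection demo described above is, at heart, a streaming outlier detector. Here is a minimal sketch of that general idea (my own toy illustration in plain Python, with made-up sensor readings; it has nothing to do with Akida's actual spiking implementation), flagging any reading more than a few standard deviations from a rolling mean:

```python
# Toy streaming anomaly detector using a rolling z-score.
# Illustration of the general concept only, not BrainChip's demo.
from collections import deque
import math

class RollingZScore:
    def __init__(self, window=20, threshold=3.0):
        self.buf = deque(maxlen=window)  # recent readings
        self.threshold = threshold       # how many std-devs counts as anomalous

    def update(self, x):
        """Return True if x is anomalous relative to the recent window."""
        if len(self.buf) >= 2:
            mean = sum(self.buf) / len(self.buf)
            var = sum((v - mean) ** 2 for v in self.buf) / len(self.buf)
            std = math.sqrt(var)
            anomalous = std > 0 and abs(x - mean) > self.threshold * std
        else:
            anomalous = False  # not enough history yet
        self.buf.append(x)
        return anomalous

detector = RollingZScore(window=10, threshold=3.0)
readings = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 9.0]  # spike at the end
flags = [detector.update(r) for r in readings]
print(flags)  # only the final spike is flagged
```

Real neuromorphic detectors learn far richer temporal patterns, but the contract is the same: stream in readings, flag the ones that deviate from recent history.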


The Edge Impulse partnership is essential for rapid deployment of Akida. NNs require appropriate task-specific models to perform specific tasks. Before Edge Impulse, the models were hand-made, a time-consuming and costly task. With EI, models can be assembled automatically using customer data and general models in the same field as the customer's specific tasks.

Then, of course, the models need to be converted to Akida-ese (the spike coding format required for Akida models).

LLMs and RAG, as far as I understand it, mean dividing a large universal LLM into subject-specific blocks, selecting the block or blocks required for the task at hand, and using the selected block(s) in configuring Akida. Again, for this to work with Akida the blocks need to be in an Akida-usable format, so I would suspect the presence of Edge Impulse at some stage of the process.
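As a rough picture of that "select the relevant block" step: in a toy RAG pipeline, the query is scored against each text block and only the best matches are packed into the prompt the model actually sees. A minimal sketch (my own illustration with made-up car-manual snippets, not BrainChip's or Edge Impulse's pipeline):

```python
# Toy RAG retrieval step: score subject-specific text blocks against a query
# and build a prompt from the best matches. Illustration only.

def tokens(text):
    """Lowercased words with surrounding punctuation stripped."""
    return {w.strip(".,?!").lower() for w in text.split()}

def score(query, block):
    """Crude relevance: number of words the query and block share."""
    return len(tokens(query) & tokens(block))

def retrieve(query, blocks, k=2):
    """Return the k blocks most relevant to the query."""
    return sorted(blocks, key=lambda b: score(query, b), reverse=True)[:k]

def build_prompt(query, blocks, k=2):
    """Prepend the retrieved context to the user's question."""
    context = "\n".join(retrieve(query, blocks, k))
    return f"Context:\n{context}\n\nQuestion: {query}"

blocks = [
    "Tyre pressure should be checked monthly and before long trips.",
    "Engine oil should be changed every 10,000 km under normal driving.",
    "The climate control system can precondition the cabin before departure.",
]
print(build_prompt("When should I change the oil?", blocks, k=1))
```

Production systems use embedding similarity rather than word overlap, but the shape is the same: retrieve a small relevant subset, then hand only that subset to the model.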


https://edgeimpulse.com/all-events/edge-impulse-at-ces
The Edge Impulse team is excited to head back to Vegas in January for another exciting CES!
Our team will be exhibiting with our partners BrainChip, Microchip and CEVA, in addition to walking the floors and meeting with prospects and customers.
 
  • Like
  • Fire
  • Love
Reactions: 39 users


Boab

I wish I could paint like Vincent
  • Haha
  • Love
Reactions: 14 users

(This post is unrelated to the link)
20250104_200218.jpg


Was watching a humanoid robotics commentary and Jim Fan's name came up, with the above LinkedIn post.

He seems to be one of Nvidia's Top Dogs, in humanoid robotics research.

The implications are just huge if he has anything at all to do with looking at and working with our technology (and that's despite the fact that it's huge anyway if "anyone" from Nvidia is working with us).




Edit.. And yeah.. Obviously I didn't "read" the short article about "Nvidia GEAR" in your post Doz 🙄..
 
Last edited:
  • Like
  • Fire
  • Haha
Reactions: 9 users

FuzM

Member
Came across this in Dec 2023. I can't seem to find the link to it, but it came from BRN's website. I found it intriguing that the architecture included the NVDLA (Nvidia Deep Learning Accelerator) along with the NV Encoder (Nvidia encoder) and NV JPEG (Nvidia encoding, decoding and transcoding of JPEG images).

Not sure what to make of it. Maybe someone here can help make sense of why we would need the NVDLA, NV Encoder and NV JPEG along with Akida across the memory controller fabric, unless we are aiming for a hybrid architecture that offloads certain computation between the CPU, Akida and the Nvidia accelerator.

Brn NVDLA.jpg
 
  • Like
  • Thinking
  • Fire
Reactions: 16 users