BRN Discussion Ongoing

Esq.111

Fascinatingly Intuitive.
Evening Chippers,

A little impatient; darn holidays, no market stimuli.

Going back through a little paperwork from 2021 and came upon this which I printed at the time.

All three photos are on INTEL, our latest partner to facilitate the world domination of Brainchip going forward in the neuromorphic sphere.

Worth having a good look at:
1. Their starting share price.
2. The steady increase in their DIVIDEND through the passage of time. As you can see, this data does not capture the last couple of years, as it was printed over one year ago. Dare say their dividends reduced or stopped in the last two years, but one can quickly get the gist of how holding a stock for an extended period of time (if in a fortunate position to do so) can repay the initial capital investment many fold,
ANNUALLY.
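The effect being described here is usually called "yield on cost": the dividend measured against your original purchase price rather than today's market price. A minimal sketch with purely hypothetical numbers (not INTEL's actual prices or dividends):

```python
def yield_on_cost(purchase_price: float, dividend_per_share: float) -> float:
    """Annual dividend return measured against the ORIGINAL purchase
    price, not the current market price. All inputs are hypothetical."""
    return dividend_per_share / purchase_price * 100

# Bought at $1.00 a share; the dividend grows over the holding period:
for div in (0.05, 0.25, 1.00, 2.00):
    print(f"dividend ${div:.2f}/share -> {yield_on_cost(1.00, div):.0f}% on original cost")
```

Once the annual dividend exceeds the purchase price, the holding repays the initial capital every year, which is the "many fold, ANNUALLY" point.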

Personally, I believe Brainchip's rise will be much faster, present global events taken into consideration.

Enjoy.

Regards,
Esq.

*AKA, one of Santa's little helpers.

😁 .
 

Attachments: 20221226_195219.jpg, 20221226_195122.jpg, 20221226_194930.jpg, 20221226_194845.jpg

Diogenese

Top 20

Esq.111

Fascinatingly Intuitive.
Chippers,

Yes, enjoying a rather good single malt presently.

Just for sh*ts & GIGGLES......

Some time ago, looked into the qualities / specs of how one becomes a FORTUNE 500 company...

Apparently, in 2013, one of the criteria to clear was REVENUE of USD $5.43 billion, which compounds at an average rate of 4.3% per annum.

'Tis quite the hurdle to clear, though from little things BIG things grow.

Below are the workings of 4.3% from 2015 through to 2023.

Bear in mind, since these calculations the USD to AUD exchange rate has changed, together with the world markets having a little breather, so the end number would be lower.

Even did the conversion for BRN down the bottom.
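For anyone wanting to check the workings, compounding the 2013 hurdle at 4.3% is straightforward. A quick sketch (the $5.43B base and 4.3% rate are the figures quoted above; the function name is mine):

```python
def fortune500_threshold(years_since_2013: int, base_usd_bn: float = 5.43,
                         rate: float = 0.043) -> float:
    """Hypothetical Fortune 500 revenue hurdle (USD billions), compounding
    the quoted 2013 figure of $5.43B at 4.3% per annum."""
    return base_usd_bn * (1 + rate) ** years_since_2013

# The 2015-2023 run referred to above:
for year in range(2015, 2024):
    print(year, round(fortune500_threshold(year - 2013), 2))
```

By 2023 the hurdle works out to roughly USD $8.3 billion, before the exchange-rate and market caveats mentioned above.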

Regards,
Esq.
 

Attachments: 20221226_213546.jpg

equanimous

Norse clairvoyant shapeshifter goddess
Is that like the insanity defence, where the onus is on the defendant?
The Defendant stated Ignorance is Bliss your Honour.

Now that we have changed government we could see more endorsements, hopefully.

What is Satire? :)

 

Straw

Guest
That's some serious whiskey you've inhaled.
I like it!
 

Esq.111

Fascinatingly Intuitive.
Evening Straw,

I find if I can keep the remaining few braincells in a state of suspended picklement, the resultant output is amplified, and one does enjoy computing large numbers.

I find solace.

Regards,
Esq.
 
Lidar AKA BRN's sweet spot.
 

Straw

Guest
Have no clue what I'd do with unlimited funds. Maybe a complete redo of creation and the human condition to make things a great deal less isolating and generally challenging. Hmmmm..... the best intentions (to shape the world to my will) could never cause anyone any harm... no really......... you trust me, don't you?
Maybe I can redeem all my karma points and reap what I've sown: a singles holiday for 45yo+ with a dozen other totally broken people.
So sick of being terrified of loss and failure
and (sorry, but currently) @#$% work.
*This is where my Akida-enabled companion robot sternly sends me off to bed and has my life sorted by morning.

😫
 

dippY22

Regular
This recent "poking" of, and subsequent commenting on, a LinkedIn poster is not productive in my opinion. There is no upside, because those involved are escalating a silly non-event into a potential fight. Why? Because the people involved are males (I think) with egos, and what starts out as an innocent question, claim, or comment, regardless of its validity, soon becomes a challenge... not unlike a group of boys gathering together and saying, "yeah?"... "wanna bet?"... "my dad can beat your dad", and so on. In this instance, the LinkedIn poster (the MB engineer) took his ball and went home rather than engaging or playing with Mr. Chapman.

Clearly, to me at least, any backing and forthing between well-intentioned Brainchip-cheerleading stockholders and ANYONE, whether on LinkedIn, Twitter or whatever, has got to be a cringeworthy moment for Brainchip executives if they are aware of it. And I believe they are aware of Mr. Chapman, and I believe they are aware of the MB engineer, so they are aware. Ergo... a possible cringeworthy moment which our swamped management team does not need.

Whether Mr. Chapman and others are right is not the point. In fact, I agree with them. But it is the public-facing venue that has me concerned.

What we all want will be accomplished in time if we just let our company do what it is doing and refrain from poking someone... anyone... with an association to our (BrainChip's) customers: real, implied, hoped for or speculated about. This is an open stock forum... I get it, but in my opinion the risk of poking people on LinkedIn, or on this site, so that they literally block you (!!!) could be huge.

No one here has any idea of who may be watching sites such as LinkedIn.

Personally, I think we, as quasi-ambassadors for Brainchip, need to act responsibly and be on our best public-facing behavior, even when someone says their GPU dad can beat your Akida dad with one hand tied behind his back. Feel chagrined, but turn the other cheek.

We are sooooo close to what we all want. There are many out there who don't believe in Brainchip, or are just not aware of our technology, but what do we really ever gain by saying, or implying, that "my dad can whup your dad"? And though in this case we are right about our relationship with MB, this one person apparently employed by MB threatened to block someone for pointing that out. Think about that FACT.

That's my opinion, ....and now I expect a few of you may want to whup on me, metaphorically speaking.
dippY

"
 
  • Like
  • Love
  • Fire
Reactions: 92 users
Just some thoughts on Socionext, timelines and relationships that may or may not (?) have been covered previously.

Musing where all the tentacles lead to, and everything does appear to be moving to a critical mass type stage for the upcoming year.

Late 2018 this joint project occurred.

Note a couple of the project partners, project brief and expected end date ;)

Socionext and Partners to Start NEDO-Sponsored Project on Developing 'Evolutionary, Low-Power AI Edge LSI'

Langen/Germany, 17 October 2018 --- Socionext Inc., ArchiTek Corporation, and Toyota Industries Corporation have signed an agreement to start a research and development project on 'Evolutionary, Low-Power AI Edge LSI'.

The project is being sponsored by New Energy and Industrial Technology Development Organization (NEDO), a Japanese governmental organization promoting the development and introduction of new energy technologies. It is scheduled to conclude in March 2022 with the goal to commercialize technologies in autonomous driving, surveillance systems, drones, robots, AI powered home appliances and others.

The project consists of the following:

(1) Virtual Engine Architecture (ArchiTek Corporation)

To develop a new architecture that achieves a compact device, low power consumption and flexibility, all at the same time.

(2) Real-Time SLAM (Toyota Industries Corporation)

To establish real-time SLAM (Simultaneous Localization And Mapping) technology for self-driving machines.

(3) Quantification DNN (Socionext Inc.)

To address and solve low recognition rate problem with DNN quantization, required for high speed and low power AI processing.

(4) Edge Environment Optimization (Socionext Inc.)

To study a method to identify and optimize how to share functions between the cloud and the edge.

The project is scheduled to conclude in March 2022. Socionext aims to establish the new "AI edge solution platform" based on the outcome of the project and apply it to a wide range of applications for expanding the company’s business and global market outreach.



Expanding the Five Senses with Edge Computing and Solving Social Problems

Every minute and second, a tremendous amount of information is sucked up from edge devices to the cloud. However, such information is by no means being used effectively. From daily routines such as driving to medical care and disaster sites, people are stressed by being forced to make decisions from a huge number of options in every scene.

Therefore, innovation in edge computing is now required: responding to human needs without waiting seconds to communicate with the cloud; always proactive, always responding to the situation.

It acts the moment people, or society, want something, or even before they become aware of that desire. What we are aiming for is technology that expands the five senses. Technological innovation with a radius of 1 meter.

I want to see more, I want to know more, I want to feel more. When the edge changes, the world you feel changes.



In mid 2019 this occurred:

BrainChip and Socionext Sign a Definitive Agreement to Develop the Akida™ Neuromorphic System-on-Chip


In Mar 2020 this occurred:

BrainChip and Socionext Provide a New Low-Power Artificial Intelligence Platform for AI Edge Applications
Socionext to offer its SynQuacer™ Multi-Core Processor with BrainChip's Akida™ SoC
BrainChip will provide training, technical and customer support
Companies will jointly identify target end markets and customers

Socionext also offers a high-efficiency, parallel multi-core processor, SynQuacer™ SC2A11, as a server solution for various applications.

Socionext’s processor is available now and the two companies expect the Akida SoC engineering samples to be available in the third quarter of 2020.

In addition to integrating BrainChip’s AI technology in an SoC, system developers and OEMs may combine BrainChip’s proprietary Akida device and Socionext’s processor to create high-speed, high-density, low-power systems to perform image and video analysis, recognition and segmentation in surveillance systems, live-streaming and other video applications.



Also in Mar 2020:

Socionext Prototypes Low-Power AI Chip with Quantized Deep Neural Network Engine
Delivers Significant Expansion of Edge Computing Capabilities, Performance and Functionality

SANTA CLARA, Calif., March 17, 2020 ---Socionext Inc. has developed a prototype chip that incorporates newly-developed quantized Deep Neural Network (DNN) technology, enabling highly-advanced AI processing for small and low-power edge computing devices.

The prototype is a part of a research project on “Updatable and Low Power AI-Edge LSI Technology Development” commissioned by the New Energy and Industrial Technology Development Organization (NEDO) of Japan. The chip features a "quantized DNN engine" optimized for deep learning inference processing at high speeds with low power consumption.

Quantized DNN Engine
In their place, Socionext has developed a proprietary architecture based on "quantized DNN technology" for reducing the parameter and activation bits required for deep learning. The result is improved performance of AI processing along with lower power consumption. The architecture incorporates bit reduction including 1-bit (binary) and 2-bit (ternary) in addition to the conventional 8-bit, as well as the company’s original parameter compression technology, enabling a large amount of computation with fewer resources and significantly less amounts of data.
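To illustrate the general idea of the bit reduction described above (this is not Socionext's proprietary architecture or parameter-compression scheme, just a generic sketch of the technique), 2-bit (ternary) weight quantization can look like this:

```python
import numpy as np

def ternary_quantize(w, threshold=0.05):
    """Quantize float weights to {-1, 0, +1} plus a per-tensor scale.

    Illustrative only: shows how ternary quantization replaces full-precision
    weights with 2-bit codes, trading accuracy for far less data and compute.
    """
    q = np.zeros_like(w, dtype=np.int8)
    q[w > threshold] = 1
    q[w < -threshold] = -1
    # Scale = mean magnitude of the non-zero weights, so q * scale
    # approximates the original tensor.
    nonzero = np.abs(w[q != 0])
    scale = nonzero.mean() if nonzero.size else 1.0
    return q, scale

w = np.array([0.8, -0.03, 0.4, -0.6, 0.01])
q, s = ternary_quantize(w)
print(q)       # [ 1  0  1 -1  0]
print(q * s)   # dequantized approximation of w
```

Engines like the one described can then execute most multiplications as sign flips and additions, which is where the speed and power savings come from.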

Deep Learning Software Development Environment
Socionext has also built a deep learning software development environment. Incorporating TensorFlow as the base framework, it allows developers to perform original, low-bit "quantization-aware training" or "post-training quantization". When used in combination with the new chip, users can choose and apply the optimal quantization technology to various neural networks and execute highly accurate processing. The new chip will add the most advanced computer vision functionality to small form factor, low-power edge devices. Target applications include advanced driver assistance system (ADAS), security camera, and factory automation among others.
Socionext is currently conducting circuitry fine-tuning and performance optimization through the evaluation of this prototype chip. The company will continue working on research and development with the partner companies towards the completion of the NEDO-commissioned project, to deliver the AI Edge LSI as the final product.

NEDO Project title:
Project for Innovative AI Chips and Next-Generation Computing Technology Development
Development of innovative AI edge computing technologies
Updatable and Low Power AI-Edge LSI Technology Development


They also have products like what has been covered previously:

4th Generation Smart Graphic Display Controllers Enable Panoramic and Multi-Displays

Langen, Germany, Milpitas, Calif., and Yokohama, Japan, July 15, 2022 --- Socionext, a global leader in the design and development of innovative System-on-Chip products, has announced a new series of smart display controllers, “SC1721/ SC1722/ SC1723 Series”, certified with ISO26262 for functional safety. Samples will be available at the end of July 2022.

The automotive industry is currently undergoing major transformations that occur approximately once every 100 years. The E/E (Electrical/Electronic) architecture, which is the system structure of automobiles, is changing from a distributed architecture to a domain/zone architecture. Automakers are adopting integrated cockpit systems linking multiple displays, such as meters, In-Vehicle Infotainment (IVI), and head-up displays. Larger display sizes and screen resolutions are also driving the demand for improved image quality. Due to the changes, complying with the ISO26262 functional safety standard is critical for developing new automotive ADAS and infotainment systems.

Socionext improves vehicle safety by adding a mechanism to monitor external LED driver error detection and internal algorithms, and supports functional safety (ASIL-B) by complying with the ISO 26262 development process.

These features enable new architectures, such as panoramic displays for dashboards, to meet a growing trend of larger multi-display applications.

Trust all had a safe, happy and enjoyable Christmas.

Just been having a look at Socionext again and the SynQuacer set-up that we appeared to be tied in with back in 2020, as per the above post.

Part of the reason is I found something by TI (Texas Instruments) from early Sept to do with a patch in the code of their AM62A family of SoCs, and there is a line within the code referencing SynQuacer.

This is to do with their Sitara processor.

When you also look at the product structure there are references to the same processor cores as the SynQuacer.

Not to say we are definitely involved here, however there are some dots between our earlier work with Socionext & the SynQuacer, TI and the SynQuacer, and its design and end use.

Some links and snips etc below to have a look through.



(screenshots: 1672064689626.png, 1672064786461.png)


Some other code info links with Socionext in there as well.





(screenshots: 1672064391185.png, 1672064522300.png, 1672064557248.png, 1672064981257.png)



The low-cost AM62x Sitara™ MPU family of application processors is built for Linux® application development. With scalable Arm® Cortex®-A53 performance and embedded features such as dual-display support and 3D graphics acceleration, along with an extensive set of peripherals, the AM62x device is well suited to a broad range of industrial and automotive applications, while offering intelligent features and an optimized power architecture.

Some of these applications include:

  • Industrial HMI
  • EV charging stations
  • Touchless building access
  • Driver monitoring systems
AM62x Sitara™ processors are industrial-grade in the 13 x 13 mm package (ALW) and can meet the AEC-Q100 automotive standard in the 17.2 x 17.2 mm package (AMC). Industrial and Automotive functional safety requirements can be addressed using the integrated Cortex-M4F cores and dedicated peripherals, which can all be isolated from the rest of the AM62x processor.





(screenshot: 1672065320604.png)




Also found that Plumerai run their algorithms / AI on the Sitara set-up, and curious if it's all interconnected :unsure:


(screenshot: 1672065715677.png)
 

Sirod69

bavarian girl ;-)
This recent "poking" of and subsequent commenting on a Linkedin poster is not productive in my opinion. There is no upside because those involved are escalating a silly non event into a potential fight. Why? Because the people involved are males (I think) with ego's and what starts out with a innocent question, claim, or comment, regardless of its validity, soon becomes a challenge....not unlike a group of boys gathering together and saying, "yeah?" ..."wanna bet?" ...."my dad can beat your dad", and so on. In this instance, the linkedin poster (the MB engineer) took his ball and went home rather than engaging or playing with Mr. Chapman.

Clearly, to me at least, any backing and forthing between well intentioned Brainchip cheerleading stockholders and ANYONE whether on linkedin, twitter or whatever, has got to be a cringe worthy moment for Brainchip executives if they are aware of it. And I believe they are aware of Mr. Chapman, and I believe they are aware of the MB engineer so they are aware. Ergo,...a possible cringeworthy moment which our swamped management team does not need.

Whether Mr. Chapman and others are right is not the point. In fact, I agree with them. But it is the public facing venue that has me concerned.

What we all want will be accomplished in time if we just let our company do what it is doing and refrain from poking someone,...anyone, with an association to our (Brainchips) customers - real, implied, hoped for or speculated about. This is an open stock forum....I get it, but in my opinion the risk of poking people on linkedin, or on this site, so that they literally block you (!!!) could be huge.

No one here has any idea of who may be watching sites such as linkedin.

Personally, I think we as quasi ambassadors for Brainchip need to act responsibly and be on our best public facing behavior, whether someone says their GPU dad can beat your Akida dad with one hand tied behind his back. Feel chagrined, but turn the other cheek.

We are sooooo close to what we all want. There are many non believers of Brainchip or just not aware of our technology out there, but what do we really ever gain by saying, or implying, that "my dad can whup your dad" ? And though in this case we are right about our relationship with MB this one person apparently employed by MB threatened to block someone for pointing that out. Think about that FACT.

That's my opinion, ....and now I expect a few of you may want to whup on me, metaphorically speaking.
dippY

"
Of course I've now also read your discussion about this Francois, *laughs*, after having really paused for two days.
Well, he definitely didn't unsubscribe, and Anil Mankar also replied to his original post, which I already found interesting. Now I'll take a closer look at BMW again to see if I can find anything about it. I thought it was very nice what you wrote there, "BMW comes from Bavaria", you know ;-)
Oh, by the way, this Herr Francois doesn't bother me any more.

anil.jpg
 

Sirod69

bavarian girl ;-)
On LinkedIn I found connections from some BMW employees to NVIDIA and Prophesee employees.

Till now, nothing important.


BMW: Man and machine are networked

For trade fair visitors, it could be a pretty crazy experience: you put on VR glasses, immerse yourself in a virtual world, but drive a real car in the real world. "Mixed Reality" is the name of the topic being demonstrated on a closed-off site in Las Vegas. It is fitting that the Munich-based company is giving a new perspective on future technologies, networking between man and machine, sustainability and entertainment systems.

The second topic at BMW is likely to be more tangible. The manufacturer is planning its "New Class" for 2025. The name is based on historical models such as the 1500, which was built between 1962 and 1972 - but the new "New Class" will be a platform on which BMW will build its electric vehicles in the future. BMW boss Oliver Zipse will give a keynote speech at the fair.


This the article about Dee:

BMW To Finally Reveal What Dee Is All About On January 5 At CES


According to another cryptic video released on social media, BMW announces we'll get to meet Dee on January 5 during the first day of CES 2023. It likely has something to do with artificial intelligence and may or may not be linked to the already confirmed concept car the luxury brand will showcase in Las Vegas. During the meeting held in early November to present the Q3 2022 quarterly report, CFO Dr. Nicolas Peter said a new Vision concept is coming to CES.

 

Boab

I wish I could paint like Vincent
Slim pickings I think, but the bit about the lessening of any blurring caught my eye.

 

TECH

Regular
Brainchip Inc.....A global company

Brainchip Inc.....An International IP Supplier

Brainchip Inc.....World-Leader of Edge AI

Expect our company to receive some acknowledgements at the fast-approaching CES, not only from Mercedes Benz.

It's also important to acknowledge that we, Brainchip, have had a very busy year; many positive connections have been sealed with a handshake. In the coming year I personally expect to see much progress, with more companies joining us and product/s starting to appear in the marketplace, and over the next 24 months I also expect growth to really ramp up.

My opinion only.

Love Brainchip :love::geek:
 
A Christmas thought bubble.

Assume you are head of the foundry arm of a major technology company.

Assume you are considering the inclusion of a revolutionary one of a kind neuromorphic technology IP from a small Australian based technology company into your portfolio.

Would you simply accept the small Aussie company’s assurances as to what it does or would you require it to be internally tested and validated?

If as seems logical you would require it to be tested and validated internally would you:

a) Ask your Von Neumann compute experts to undertake the test and validation; or

b) Ask your Neuromorphic research arm under the control of a world renowned neuromorphic technology thought leader to undertake the test and validation.

Having most likely chosen option b), would you not also ask the Australian company to provide its claimed performance figures, so that the testing and validation can be undertaken in some sort of context?

Assuming the end result is that the decision is favourable to the Australian company and its IP is to be included in your foundry IP portfolio, would you not also require, as part of the technical details you will make available to customers interested in the IP, the results of benchmarking, so customers can weigh up the relative advantages and disadvantages of adopting any particular IP?

Once again, would you simply accept the benchmarking results from the Australian company, or would you ask your internal experts to verify that these performance comparisons are accurate?

Again, I think it likely you would be reluctant to simply accept the word of the Australian company and would ask for confirmation of the benchmarks by your own internal experts.

So taking the above train of thought and applying it to a known fact (the Australian company Brainchip has told shareholders it is working on benchmarking AKIDA), would it not seem likely that the results have been provided to Intel Foundry, and that Intel have also verified the results prior to announcing the inclusion of the AKIDA IP?

This being so, could it not be that one part of Rob Telson's 'more to come' at CES 2023 might be the public release of the AKIDA technology benchmarking, as verified by Intel???

My speculation only so DYOR
FF

AKIDA BALLISTA
 

SERA2g

Founding Member
Slightly off topic, Merry Christmas all. Have kept the phone use to a minimum these past few days so have missed a bit on the forum but did check in once daily to read the top comment.

Appreciate the effort everyone puts in here, especially while we’re all on holidays!

Hoping everyone stays safe over the break and has a happy and healthy 2023.

May it be the year of akida.

Cheers!
 

SERA2g

Founding Member
Speck

“Speck™ is a fully event-driven neuromorphic vision SoC. Speck™ is able to support large-scale spiking convolutional neural network (sCNN) with a fully asynchronous chip architecture”

Edit to add: I'm not suggesting Speck contains Akida. It and its sister SoC Xylo have already been discussed here and are not Akida. Just found the paper interesting, so shared the context with it.


4D1C8151-65E5-40B5-964B-BAFE882D6E91.png


Here’s a link to the paper supported by synsense and IBM - https://www.frontiersin.org/articles/10.3389/fnins.2022.1068193/full

I find it interesting that a paper can be written on spiking neural networks but not reference Peter van der Made’s many peer reviewed works.
 
It might only be a small thing, but in the court rooms of common law countries such as Australia it is considered very bad form to selectively quote from source documents, and it is always considered an attempt to mislead the court.

Brainchip, bless their little cotton socks, unlike a certain company who shall remain named above, have on their website a link to an article from the EETimes titled 'Cars that think like you'. They do not edit the article to remove those parts which cover a competitor, nor do they take the title of the article and rewrite it as 'SynSense. Cars that think like you' so that it appears to be saying that their product creates a car that 'thinks like you', which would mislead the reader and plagiarise the catch phrase of Mercedes Benz.

To address this shortfall in ethics I have extracted the balance of the article which the above named company saw fit to edit out and rename:

"BRAINCHIP AKIDA

The Mercedes EQXX concept car, debuted at CES 2022, features BrainChip's Akida neuromorphic processor performing in-cabin keyword spotting. Promoted as "the most efficient Mercedes-Benz ever built," the car takes advantage of neuromorphic technology to use less power than deep-learning-powered keyword spotting systems. This is crucial for a car that is supposed to deliver a 620-mile range (about 1,000 km) on a single battery charge, 167 miles further than Mercedes' flagship electric vehicle, the EQS.

Mercedes said at the time that BrainChip’s solution was 5 to 10× more efficient than conventional voice control when spotting the wake word “Hey Mercedes”.

Neuromorphic car Mercedes EQXX: Mercedes' EQXX concept EV has a power efficiency of more than 6.2 miles per kWh, almost double that of the EQS. (Source: Mercedes)


“Although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years,” according to Mercedes. “When applied at scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.”

“[Mercedes is] looking at big issues like battery management and transmission, but every milliwatt counts, and the context of [BrainChip’s] inclusion was that even the most basic inference, like spotting a keyword, is important when you consider the power envelope,” Jerome Nadel, chief marketing officer at BrainChip, told EE Times.

Nadel said that a typical car in 2022 may have as many as 70 different sensors. For in-cabin applications, these sensors may be enabling facial detection, gaze estimation, emotion classification, and more.

"From a systems architecture point of view, we can do it in a 1:1 way, there's a sensor that will do a level of pre-processing, and then the data will be forwarded," he said. "There would be AI inference close to the sensor and... it would pass the inference metadata forward and not the full array of data from the sensor."

The idea is to minimize the size and complexity of data packets sent to AI accelerators in automotive head units, while lowering latency and minimizing energy requirements. With a potential for 70 Akida chips or Akida-enabled sensors in each vehicle, Nadel said each one will be a "low-cost part that will play a humble role," noting that the company needs to be mindful of the bill of materials for all these sensors.

BrainChip Akida neuromorphic processor in car system
BrainChip sees its neuromorphic processor next to every sensor in a car. (Source: BrainChip)

Looking further into the future, Nadel said neuromorphic processing will find its way into ADAS and autonomous vehicle systems, too. There is potential to reduce the need for other types of power-hungry AI accelerators.

“If every sensor had a limited, say, one or two node implementation of Akida, it would do the sufficient inference and the data that would be passed around would be cut by an order of magnitude, because it would be the inference metadata… that would have an impact on the horsepower that you need in the server in the trunk,” he said.

BrainChip’s Akida chip accelerates spiking neural networks (SNNs) and convolutional neural networks (via conversion to SNNs). It is not tailored for any particular use case or sensor, so it can work with vision sensing for face recognition or person detection, or other audio applications such as speaker ID. BrainChip has also demonstrated Akida with smell and taste sensors, though it’s more difficult to imagine how these sensors might be used in automotive (smelling and tasting for air pollution or fuel quality, perhaps).

Akida is set up to process SNNs or deep learning CNNs that have been converted to the spiking domain. Unlike native spiking networks, converted CNNs retain some information in spike magnitude, so 2– or 4–bit computation may be required. This approach, hwoever, allows exploitation of CNNs’ properties, including their ability to extract features from large datasets. Both types of networks can be updated at the edge using STDP — in the Mercedes example, that might mean retraining the network to spot more or different keywords after deployment.

Neuromorphic Car Mercedes EQXX interior

Mercedes used BrainChip’s Akida processor to listen for the keyword “Hey Mercedes” in the cabin of its EQXX concept EV. (Source: Mercedes)
Mercedes has confirmed that “many innovations”, including “specific components and technologies” from the EQXX concept car, will make it into production vehicles, reports Autocar. There is no word yet on whether new models of Mercedes will feature artificial brains."


You only get one chance in this world to be considered as ethical to the core of your existence and unfortunately this chance has been thrown away.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 50 users
It might only be a small thing, but in the courtrooms of common law countries such as Australia it is considered very bad form to quote selectively from source documents, and it is always considered an attempt to mislead the court.

BrainChip, bless their little cotton socks, unlike a certain company who shall remain named above, link from their website to an EE Times article titled 'Cars that think like you'. They do not edit the article to remove the parts that cover a competitor, nor do they retitle it 'SynSense: Cars that think like you' so that it appears to say their product creates a car that 'thinks like you', which would both mislead the reader and plagiarise the catchphrase of Mercedes-Benz.

To address this shortfall in ethics, I have extracted the balance of the article which the above-named company saw fit to edit out and rename:

"BRAINCHIP AKIDA

The Mercedes EQXX concept car, debuted at CES 2022, features BrainChip’s Akida neuromorphic processor performing in-cabin keyword spotting. Promoted as “the most efficient Mercedes-Benz ever built,” the car takes advantage of neuromorphic technology to use less power than deep-learning-powered keyword spotting systems. This is crucial for a car that is supposed to deliver a 620-mile range (about 1,000 km) on a single battery charge, 167 miles further than Mercedes’ flagship electric vehicle, the EQS.

Mercedes said at the time that BrainChip’s solution was 5 to 10× more efficient than conventional voice control when spotting the wake word “Hey Mercedes”.

Mercedes’ EQXX concept EV has a power efficiency of more than 6.2 miles per kWh, almost double that of the EQS. (Source: Mercedes)


“Although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years,” according to Mercedes. “When applied at scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.”

“[Mercedes is] looking at big issues like battery management and transmission, but every milliwatt counts, and the context of [BrainChip’s] inclusion was that even the most basic inference, like spotting a keyword, is important when you consider the power envelope,” Jerome Nadel, chief marketing officer at BrainChip, told EE Times.

Nadel said that a typical car in 2022 may have as many as 70 different sensors. For in-cabin applications, these sensors may be enabling facial detection, gaze estimation, emotion classification, and more.

“From a systems architecture point of view, we can do it in a 1:1 way, there’s a sensor that will do a level of pre-processing, and then the data will be forwarded,” he said. “There would be AI inference close to the sensor and… it would pass the inference metadata forward and not the full array of data from the sensor.”

The idea is to minimize the size and complexity of data packets sent to AI accelerators in automotive head units, while lowering latency and minimizing energy requirements. With a potential for 70 Akida chips or Akida-enabled sensors in each vehicle, Nadel said each one will be a “low-cost part that will play a humble role,” noting that the company needs to be mindful of the bill of materials for all these sensors.
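To see why forwarding inference metadata rather than raw sensor data matters so much, here is a back-of-the-envelope comparison. All of the figures below (frame size, frame rate, metadata packet size, event rate) are my own illustrative assumptions, not numbers from BrainChip or Mercedes:

```python
# Back-of-the-envelope: raw sensor frames vs. inference metadata per sensor.
# All numbers are illustrative assumptions, not BrainChip or Mercedes figures.

RAW_FRAME_BYTES = 640 * 480 * 2   # one 640x480 frame at 16 bits per pixel
FPS = 30                          # frames per second from the sensor
META_BYTES = 64                   # one inference result: label, confidence, bbox
EVENTS_PER_SEC = 5                # metadata sent only when something is detected

raw_rate = RAW_FRAME_BYTES * FPS         # bytes/s if full frames are forwarded
meta_rate = META_BYTES * EVENTS_PER_SEC  # bytes/s if only metadata is forwarded

print(f"raw:  {raw_rate / 1e6:.1f} MB/s")
print(f"meta: {meta_rate} B/s")
print(f"reduction: {raw_rate // meta_rate}x")
```

Even with generous assumptions for the metadata side, the per-sensor data rate drops by several orders of magnitude, which is the "order of magnitude" saving Nadel describes multiplied across 70 sensors.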

BrainChip sees its neuromorphic processor next to every sensor in a car. (Source: BrainChip)

Looking further into the future, Nadel said neuromorphic processing will find its way into ADAS and autonomous vehicle systems, too. There is potential to reduce the need for other types of power-hungry AI accelerators.

“If every sensor had a limited, say, one or two node implementation of Akida, it would do the sufficient inference and the data that would be passed around would be cut by an order of magnitude, because it would be the inference metadata… that would have an impact on the horsepower that you need in the server in the trunk,” he said.

BrainChip’s Akida chip accelerates spiking neural networks (SNNs) and convolutional neural networks (via conversion to SNNs). It is not tailored for any particular use case or sensor, so it can work with vision sensing for face recognition or person detection, or other audio applications such as speaker ID. BrainChip has also demonstrated Akida with smell and taste sensors, though it’s more difficult to imagine how these sensors might be used in automotive (smelling and tasting for air pollution or fuel quality, perhaps).

Akida is set up to process SNNs or deep learning CNNs that have been converted to the spiking domain. Unlike native spiking networks, converted CNNs retain some information in spike magnitude, so 2- or 4-bit computation may be required. This approach, however, allows exploitation of CNNs' properties, including their ability to extract features from large datasets. Both types of networks can be updated at the edge using STDP (spike-timing-dependent plasticity); in the Mercedes example, that might mean retraining the network to spot more or different keywords after deployment.
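The point about spike magnitude is worth unpacking: instead of binary spikes, a converted CNN's activations are mapped to a small number of integer levels. Here is a minimal sketch of uniform 4-bit activation quantization, the general technique the article alludes to; the function name and scale handling are my own and are not BrainChip's actual conversion flow:

```python
import numpy as np

def quantize_activations(x, bits=4):
    """Uniformly quantize non-negative activations to 2**bits - 1 levels.

    Converted CNNs keep some information in spike magnitude, so each value
    is mapped to an integer level rather than a binary spike. This is an
    illustrative sketch, not BrainChip's actual quantizer.
    """
    levels = 2 ** bits - 1
    peak = x.max()
    scale = peak / levels if peak > 0 else 1.0
    q = np.round(x / scale).astype(np.int32)  # integer "spike magnitudes"
    return q, scale

acts = np.array([0.0, 0.1, 0.5, 1.0, 1.5])  # example ReLU outputs
q, scale = quantize_activations(acts, bits=4)
print(q)          # integer levels in the range [0, 15]
print(q * scale)  # dequantized approximation of the inputs
```

With only 15 positive levels, each activation fits in 4 bits instead of 32, which is where the memory and energy savings of low-bit inference come from.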

Mercedes used BrainChip’s Akida processor to listen for the keyword “Hey Mercedes” in the cabin of its EQXX concept EV. (Source: Mercedes)

Mercedes has confirmed that “many innovations”, including “specific components and technologies” from the EQXX concept car, will make it into production vehicles, reports Autocar. There is no word yet on whether new models of Mercedes will feature artificial brains."


You only get one chance in this world to be considered as ethical to the core of your existence and unfortunately this chance has been thrown away.

My opinion only DYOR
FF

AKIDA BALLISTA
Leaving ethics to one side, you might also ask who is more fearful of their competition: SynSense or BrainChip?

I think the evidence supports the conclusion that BrainChip stands in the marketplace ready to take on all competitors, face to face, toe to toe.

Others skulk around in the shadows, frightened of being compared to AKIDA technology or even of admitting its very existence.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Haha
Reactions: 44 users