BRN Discussion Ongoing

Don't recall seeing this posted previously, but it was from a May 2022 paper, so quite possibly just missed it.

We get a write-up in it as "another relevant approach".

Source Doc

[Screenshots of the relevant section attached]
 
Reactions: Like, Fire, Love (30 users)
Also hadn't read this article previously. Posted only a few weeks back, and Nandan has a section in the interview.

Some may have seen it...or not.

Found it interesting that another interviewee was Synaptics, given their field/products, and that their founders (30 yrs ago) were tied to neuromorphic computing, e.g. Carver Mead was one.


Musing whether there's any potential (future) relationship opp here :unsure:



How smart will the edge get?

By Gary Hilson | Apr 6, 2023 10:00am
Tags: Synaptics, BrainChip, ITTIA, Smart Edge
Local, as in edge, machine learning is more feasible now than ever, but not every application works that way. (Getty Images)
What goes around comes around: after two decades of moving data and applications to a central cloud to be accessed by “dumb terminals,” the tide has turned. The edge is getting smarter, but how smart can it get?


Intelligence at the edge could be as simple as running analytics on data without having to send it back to a central data center, or even using artificial intelligence (AI) to do simple inference. Until recently, best practice was to do AI training and machine learning in the cloud – now low-power hardware is making it feasible for some of that machine learning to be done on a local edge device.

Being able to do more at the edge has some obvious advantages – you don’t need to consume energy and network capacity to send the data back and forth, nor do you have to worry about securing the data while in transit. And with some AI functionality becoming somewhat ubiquitous and affordable for most businesses, it appears inevitable that the edge will keep getting smarter.

Automation drives smart edge adoption

Embedded systems and devices are slowly taking steps to replace people, ITTIA founder Sasan Montaseri said in an interview with Fierce Electronics. “We allow them to capture and digest as much data as possible and process that really rapidly.” This shift, he said, is exemplified by autonomous cars and robotics in factories – embedded systems and devices can handle and store more data to aid with tasks such as preventative maintenance. It is accomplished by looking at patterns in the data, which makes the edge more intelligent.

Montaseri said one factor driving intelligence at the edge is that connectivity is not a hundred percent available, which means delays getting information back from the cloud. Another is that microprocessors and microcontrollers are becoming more powerful. This enables the necessary data management at the edge, he said, and allows devices to quickly analyze and make sense of data.

ITTIA is focused on providing the software necessary for edge data streaming, analysis and management for embedded systems and IoT devices – robust data management is foundational for doing AI and machine learning in embedded systems at the edge, Montaseri said.

ITTIA provides software for edge data streaming, analysis and management for embedded and IoT for uses such as transportation when it's not feasible to depend on a central cloud. (ITTIA)
Reliability is critical for smart edge devices, he added, whether it’s for industrial robotics, medical or transportation applications. “You want to make sure that they don't go down.”

What’s also becoming apparent is that not all edges are created equal – some will be smarter sooner than others depending on the use case and industry, such as robotics and medical devices. Montaseri said today’s embedded systems that gain intelligence through IoT deployments will be doing the jobs needed for the next generation of computing. “The nature of everything is changing,” he said. “We are seeing more security, more safety, and more functionality, like the ingestion rate and the query rate. Our focus is safety, security, and reliability.”


Not all edges are created equal

What makes the smart edge murky is the definition of edge, which means different things to different people, Nandan Nayampally, CMO at BrainChip, said in an interview with Fierce Electronics. He was previously at ARM for more than 15 years when the edge was viewed as primarily sensor driven. “That's how IoT kind of cropped up,” he said. “IoT is a sensor plus connectivity plus processing.” While a Dell or an Intel might think of the smart edge as another giant box that’s now smaller, the right starting point to him is IoT with AI.

AI on the edge is a step forward from a sensor just doing one function, with devices now having more storage, memory, and processing power. Nayampally said this battle between cloud and edge has been going on for a while, going back to the days of a terminal connected to a mainframe before the move to a client/server model. “What you realize is, however much we think that latency to cloud or connectivity to cloud is guaranteed, and the bandwidth assured, it's never going to be the case,” he said. “You need that intelligence and computational power at the edge.”

BrainChip's Akida processor can learn at the edge to address security and privacy while limiting network congestion. (BrainChip)

Having the smarts at the edge is beneficial for preventative maintenance in factories and patient monitoring, Nayampally said, both in terms of latency and privacy. “Anytime you send raw data or sensitive data out, you are obviously going to have challenges.” Privacy and security have become especially important to the general public, he added. BrainChip was started with the idea that edge computing was necessary and that any approach to AI at the edge had to be different from the cloud. “The cloud kind of assumes almost infinite resources and infinite compute.”

While compute resources at the edge are rather finite, more AI is possible due to advances in low-power hardware, including memory and systems on chip (SoCs), which means not all training and machine learning need be shipped back to the cloud. Nayampally said it’s a matter of scaling, with neuromorphic computing offering inspiration for how to deliver low-power intelligence at the edge. “Let's try to emulate the efficiency of it and start from there.”

Machine learning will increasingly happen at the edge, both because of inherent capability and out of necessity. Nayampally said some applications that require a real-time response can’t afford the delay between the edge and the cloud, or the power. “Any time you use radio and connectivity, especially to cloud, that burns a lot of power,” he said. “Radios are the most expensive parts of devices.” Smaller, more cost-effective devices may not be able to afford connectivity and need to do more compute locally.

Nayampally said the neuromorphic nature of BrainChip’s Akida platform allows it to learn at the edge, which also addresses security and privacy and reduces network congestion – today’s autonomous vehicles can generate a terabyte of data per day, he noted, so it makes sense to be selective about how much data needs to travel the network.

For the smart edge, simpler is better, and BrainChip’s processor delivers that from a computational standpoint as well as from a development and deployment standpoint, Nayampally said. “It's almost like a self-contained processor.” It is neuromorphic and event driven, so it only processes data when needed, and only communicates when needed, he said.
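
To make the event-driven idea concrete, here is a minimal Python sketch of that “only process when needed” principle – gating an expensive inference call behind a cheap change detector. This is purely illustrative: the function names and threshold are invented, and it is not BrainChip's API (Akida implements event-based processing in hardware at the neuron level).

```python
import numpy as np

ACTIVITY_THRESHOLD = 0.05  # hypothetical tuning parameter

def has_activity(prev_frame, frame, threshold=ACTIVITY_THRESHOLD):
    """Cheap change detector: mean absolute difference between frames."""
    diff = np.abs(frame.astype(np.float32) - prev_frame.astype(np.float32))
    return diff.mean() > threshold

def process_stream(frames, run_inference):
    """Invoke the expensive model only on 'events' (frames that changed)."""
    prev = None
    for frame in frames:
        if prev is None or has_activity(prev, frame):
            run_inference(frame)  # wake the accelerator only when needed
        prev = frame              # otherwise stay idle and save power
```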

Being event driven is an excellent example of how basic machine learning may express itself in a single device for the user or the environment – what Synaptics is calling the “edge of the edge,” said Elad Baram, director of product marketing for low-power edge AI. The company has a portfolio of low-power AI offerings operating at a milliwatt scale, enabling machine learning using minimal processing and minimal learning – an initiative in line with the philosophy of the tinyML Foundation, he said. While an ADAS uses gigabytes of memory, Synaptics is using megabytes.

Baram’s expertise is in computer vision, and Synaptics sees a lot of potential in any kind of sensing where the compute happens right next to where the data is coming from – moving data requires power, increases latency, and creates privacy issues. Organizations like tinyML are an indicator of how smart the edge could get. “We are at an inflection point within this year or next year,” he said. “This inflection point is where it's booming.”

Synaptics has a context-aware Smart Home SoC with an AI accelerator for the ‘edge of the edge’. (Synaptics)
Context-aware edge remains application-specific

Baram said that just as the GPU boom occurred in the cloud five years ago, the same evolution is now happening with tinyML. Workloads at the edge that previously required an Nvidia processor, such as detection and recognition, can now be done on a microcontroller. “There is a natural progression.”

Sound event detection is already relatively mature, Baram said, starting with Alexa and Siri and progressing to detecting glass breaking or a baby crying. “We are seeing a lot of use cases in smart home and home security around the audio space.” In the vision space, he said, Synaptics is supporting “context awareness” for laptops so they can detect whether a user is present or not; to ensure privacy, any imaging stays on the on-camera module – it never reaches the computer’s main processor.

Power, of course, is important for context-awareness applications, Baram said. “You don't want the power to the battery to drain too fast.” But having this awareness actually extends battery life, he said, because now the system understands whether the user is engaged with the device and its content, and can respond accordingly. “You approach the laptop, it's turned on and you log in and it's like magic. The machine just knows what you want to do, what you are doing, and it can adapt itself.”

Similarly, an HVAC system could adapt based on the number of occupants in a room, or a dishwasher could let you know how full it is. Baram said a fridge could be smart enough so you can know whether or not you need to pick up milk on the way home. Aside from smart laptops and home appliances, there are many safety applications in construction, manufacturing and agriculture that could benefit from context awareness. “The amount of use cases out there in the world is pretty amazing.”

Baram said the hardware is coming together to enable the smart edge, including machine learning, while algorithms and networking are also improving significantly. “The neural networks are way more efficient than they were a decade ago.” As compute capabilities advance, devices will be able to serve more general purposes, but for now processor and algorithm constraints mean smart edge devices will have targeted applications.

In the long run, making the edge smarter is ultimately contingent on successfully pulling all these elements together, which requires an AI framework, Baram said, such as TensorFlow, an open-source ML platform. “Those frameworks make it much easier to deploy a neural network into edge devices.”
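
As a concrete illustration of that last point, the snippet below shows the standard TensorFlow Lite conversion flow – one common way a framework like TensorFlow shrinks a trained model for an edge target. The model file names here are hypothetical; the API calls are stock TensorFlow.

```python
import tensorflow as tf

# Load a trained Keras model (file name is hypothetical).
model = tf.keras.models.load_model("presence_detector.h5")

# Convert to TensorFlow Lite with default optimizations
# (e.g. weight quantization), shrinking the model for edge deployment.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# Write out the flatbuffer; microcontroller ports (TensorFlow Lite
# for Microcontrollers) consume this same artifact, typically
# embedded as a C array.
with open("presence_detector.tflite", "wb") as f:
    f.write(tflite_model)
```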
 
Reactions: Like, Fire, Love (35 users)

Beebo

Regular
That is also my hope.

As I said before, the recent flurry of ARM announcements (Akida compatibility with all ARM processors; ARM's "more advanced" chip manufacture (IFS?)), plus the fact that SiFive (an up-and-coming competitor of ARM) and Akida are cozy, leads me to hope that the ARM/BrainChip presentation will reveal that the new ARM chip incorporates Akida as its AI block.

Some supporting reasons:

1. ARM presently has an in-house AI block called Helium available with its processor IP. Helium is lightweight AI compared to Akida, so replacing Helium with Akida would make the ARM chip "more advanced";

2. SiFive and Akida are a good fit and would give SiFive an advantage over present ARM processors, and ARM will need to swallow any "not-invented-here" attitude they may have if they are to keep up with SiFive's more efficient RISC-V architecture;

3. BrainChip has joined the ARM partnership group;

4. BrainChip and ARM have both joined the Intel Foundry Services (IFS) fellowship;

5. Why would ARM be doing a presentation for a company they barely know?

Of course, the counter-argument is that, since RISC-V is open-source, ARM is bringing out its own RISC-V processor, which would qualify as "more advanced".

Then again, an ARM RISC-V processor could be mated with Akida. That would be very advanced.
I’ve always pondered a theory where ARM initially wanted exclusive rights to Akida, and BrainChip declined. BrainChip then got cozy with SiFive, and that tipped the balance for ARM to deal with BrainChip non-exclusively.

I expect both ARM and SiFive to license IP in the 4th Quarter, if not sooner!
 
Reactions: Like, Fire, Thinking (37 users)

Beebo

Regular
I’ve always pondered a theory where ARM initially wanted exclusive rights to Akida, and BrainChip declined. BrainChip then got cozy with SiFive, and that tipped the balance for ARM to deal with BrainChip non-exclusively.

I expect both ARM and SiFive to license IP in the 4th Quarter, if not sooner!
Akida is potentially the secret sauce for ARM when it comes to competing with QCOM.

Onward and upward!
 
Reactions: Like, Fire, Wow (33 users)
I’ve always pondered a theory where ARM initially wanted exclusive rights to Akida, and BrainChip declined. BrainChip then got cozy with SiFive, and that tipped the balance for ARM to deal with BrainChip non-exclusively.

I expect both ARM and SiFive to license IP in the 4th Quarter, if not sooner!
Funny you just raised that point about SiFive / RISC-V.

Was just reading this recent article and below is a snip from it haha

Link to full article bottom of post.

Inside Arm’s vision for the ‘software-defined vehicle’ of the future

The chip giant is betting big on cars


April 11, 2023 - 12:33 pm


“One executive I was talking to said: ‘The best negotiating strategy when Arm comes in is to have a RISC-V brochure sitting on my desk’,” Jim Feldhan, the president of semiconductor consultancy Semico Research, said last year. “It’s a threat. Arm is just not going to have its super dominant position in five or 20 years.”


 
Reactions: Like, Fire, Haha (26 users)

IloveLamp

Top 20
[LinkedIn screenshot attached]
 
Reactions: Like, Fire, Love (38 users)

IloveLamp

Top 20
BMW feeling left out of the party.....how will they "compete" ........

I think I know how..........dyor

[LinkedIn screenshot attached]
 
Reactions: Like, Fire, Thinking (13 users)
It's hard to know if we should punish the directors for simply making a mistake that we all agreed to, at the time.

We all fell for the "let's just sell IP" line - oh, revenue will be virtually all profit! In reality this meant that we were going to hand over the heavy sales lifting to Renesas and MegaChips. Surprise surprise, that didn't work.
But it was worth a try.
So now we're going back to selling the actual MCU that engineers can just buy off the shelf.
And we've wasted a year.
We made this mistake because the product is 100% digital, so it can be scaled and made cheaply at any of the fab companies. We were unlucky in that no-one took a punt on us and is selling a product containing the Akida IP. If they had, we would be in a much better position - others would have had to follow suit. But they didn't.

At any rate we are back to the "let's produce the MCU" tack, a decision which I think we took quickly.
So I'll just be abstaining, worried that any push back will just add to our problems.
I’m not sure you actually have a good grasp of what’s going on. To say that the deals with Renesas and MegaChips have failed is complete rubbish. Renesas taped out their chip containing our IP in Dec last year. We signed with MegaChips six months after Renesas, so it’s not surprising there hasn’t been any revenue out of them yet.
 
Reactions: Like, Love, Fire (21 users)

Foxdog

Regular
Reactions: Like (3 users)

Frangipani

Top 20
So is that it

Hi @Makeme 2020,

honestly, what kind of reply did you expect to your provocative rhetorical 🎺 question?

In case you didn’t get the hint - I’d actually say my choice of album and song title was a pretty accurate reflection of how a lot of posters in this forum have been feeling today:
Kind of Blue (ranging from disappointment to disbelief with the 4C), yet a defiant
So What? (conviction that Brainchip’s future remains bright despite the 4C seemingly suggesting lack of interest from potential customers, confidence that revenue will eventually come, alas later than hoped for & let’s cross our fingers for some surprise reveals before/at the AGM)

Trumpet players occasionally use mutes to purposely change their instrument’s timbre (tone colour) or lower its volume. So if their sound is a little muffled at times, it doesn’t mean they’ve stopped playing altogether. And once in a while they need to take their instrument down and empty the spit valves - but don’t worry, it’s mostly water (condensation of the player’s warm moist breath, to be precise) and very little actual spit. Also, playing the trumpet can be quite taxing on your lips, and you may therefore find it necessary to remove the mouthpiece from your lips from time to time and rest. And last but not least, there are those kinds of rests that are part of your score and thus intended. The composer may even have chosen to write in a general pause - the absence of sound in all instruments - as a powerful means of expression. Sometimes we forget how important silence is in music.

For what it’s worth: the German word for an instrumental mute is “Dämpfer” - this word can also be used metaphorically in the sense of “putting a damper on something”. So while the mood may have been a little subdued today, you should soon be hearing that familiar brilliant sound of trumpets once again, if you choose not to leave the concert hall early, which would indeed be a shame and a waste of money in my eyes. And guess what - you are very welcome to join the brass ensemble on stage, playing the trombone or even the tuba, if you prefer that kind of sound over that of a trumpet, as long as your bass line contribution is mostly harmonious - some disharmony is fine, though, and in fact at times even desirable and refreshing:

“Despite their differences, consonance and dissonance tend to work well together in music. Like a good story, tonal music needs conflict to generate tension to drive the story. Dissonance creates that tension in the musical story. The conflict can be, but is not required to be, resolved with consonance. Essentially, the composer creates a sense of movement in music by creating tension using dissonant sounds and then releases that tension by returning to consonant sounds.”

Wouldn’t it be gratifying if we all ended up making wonderful music together? After all, aren’t we all in awe of this masterpiece of a composition?

P.S.: Interesting trivia: “Trumpet-like instruments have historically been used as signalling devices in battle or hunting, with examples dating back to at least 1500 BC. They began to be used as musical instruments only in the late 14th or early 15th century.” (Wikipedia)

Doesn’t “Akida Ballista!” sound just like a rousing fanfare? 🎺🎺🎺

I guess there is more than just one way to turn lemons into lemonade.
 
Reactions: Like, Love, Fire (14 users)

IloveLamp

Top 20
Screenshot_20230429_062630_LinkedIn.jpg
 
Reactions: Like, Love, Thinking (19 users)

IloveLamp

Top 20
Reactions: Like (3 users)

Foxdog

Regular
No, I'll just wait for the non-announcement, thanks.
 
Reactions: Like, Haha (2 users)

IloveLamp

Top 20
Reactions: Like (2 users)

Foxdog

Regular
SynSense
 
Reactions: Like (1 user)

The Pope

Regular
Yes, there is merit in what you are saying, due to SynSense and BMW comments via SynSense website news on 15 April 2022. Refer link below


Then there is this from the TSE forum on 27 Feb 22, with a potential dot join between BRN and BMW

I recall banter between posters on TSE linked to the above, but doubt anyone will try to post links (dot joining) to convince you BRN tech is definitely in BMW’s upcoming EV range.

Maybe BMW have changed camps in the last 12 months, exploring AI tech for their vehicles with BRN. From a quick Google there doesn’t appear to be any announcement by BRN that they are exploring uses of AI tech with BMW like that noted by SynSense in the article above on 15 April 2022.
Maybe BRN / BMW have this under an NDA, unlike SynSense. Who knows, but doesn’t it take a few years to develop tech into products before rolling out to customers?

BRN can’t be everywhere, but BRN TSE believers like us wish it will be in BMW and many other car manufacturers. Fingers crossed BRN is everywhere for us shareholders.
 
Reactions: Like, Love, Haha (11 users)
Interesting article about differing views on the possibility of fully autonomous (Level 5) driving:


"Leading Chinese automaker BYD claims completely autonomous driving (AD) is ‘basically impossible’, and that the automation technology would better serve streamlining manufacturing processes.

Translated from Mandarin by CNBC, BYD spokesperson Li Yunfei said “We think self-driving tech that’s fully separated from humans is very, very far away, and basically impossible.”

"Despite Mercedes-Benz having one of the most advanced ADAS systems on the market, with Level 2 autonomous driving systems across its range and the S-Class being offered with Level 3 autonomous technology in Germany, CEO Ola Kallenius said “I think we will surely be deep into the [20]30s before the whole world goes to that (self-driving tech).”

"Elon Musk says Tesla vehicles will soon achieve full self-driving autonomy and will “be able to show to regulators that the car is safer, much more so, than the average human”.

Tesla is yet to receive regulatory approval for its systems, but as Musk says, “we’ve got to prove it to regulators and get the regulatory approvals, which is outside of our control.”
 
Reactions: Like (8 users)

Foxdog

Regular
Yes, there is merit in what you are saying, due to SynSense and BMW comments via SynSense website news on 15 April 2022. Refer link below


Then there is this from the TSE forum on 27 Feb 22, with a potential dot join between BRN and BMW

I recall banter between posters on TSE linked to the above, but doubt anyone will try to post links (dot joining) to convince you BRN tech is definitely in BMW’s upcoming EV range.

Maybe BMW have changed camps in the last 12 months, exploring AI tech for their vehicles with BRN. From a quick Google there doesn’t appear to be any announcement by BRN that they are exploring uses of AI tech with BMW like that noted by SynSense in the article above on 15 April 2022.
Maybe BRN / BMW have this under an NDA, unlike SynSense. Who knows, but doesn’t it take a few years to develop tech into products before rolling out to customers?

BRN can’t be everywhere, but BRN TSE believers like us wish it will be in BMW and many other car manufacturers. Fingers crossed BRN is everywhere for us shareholders.
Thanks for your considered post, Pope. I would absolutely love to see us in BMW and Merc together - what a coup that would be. We partner with Prophesee, but they and BMW state a partnership with SynSense only for their neuromorphic smart cockpits. There is no mention of BrainChip. Why would we be under an NDA and not SynSense (makes NoSense)? Astonishingly, this opinion is considered 'ignorant' by some, but perhaps the dog ate their homework instead. Keep up the good work, Pope - I enjoy your balanced contributions here 👍
 
Reactions: Like (10 users)

Deleted member 118

Guest
Reactions: Like, Fire, Love (10 users)

Deleted member 118

Guest
Reactions: Like, Fire (5 users)