BRN Discussion Ongoing

Ybbs

Emerged
Thank you for your prompt reply. I will look for it and buy two. I will continue to follow BRN indefinitely. Only when the SP touches the upper chart line at 7 will I give my wife a shirt.🤫🤫🤫🤫
 
  • Haha
Reactions: 2 users

Ybbs

Emerged
  • Haha
Reactions: 1 users

Diogenese

Top 20
  • Haha
Reactions: 6 users

Ybbs

Emerged
He will come.
 

Rach2512

Regular
 
  • Like
  • Fire
Reactions: 10 users

TheDrooben

Pretty Pretty Pretty Pretty Good


Happy as Kevin
 
Last edited:
  • Like
  • Thinking
  • Fire
Reactions: 19 users

7für7

Top 20
Thank you for your prompt reply. I will look for it and buy two. I will continue to follow BRN indefinitely. Only when the SP touches the upper chart line at 7 will I give my wife a shirt.🤫🤫🤫🤫

Ehhh why should I know if you will give your wife a shirt?
Don’t get it…
Question Mark What GIF by Jukebox Saints


Let’s see who will be triggered
 
  • Haha
Reactions: 1 users

ChrisBRN

Emerged
Last edited:
  • Like
  • Fire
  • Love
Reactions: 36 users

Frangipani

Top 20
As suspected, the recent article 👆🏻 that claimed BrainChip “had secured an order from Nex Novus for medical sensing applications” was wrong:

There will be a live demo of the made-in-Croatia Neuromorphyx Vision NeuroNode at the BrainChip booth at Embedded World 2026 in Nürnberg (10–12 March).

“Experience the Vision NeuroNode™ tiny, compact and rugged form in-person! Featuring the AKD1500 neuromorphic co-processor from Brainchip Inc. pushing the boundaries of high-performance, low-latency, Spiking Neural Networks (SNN) for energy efficient edge AI.

POWER & COMPUTE EFFICIENCY < 300 mW while delivering 800 GOPS

ON-CHIP LOCAL MEMORY 1 MB
CLOCK FREQUENCY RANGE 5 – 400 MHz
SILICON PROCESS 22 nm FD-SOI CMOS digital logic
PACKAGE 7×7 mm MFCTFBGA169, 0.5 mm pitch

Find us at:
Hall 5 / Booth 5-213 (10–12 March 2026, Nuremberg, Germany)

NeuroNode™​

Deployment time - seconds.
Operational lifetime - years.
Scalability factor - unparalleled.

Rugged build.
Always-on sensing.
Energy-efficient processing.
Battery-powered operation.
Low-latency reactions.

The Vision NeuroNode™ is armed with an FPGA, a neuromorphic AI accelerator and an event-based neuromorphic vision sensor.
Our IP and architecture allow up to multiple years* of field operation on a single battery pack, integrated in our compact 5×5×5 cm IP67 case.

*Depending on scene activity and inference.

We offer solar panels and add-on battery packs as attachments that further prolong the operational life-time and extend capabilities.

  • Fast tracking, equivalent to >10,000 FPS
  • Extreme lighting conditions >120 dB
  • Native power efficiency <10 mW
Technology developed by Prophesee and Sony.
Skillfully engineered into compact modules by CenturyArks.
We offer a wide range of sensors and lenses to meet our client requirements depending on deployment location, terrain and field conditions.
HD 1280 x 720 Sony IMX636 / IMX646 (via CenturyArks)
VGA 640 x 512 Sony IMX 637 / 647 (via CenturyArks)
320 x 320 ultra low-power GENX320 (via Prophesee)
Explore sensor features in the Sensor options section →

[…]

NeuroHive™​

Vision, Audio, Radar, IMU are different NeuroNodes™ forming the hive mind network. Distributed intelligence, at the edge.

NeuroHive™ provides your team with the ability to gather mission critical intel from the field, live. Scalable and reliable.


Over-the-air SNN model switching.
Tracking target velocity and direction.
Fleet management.
Orchestrated systems.
External API triggers.
Integration to existing systems and comms.”




The Neuromorphyx LinkedIn account is now active:

Saw this earlier today…





Meanwhile there has been another post, this time mentioning BrainChip:















I also noticed the Nex Novus website looks different than a month ago:




As Terroni had already shared back in January, the founder and CEO of Nex Novus - and also Co-Founder and CEO of Neuromorphyx - is Ivan Projić, who resides in Copenhagen but has registered the business in his native Croatia, in Pula, where his Neuromorphyx Co-Founder and CTO Filip Gembec lives:











 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 24 users

Frangipani

Top 20
Jean-Luc Chatelain, Founder and Managing Partner at Verax Capital Advisors and former MD and Global CTO at Accenture Applied Intelligence, has been a long-time friend of BrainChip’s. Many forum readers will be familiar with his name, thanks to his being quoted on the BRN website 👇🏻 or because they’ve listened to a podcast with him.

Earlier today, he reposted a BRN job ad for better reach and wrote:
“This is a great opportunity for one who believes, as I do, that neuromorphic computing will have significant contribution to the future of #AI at scale”.







 
  • Like
  • Love
  • Fire
Reactions: 27 users
FF

François Piednoel de Normandie 3rd+
Athos Silicon Cofounder, ex-Performance Gurus of Glorious Intel. Ex-Mercedes Benz ADAS hardware Architect. IEEE Member

2w Edited
Follow

One of the patents I am most proud of in autonomous driving is now pending.

To understand it, you first need to understand the statistical view of the world described in the DOGMA paper, authored by my former coworkers: https://lnkd.in/g8tHnar9

Once you grasp this, you understand the fundamentals of how to build a system that provides the autonomous-driving stack with a statistical model of the environment around the car, allowing you to quantify how reliable the statistics are for every single volumetric cell (“cube”) in Δ-space around the vehicle.

This approach is extremely powerful for sensor fusion: combining multiple LiDARs, radars, and cameras while explicitly reasoning about the reliability of each measurement. It can significantly reduce well-known side effects of LiDARs and radars, such as highly reflective surfaces, because those measurements are statistically weak and therefore down-weighted. However, this typically requires a large and expensive LiDAR setup, like the one used by Waymo.

The new pending patent takes a different approach. It uses a set of fixed, single-point LiDARs, roughly $2 devices. When arranged properly, these sensors re-increase the statistical confidence along entire lines of volumetric cells by introducing additional independent measurements. This dramatically reduces the number of expensive LiDAR units needed by replacing them with far more affordable ones, while preserving statistical robustness.

And yes, the patent also references an EQXX-like shape, the current record holder for EV range: https://lnkd.in/gg4QnXJn (I am genuinely in love with this car, and I hope Mercedes-Benz will one day produce it commercially.)

Importantly, this sensor fusion is not intended to directly drive the car. Its purpose is to verify the trajectories proposed by the machine-learning driving stack and to reject any trajectory that does not meet a confidence threshold of 1 chance in 100 million of being wrong (100× ISO 26262).

To perform this work end-to-end, you need very special hardware. A GPU is not suitable here: burning 300 W while carrying severe safety limitations makes it fundamentally incompatible with this level of reliability.

The full patent portfolio will be published in 2026. I hope you’ll enjoy the ride.


34
6 Comments
2 Reposts



Heiner Stockmanns
Semiconductor Executive — retired and having fun

2w

Keep going and I cheer you on. Thoroughly enjoying folks tackling extremely challenging problems while striving for ultimate safety and affordability. Never stop differentiating yourself from the pretenders.


Like

Reply
1 Reaction
Andrew Bullen
Retired - BlackBerry Shareholder QNX Supporter

2w

François Piednoel de Normandie - very cool !!!


Like

Reply
Kieran Ryan
Medical Device Expert. Trainer, Inventor, Market Developer and Clinical Cover!

2w

Can you utilise other hardware yet which utilises less power? Maybe something neuromorphic?


Like

Reply
François Piednoel de Normandie
Athos Silicon Cofounder, ex-Performance Gurus of Glorious Intel. Ex-Mercedes Benz ADAS hardware Architect. IEEE Member

2w

Kieran Ryan Yes, there is a hardware scheduler


Like
Kieran Ryan
Medical Device Expert. Trainer, Inventor, Market Developer and Clinical Cover!

2w

François Piednoel de Normandie so the stack works independent of the processor controlling it. That’s very helpful


Like
François Piednoel de Normandie
Athos Silicon Cofounder, ex-Performance Gurus of Glorious Intel. Ex-Mercedes Benz ADAS hardware Architect. IEEE Member

2w

Kieran Ryan, no, we avoid any non-deterministic solution, as determinism is what is used to certify the safety side of our hardware + software.


Like
1 Reaction
(Reposted by a neuromorphic computing engineer at Mercedes Benz)
 
  • Like
  • Fire
  • Love
Reactions: 9 users

buena suerte :-)

BOB Bank of Brainchip
  • Like
Reactions: 3 users

Bravo

Meow Meow 🐾
Hi Smoothy,

It looks like the comments may not be sequenced correctly in your post.

From what I can see (screenshot below), when Kieran Ryan asks: "Can you utilise other hardware yet which utilises less power? Maybe something neuromorphic?"

Piednoel responds "No, we avoid any non-deterministic solution, as it is what is used to certify the safety side of our hardware + software. "

Out of curiosity, I asked ChatGPT what it made of that exchange. In summary, it suggested:
  • Neuromorphic architectures are generally considered non-deterministic in behaviour (at least from a safety certification perspective).
  • At this point in time, neuromorphic architectures don't meet automotive safety standards (ISO 26262, ASIL), which prioritise deterministic, repeatable execution paths.
  • If the system is part of a safety-certified validation layer, engineers would typically avoid architectures that are difficult to formally verify.
  • His mention of a “hardware scheduler” suggests a deterministic co-processor or ASIC approach rather than a neuromorphic one.
ChatGPT reckons it doesn’t necessarily rule neuromorphic out for other parts of a stack in future (5-7 years), but for the safety-certified path he’s describing, it sounds like Piednoel is deliberately avoiding anything that could be perceived as non-deterministic.

I think ChatGPT might be being a bit conservative with the 5-7 year estimate for automotive adoption, since Mercedes have commented previously, unless I'm mistaken, that they're looking at a 2030 time-frame.

Happy to hear alternative interpretations, but that’s how it reads to me.




 
  • Like
  • Love
  • Fire
Reactions: 8 users
Hi Bravo... it's all way over my pay grade, cut and paste only.
However, I noticed the part below was the interesting one for me.

Like
1 Reaction
(Reposted by a neuromorphic computing engineer at Mercedes Benz)
 
  • Like
  • Fire
Reactions: 5 users

7für7

Top 20
And the silence continues… and the shorters are happy and thankful for the spike on Friday!

Blockbuster style

Standing Ovation Clapping GIF by The Academy Awards
 
  • Like
Reactions: 1 users

Gazzafish

Regular
This sure sounds like Akida. Any links to us???



Type 1 Compute | Brain-inspired processors for edge AI





Extract: “



The Event-Driven Advantage


Traditional cameras capture full frames 30 times per second—wasting power processing unchanged pixels. Event-driven sensors fire only when pixels detect change, enabling:

SPACE SURVEILLANCE: Track satellites moving 17,000 mph against bright Earth or dark space simultaneously—impossible for frame cameras that saturate or lose contrast.

This same principle scales to:

  • Autonomous vehicles navigating at 45+ mph
  • Industrial systems detecting vibration frequencies
  • Drones adapting to their environment in real-time
  • Robotics requiring instant obstacle detection
By processing only what changes, we achieve 10,000 fps equivalent performance at <2W.”
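The frame-vs-event comparison in that extract can be shown with a toy sketch (my own illustration, not Type 1 Compute's code; the resolution, threshold and function name are arbitrary): a frame camera touches every pixel every frame, while an event sensor emits work only where intensity actually changed.

```python
# Toy illustration of event-driven sensing: compare two frames and emit
# (x, y, polarity) events only for pixels whose intensity changed.
def events_between(prev, curr, threshold=10):
    events = []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            d = c - p
            if abs(d) > threshold:
                events.append((x, y, 1 if d > 0 else -1))
    return events

W, H = 640, 480
prev = [[0] * W for _ in range(H)]
curr = [row[:] for row in prev]
curr[100][200] = 255  # one pixel changes between frames

events = events_between(prev, curr)
print(len(events))  # 1 event to process, instead of 640*480 pixels of work
```

A frame pipeline would reprocess all 307,200 pixels here; the event view reduces the scene to a single change, which is where the power and latency claims in the extract come from.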
 
  • Like
  • Love
Reactions: 11 users

Guzzi62

Regular
And the silence continues… and the shorters are happy and thankful for the spike on Friday!

Blockbuster style

Standing Ovation Clapping GIF by The Academy Awards
Yes, hopefully we will get news this month. Hold on, didn't we say that for the last year or so? We must be getting close, for xxxxx sake!

The whole world is in a bear market due to the Middle East crisis, and the UAE stock exchange closed for 2 days.

Oil is a good place to be invested in right now.
 
  • Like
  • Fire
Reactions: 5 users

Guzzi62

Regular
This sure sounds like Akida. Any links to us???



Type 1 Compute | Brain-inspired processors for edge AI





[…]
I can't find anything about Akida, sadly.

In your link there are some white papers, which you can see below.



A LinkedIn post from Type1 Compute is below; no reactions from BRN staff.

 
  • Like
Reactions: 4 users

IloveLamp

Top 20
  • Haha
Reactions: 3 users

manny100

Top 20
It's great validation to see IBM, via Kevin Johnson, produce 'eye watering' results with AKIDA1000.
It's like watching reality TV: we get blow-by-blow accounts of the action.
We get marketing, validation and a big-business thumbs up, and IBM gets a chance to be first mover in the industry and widen its Symphony moat.
So we have an IBM/Kevin Johnson/AKIDA1000 love story, and it's all about ROI.
What next?
I am tipping that pretty soon BrainChip will make it a foursome via a partnership with IBM.
The partnership will likely come in the lead-up to the AGM.
A pre-AGM partnership with IBM may not get the BOD a standing ovation, but it should see a change of mood.
With the results Kevin is talking about, an IBM partnership should bring home the bacon, plus stir up a lot of other interest in the finance/insurance industry, where they are all looking for an edge.
Imagine what Kevin can do with AKIDA1500 and the 2500.
 
  • Like
  • Love
  • Fire
Reactions: 15 users