He too will find the right place to sniff around.
In yoga, we call that the inverse downward dog. We usually use a lamp post for support.
Thank you for your prompt reply. I will look for it and buy two. I will continue to follow BRN indefinitely. Only when the SP touches the upper chart line at 7 will I give my wife a shirt.
As suspected, the recent article that claimed BrainChip “had secured an order from Nex Novus for medical sensing applications” was wrong:
There will be a live demo of the made-in-Croatia Neuromorphyx Vision NeuroNode at the BrainChip booth at Embedded World 2026 in Nürnberg (10—12 March).
“Experience the Vision NeuroNode™'s tiny, compact and rugged form in person! Featuring the AKD1500 neuromorphic co-processor from BrainChip Inc., pushing the boundaries of high-performance, low-latency Spiking Neural Networks (SNNs) for energy-efficient edge AI.
POWER & COMPUTE EFFICIENCY: <300 mW while delivering 800 GOPS
ON-CHIP LOCAL MEMORY: 1 MB
CLOCK FREQUENCY RANGE: 5–400 MHz
SILICON PROCESS: 22 nm FD-SOI CMOS digital logic
PACKAGE: 7×7 mm MFCTFBGA169, 0.5 mm pitch
Find us at:
Hall 5 / Booth 5-213 (10–12 March 2026, Nuremberg, Germany)
NeuroNode™
Deployment time - seconds.
Operational lifetime - years.
Scalability factor - unparalleled.
Rugged build.
Always-on sensing.
Energy-efficient processing.
Battery-powered operation.
Low-latency reactions.
The Vision NeuroNode™ is armed with an FPGA, a neuromorphic AI accelerator and an event-based neuromorphic vision sensor.
Our IP and architecture allow up to multiple years* of field operation on a single battery pack integrated into our compact 5×5×5 cm IP67 case.
*Depending on scene activity and inference.
We offer solar panels and add-on battery packs as attachments that further prolong the operational lifetime and extend capabilities.
Technology developed by Prophesee and Sony.
- Fast tracking, equivalent to >10,000 FPS
- Extreme lighting conditions: >120 dB dynamic range
- Native power efficiency: <10 mW
Skillfully engineered into compact modules by CenturyArks.
We offer a wide range of sensors and lenses to meet our client requirements depending on deployment location, terrain and field conditions.
HD 1280 × 720 Sony IMX636 / IMX646 (via CenturyArks)
VGA 640 × 512 Sony IMX637 / IMX647 (via CenturyArks)
320 x 320 ultra low-power GENX320 (via Prophesee)
Explore sensor features in the Sensor options section →”
[…]
NeuroHive™
Vision, Audio, Radar, IMU are different NeuroNodes™ forming the hive mind network. Distributed intelligence, at the edge.
NeuroHive™ provides your team with the ability to gather mission critical intel from the field, live. Scalable and reliable.
Over-the-air SNN model switching.
Tracking target velocity and direction.
Fleet management.
Orchestrated systems.
External API triggers.
Integration to existing systems and comms.”
Neuromorphyx™ - Edge AI Networks
Networks of neuromorphic edge AI nodes. Distributed by hardware. Unified by software. Sense. Think. Decide. On the edge. NeuroNodes™ managed via NeuroHive™.
neuromorphyx.com
Same same!!!! That it will, I haven't lost any faith; the better half is another story lol
FF
François Piednoel de Normandie
Athos Silicon Cofounder, ex-Performance Gurus of Glorious Intel. Ex-Mercedes Benz ADAS hardware Architect. IEEE Member
2w Edited
One of the patents I am most proud of in autonomous driving is now pending.

To understand it, you first need to understand the statistical view of the world described in the DOGMA paper, authored by my former coworkers: https://lnkd.in/g8tHnar9

Once you grasp this, you understand the fundamentals of how to build a system that provides the autonomous-driving stack with a statistical model of the environment around the car, allowing you to quantify how reliable the statistics are for every single volumetric cell (“cube”) in Δ-space around the vehicle.

This approach is extremely powerful for sensor fusion: combining multiple LiDARs, radars, and cameras while explicitly reasoning about the reliability of each measurement. It can significantly reduce well-known side effects of LiDARs and radars, such as highly reflective surfaces, because those measurements are statistically weak and therefore down-weighted. However, this typically requires a large and expensive LiDAR setup, like the one used by Waymo.

The new pending patent takes a different approach. It uses a set of fixed, single-point LiDARs, roughly $2 devices. When arranged properly, these sensors re-increase the statistical confidence along entire lines of volumetric cells by introducing additional independent measurements. This dramatically reduces the number of expensive LiDAR units needed by replacing them with far more affordable ones, while preserving statistical robustness.

And yes, the patent also references an EQXX-like shape, the current record holder for EV range: https://lnkd.in/gg4QnXJn (I am genuinely in love with this car, and I hope Mercedes-Benz will one day produce it commercially.)

Importantly, this sensor fusion is not intended to directly drive the car. Its purpose is to verify the trajectories proposed by the machine-learning driving stack and to reject any trajectory that does not meet a confidence threshold of 1 chance in 100 million of being wrong (100× ISO 26262).

To perform this work end-to-end, you need very special hardware. A GPU is not suitable here: burning 300 W while carrying severe safety limitations makes it fundamentally incompatible with this level of reliability.

The full patent portfolio will be published in 2026. I hope you’ll enjoy the ride.
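The fusion idea in the post can be illustrated with a minimal sketch. This is my own simplification, not the patent's method: the actual DOGMa formulation uses Dempster-Shafer masses and a particle filter, and the patent's sensor geometry is not public. Assuming independent sensors and a plain Bayesian log-odds update per volumetric cell, it shows why extra cheap single-point measurements raise per-cell confidence:

```python
import math

def logit(p: float) -> float:
    """Log-odds of a probability."""
    return math.log(p / (1.0 - p))

def prob(l: float) -> float:
    """Inverse of logit: log-odds back to probability."""
    return 1.0 / (1.0 + math.exp(-l))

def fuse(prior: float, measurements: list[float]) -> float:
    """Fuse independent per-sensor occupancy probabilities for one cell."""
    l = logit(prior)
    for p in measurements:
        l += logit(p)  # each independent measurement adds log-odds evidence
    return prob(l)

# One noisy return from a single sensor: modest confidence the cell is occupied.
single = fuse(0.5, [0.7])
# The same cell also crossed by three cheap single-point LiDAR beams:
triple = fuse(0.5, [0.7, 0.7, 0.7])
assert triple > single  # independent measurements raise cell confidence
```

On a highly reflective surface, the per-sensor probabilities would sit near 0.5, so in this scheme the cell stays low-confidence and any proposed trajectory through it can be rejected rather than trusted.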
34
6 Comments
2 Reposts
Heiner Stockmanns
Semiconductor Executive — retired and having fun
2w
Keep going and I cheer you on. Thoroughly enjoying folks tackling extremely challenging problems while striving for ultimate safety and affordability. Never stop differentiating yourself from the pretenders.
Andrew Bullen
Retired - BlackBerry Shareholder QNX Supporter
2w
François Piednoel de Normandie - very cool !!!
Kieran Ryan
Medical Device Expert. Trainer, Inventor, Market Developer and Clinical Cover!
2w
Can you utilise other hardware yet which utilises less power? Maybe something neuromorphic?
François Piednoel de Normandie
Athos Silicon Cofounder, ex-Performance Gurus of Glorious Intel. Ex-Mercedes Benz ADAS hardware Architect. IEEE Member
2w
Kieran Ryan Yes, there is a hardware scheduler
Kieran Ryan
Medical Device Expert. Trainer, Inventor, Market Developer and Clinical Cover!
2w
François Piednoel de Normandie so the stack works independently of the processor controlling it. That’s very helpful
François Piednoel de Normandie
Athos Silicon Cofounder, ex-Performance Gurus of Glorious Intel. Ex-Mercedes Benz ADAS hardware Architect. IEEE Member
2w
Kieran Ryan, no, we avoid any non-deterministic solution, as determinism is what is used to certify the safety side of our hardware + software.
(Reposted by a neuromorphic computing engineer at Mercedes Benz)
Yes, hopefully we will get news this month. Hold on: didn't we say that a year or so ago? We must be getting close, for xxxxx sake! And the silence continues… and the shorters are happy and thankful for the spike on Friday!
Blockbuster style
I can't find anything Akida, sadly. This sure sounds like Akida. Any links to us???
Type 1 Compute | Brain-inspired processors for edge AI
Extract: “
The Event-Driven Advantage
Traditional cameras capture full frames 30 times per second—wasting power processing unchanged pixels. Event-driven sensors fire only when pixels detect change, enabling:
SPACE SURVEILLANCE: Track satellites moving 17,000 mph against bright Earth or dark space simultaneously—impossible for frame cameras that saturate or lose contrast.
This same principle scales to:
- Autonomous vehicles navigating at 45+ mph
- Industrial systems detecting vibration frequencies
- Drones adapting to their environment in real time
- Robotics requiring instant obstacle detection

By processing only what changes, we achieve 10,000 fps equivalent performance at <2 W.”
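The event-driven principle in the extract can be sketched in a few lines. This is a toy model, not any vendor's API: the function name, image shape and contrast threshold are all illustrative. An event fires only where a pixel's log-brightness change crosses the threshold, so a static scene produces no downstream work at all:

```python
import numpy as np

def events(prev: np.ndarray, curr: np.ndarray, thresh: float = 0.2):
    """Return (y, x, polarity) events where log-brightness changed enough."""
    delta = np.log1p(curr) - np.log1p(prev)       # per-pixel log-intensity change
    ys, xs = np.nonzero(np.abs(delta) >= thresh)  # only the pixels that changed
    return [(int(y), int(x), 1 if delta[y, x] > 0 else -1)
            for y, x in zip(ys, xs)]

prev = np.zeros((4, 4))
curr = prev.copy()
curr[1, 2] = 5.0                            # a single pixel brightens
assert events(prev, curr) == [(1, 2, 1)]    # only that pixel fires an event
assert events(prev, prev) == []             # static scene: no events, no work
```

A frame camera would hand all 16 pixels to the processor 30 times a second regardless; here the cost scales with scene activity, which is the source of the power and latency claims quoted above.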