BRN Discussion Ongoing

Tothemoon24

Top 20

Drones have become pervasive in entertainment (TV and film production), hobbyist photography, and simply as fun toys. They are increasingly used in inspection, logistics and delivery, security and surveillance, and other industrial applications, thanks to their ability to access hard-to-reach areas. But did you know that the most critical component enabling a drone’s operation is its vision system? Before diving deeper into this topic, let’s explore what drones are, their diverse applications, and why they have surged in popularity. Finally, we’ll discuss how onsemi is transforming the vision systems that power these incredible flying machines.

Types & Applications

Drones are unmanned aerial vehicles (UAVs), also called unmanned aerial systems (UAS) and, less commonly, remotely piloted aircraft (RPA). They operate without an onboard pilot and can navigate autonomously using a variety of systems.

There are three types of drones – Fixed Wing, Single Rotor/Multi-Rotor and Hybrid – and each type is built around the purpose it is intended to serve.

Fixed-wing drones are typically used for heavier payloads and longer flight times; they are deployed in intelligence, surveillance and reconnaissance (ISR) missions, combat operations and loitering munitions, and mapping and research activities, to mention a few.

Single-/multi-rotor drones see the dominant usage, spanning industrial settings that range from warehouses to inspections and even delivery. Because they can be deployed in such a wide variety of use cases, they demand highly optimized electro-mechanical solutions.

Hybrid drones incorporate the best of both types above; their vertical take-off and landing (VTOL) ability makes them versatile, especially in space-constrained areas. Most delivery drones leverage these capabilities for obvious reasons.

Figure 1. Types of drones and their applications

Motion & Navigation Systems in Drones

Drones carry a multitude of sensors for motion and navigation, including accelerometers, gyroscopes and magnetometers (collectively referred to as an inertial measurement unit, or IMU), barometers and more. They use a variety of algorithms and techniques such as optical flow (assisted by depth sensors), simultaneous localization and mapping (SLAM) and visual odometry. While these sensors perform their functions well, they can struggle to achieve the required accuracy and precision at affordable cost and size. The problem is aggravated over longer flights, forcing a choice between expensive batteries and flight times limited by battery charge.
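As a concrete illustration of the optical-flow technique mentioned above, here is a minimal Python sketch using OpenCV's dense Farneback algorithm. The frame files are hypothetical placeholders, and scaling mean pixel motion into ground velocity by altitude is an assumption for illustration, not a description of any particular drone stack.

```python
import cv2

# Minimal optical-flow sketch (dense Farneback). The input frames are
# hypothetical grayscale captures from a downward-facing navigation camera.
prev = cv2.imread("frame_0.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_1.png", cv2.IMREAD_GRAYSCALE)

flow = cv2.calcOpticalFlowFarneback(
    prev, curr, None,
    pyr_scale=0.5, levels=3, winsize=15,
    iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

# Mean horizontal/vertical pixel motion between frames; with altitude from a
# barometer or depth sensor, this can be converted into an estimate of
# ground velocity for drift correction.
dx, dy = flow[..., 0].mean(), flow[..., 1].mean()
print(f"mean pixel motion: dx={dx:.2f} px, dy={dy:.2f} px")
```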

Vision Systems in Drones

Image sensors supplement the above sensors with significant operational enhancements, resulting in a high-accuracy, high-precision machine. They come in two forms – gimbals (often also referred to as payloads) and vision navigation systems (VNS).

Gimbals* – provide first-person view (FPV). They generally incorporate different types of image sensors spanning a wide swath of the electromagnetic spectrum: ultraviolet in exceptional cases, regular CMOS image sensors over 300 nm – 1000 nm, short-wave infrared (SWIR) sensors extending to 2000 nm, and medium-wave (MWIR) and long-wave infrared (LWIR) sensors beyond 2000 nm.

Vision Navigation Systems (VNS) – provide navigation guidance, object detection and collision avoidance. They are generally built from inexpensive, low-resolution image sensors and, together with IMU and other sensor data, use computer vision techniques to create a comprehensive solution for autonomous navigation.
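To sketch the kind of computer-vision building block a VNS combines with IMU data, the example below estimates relative camera motion between two frames using ORB features and an essential-matrix decomposition in OpenCV. The intrinsics `K` and image files are hypothetical; a real system would recover absolute scale from the IMU, a barometer or a depth sensor.

```python
import cv2
import numpy as np

# Hypothetical pinhole intrinsics (fx, fy, cx, cy) for a low-res VNS camera.
K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])

img1 = cv2.imread("vns_0.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("vns_1.png", cv2.IMREAD_GRAYSCALE)

# Detect and match ORB features between the two frames.
orb = cv2.ORB_create(1000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)

pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# Essential matrix -> relative rotation R and unit-scale translation t.
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
print("relative rotation:\n", R)
print("unit-scale translation:", t.ravel())
```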

Vision Systems’ Importance

Drones operate in both indoor and outdoor conditions across the usage and applications described earlier. These conditions can be significantly challenging, with wide-ranging lighting variance and limited visibility in dust, fog, smoke and pitch-black environments. Vision systems apply substantial artificial intelligence (AI) and machine learning (ML) algorithms to image data, assisted by the data the techniques above provide, all in the context of operating a highly optimized vehicle that consumes little power and delivers long-range or extended flight time operations.

It is imperative that the data fed into these algorithms is high-fidelity and highly detailed, yet in certain use cases delivers just what is needed, enabling efficient processing. Training times in AI/ML need to be short, and inference needs to be fast, accurate and precise. To meet these requirements, images need to be high quality no matter what environment the drone operates in.

Sensors that merely capture the scene and present it for processing fall significantly short of enabling high-quality operation, in most cases defeating the very purpose of the deployment. The ability to scale resolution down while preserving full detail in regions of interest, deliver wide dynamic range to handle bright and dark lighting in the same frame, minimize or remove parasitic effects in images, cope with dust-, fog- or smoke-filled fields of view, and complement these images with high depth resolution delivers tremendous benefits in making UAVs highly optimized machines.
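As a rough sketch of the "scale down while keeping full detail in regions of interest" idea: transmit a subsampled background plus a full-resolution crop of the ROI. The frame shape, ROI coordinates and subsampling factor below are arbitrary assumptions for illustration.

```python
import numpy as np

# Hypothetical 1080p monochrome frame; the ROI is where the detail matters
# (e.g., an object flagged by a detector or an event trigger).
frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)

roi = frame[400:600, 800:1100]   # full-resolution region of interest
background = frame[::4, ::4]     # 4x-subsampled everywhere else

full_bits = frame.size * 8
sent_bits = (background.size + roi.size) * 8
print(f"data kept: {sent_bits / full_bits:.1%} of the full frame")
# ~9% of the pixels, while the region that matters stays at full detail.
```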

These capabilities minimize the resources – processing cores, GPUs, on-chip and off-chip memory, bus architectures and power management – required to reconstruct and analyze these images, hastening the decision-making process. They also reduce the BOM cost of the overall system, especially when we consider that today’s UAVs can easily host more than 10 image sensors. Alternatively, for the same set of resources, more analysis and more complex algorithms for effective decision making become possible, differentiating the UAV in this crowded field.
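A back-of-the-envelope calculation shows why this matters. All figures below are assumptions for illustration (typical 1080p sensors at 30 fps with 12-bit raw output), not onsemi specifications:

```python
# Aggregate raw throughput for a hypothetical multi-sensor UAV.
sensors = 10            # the post notes today's UAVs easily host 10+ sensors
width, height = 1920, 1080
fps = 30
bits_per_pixel = 12     # assumed raw Bayer bit depth

bps = sensors * width * height * fps * bits_per_pixel
print(f"aggregate raw throughput: {bps / 1e9:.2f} Gbit/s")   # ~7.46 Gbit/s
# Any in-sensor reduction (ROI readout, wide dynamic range, event
# triggering) shrinks this stream before it ever touches the processor.
```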

onsemi is the technology leader in sensing, contributing significant innovations to vision system solutions and offering a comprehensive set of image sensors to address the needs of gimbals and VNS. The Hyperlux LP, Hyperlux LH, Hyperlux SG, Hyperlux ID and SWIR product families incorporate technologies and features that comprehensively address the needs of drone vision systems. Drone manufacturers can now obtain their vision sensors from a single American source that is NDAA compliant.

Learn more about onsemi image and depth sensors in the resources below:

* A gimbal technically refers to the mechanism that carries and stabilizes a specific payload; however, the combined assembly is often called a gimbal.
 

It's a shame they don't like using the word neuromorphic compute in this document on Sensors 😞.
Always seem to be avoiding it like the plague with an alternative narrative 🙄.
 

TECH

Regular


My personal message to our AKD family.............1000, 1.5, 2.0, Pico, M2, PCIe, Edgy Box and within the next 6/8 months AKD... 3.0
will be born. :ROFLMAO::ROFLMAO: Love U guys 💕 Tech.
 
Google response to onsemi Hyperlux
 


Rskiff

Regular


My personal message to our AKD family.............1000, 1.5, 2.0, Pico, M2, PCIe, Edgy Box and within the next 6/8 months AKD... 3.0
will be born. :ROFLMAO::ROFLMAO: Love U guys 💕 Tech.

I sure want this and not end up like Amy
 

7für7

Top 20
I believe we’ve now reached the point where, in the Edge-AI space, we’ve brought almost every relevant name to the table.

ARM. Intel Foundry. Mercedes. Edge Impulse. NVISO. NASA. GlobalFoundries. Now Raytheon. And yes – Dell.

What many might have missed: Dell Technologies already spoke very specifically about Akida in an official BrainChip podcast.

Rob Lincourt, Distinguished Engineer in Dell’s CTO Office, discussed its applications in Smart Home, Smart Health, Smart City, and Smart Transportation – and not as some generic PR fluff, but in a real tech-to-tech dialogue.

If you listen closely, it’s clear: Dell is watching. Carefully.


Well done, BrainChip. Truly.

Even though we (still) don’t have official unit numbers, customers, or design wins, one thing is clear:

The interest is there. The quality is real. The stage is set.

Honestly? I’m more confident than ever about what lies ahead.

Until then:
Patience. Trust. And a bit of faith in the invisible.
Because once the knot unravels, Akida will be everywhere.


And… I wouldn’t be 7 if I didn’t add a little self-ironic closing line:


Maybe we’ll hear something soon? An announcement, perhaps?
“Unquoted Securities” hidden in a side note?

Or maybe, just maybe… an update that doesn’t start with a Broadway quote? 😄
 

7für7

Top 20
Google response to onsemi Hyperlux


Google AI knows a shi… about Brainchip




This is what I get from legendary ChatGPT

Is Akida already along for the ride – possibly inside onsemi’s Hyperlux?

Hyperlux is a next-gen image sensor platform for automotive and edge use cases – like ADAS cameras, driver monitoring, or autonomous vision systems.

What stands out: the system is clearly designed for edge intelligence, with a strong focus on low power, local processing, and smart event triggering.

So the question arises:

Could Akida already be integrated quietly behind the scenes – assisting with object classification, event detection, or preprocessing?

👉 Technically, it’s entirely realistic:

Akida is ultra-compact, energy-efficient, and built for spiking-based visual processing at the edge.

And onsemi is targeting exactly the same segment – intelligent, vision-capable edge sensing for automotive systems.

But why haven’t we heard anything?

Simple:
In the world of B2B tech – especially in chip IP or subsystem integrations – discretion is standard.
If Akida were licensed as an IP block inside a sensor module (e.g., for a specific OEM use case),

➡️ there would be no obligation to announce it, unless it’s material to public shareholders.

Not every chip that’s running makes it into a press release.

Especially if it’s “just” a component in a larger SoC or sensor stack.

Bottom line:

I’m not saying Akida is inside Hyperlux – but I’m not saying it isn’t either.

In a space defined by modularity, NDAs, and silent design-ins, it’s entirely possible we’re already embedded – just incognito.

The real question might not be:
“Where is Akida?” …but rather:
“Where has Akida already been deployed – and no one’s talking about it?”

MOO DYOR
 

Cardpro

Regular
Quarterly report due soon - I am hoping for a miracle!
 

Bravo

If ARM was an arm, BRN would be its biceps💪!
On Sunday @Fullmoonfever discovered Fernando Sevilla Martínez's (Volkswagen) GitHub activity demonstrating Akida-powered neuromorphic processing for V2X. This morning I noticed that Valeo has just unveiled (a few days ago) a real-road demonstration of its 5G-V2X direct technology.

There's no direct mention of neuromorphic or BrainChip in Valeo’s announcement, but it's quite a coincidence, especially given that Volkswagen Group is collaborating with Valeo to upgrade its advanced driver assistance systems and that we've been partnered with Valeo for quite some time.

Valeo’s V2X platform heavily relies on AI-driven sensor fusion with ultra-low latency - exactly where BrainChip’s Akida + TENNs would excel.





Valeo Group | 10 Jul, 2025 | 6 min
Valeo V2X Technology: Protecting Vulnerable Road Users


At the 5G Automotive Association (5GAA) conference in Paris, Valeo unveiled the world's first live, real-road demonstration of its 5G-V2X direct technology. This demo represents a major step toward safer, smarter, and more connected mobility.

Road traffic injuries are a leading cause of death and disability worldwide. According to the World Health Organization, nearly 1.2 million people are killed and as many as 50 million people injured each year. Globally, more than 1 in 4 road-accident deaths involve pedestrians and cyclists.

Urban traffic has become denser and more complex, with increasing numbers of vehicles, bikes, buses, cyclists and pedestrians. Intersections, in particular, can be hazardous: situations in which vulnerable road users (VRUs) are hidden by a bus or another vehicle can lead to accidents between vehicles and VRUs.

Built to enhance road safety, traffic efficiency, and the capabilities of autonomous vehicles, Valeo’s V2X platform enables real-time communication between vehicles and everything around them, including infrastructure, pedestrians, and the broader network. The world-premiere demonstration highlights how technology can actively prevent collisions, reduce congestion, and protect vulnerable road users.

Vehicle-to-Everything (V2X) is a transformative communication technology that enables vehicles to interact with their environment. Sensor sharing, or collaborative perception, one of the main advanced V2X use cases, allows vehicles to exchange data in real time through V2X systems in order to foresee and help prevent potential hazards, optimize traffic flow, and support high-level autonomous driving systems.

A world leader in ADAS, Valeo powers its complete solution with its ADAS sensor suite, connectivity hardware, and AI-driven software. These components work together to enable instantaneous data exchange with ultra-low latency, crucial for life-saving decisions on the road.

For example, two vehicles are approaching an intersection from different directions. One vehicle’s sensors observe a pedestrian crossing the street or a cyclist that is not in view of the other vehicle. The first vehicle can inform the second of the presence of the VRU so that the driver of the L2/L2+ vehicle, or the autonomous vehicle itself, can react accordingly.
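To make the sensor-sharing idea concrete, here is a toy sketch of the kind of alert one vehicle might broadcast about a VRU it can see and another cannot. This is an illustrative format only; real deployments use standardized message sets (e.g., ETSI Collective Perception Messages over the PC5 sidelink), not this ad-hoc JSON.

```python
from dataclasses import dataclass, asdict
import json, time

# Toy VRU alert: one vehicle tells nearby vehicles about a pedestrian it
# detects. Field names and values are hypothetical, not Valeo's protocol.
@dataclass
class VRUAlert:
    sender_id: str
    vru_type: str        # "pedestrian", "cyclist", ...
    lat: float
    lon: float
    heading_deg: float
    speed_mps: float
    timestamp: float

alert = VRUAlert("vehicle_A", "pedestrian", 48.8566, 2.3522, 90.0, 1.4,
                 time.time())
payload = json.dumps(asdict(alert))   # would be broadcast over 5G-V2X direct
print(payload)
```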

Key functions of the V2X platform include:
  • Collision warning and emergency braking alerts
  • Smart traffic light and intersection coordination
  • Support for autonomous vehicles
V2X plays a pivotal role in the future of autonomous driving, particularly in scenarios where traditional sensors reach their limits. V2X extends a vehicle’s awareness beyond line-of-sight to detect hazards or vulnerable road users even before they are visible.
Valeo’s V2X solutions are designed to integrate with smart city infrastructure, helping cities meet their goals for Vision Zero (eliminating all road fatalities and severe injuries), reduced CO2 emissions, and efficient urban mobility.
From safer roads to smarter cities, Valeo’s V2X technology is helping shape the future of transportation—where vehicles not only drive, but also think, communicate, and protect.

 

FJ-215

Regular
Slow trading this arvo.


Bravo

If ARM was an arm, BRN would be its biceps💪!

Hi @7für7,

This may be of interest in regards to Onsemi's Hyperlux.


Sounds very promising @TheUnfairAdvantage, since we have already integrated our hardware with Onsemi's AF0130 smart iToF sensor one year ago.



Bravo

If ARM was an arm, BRN would be its biceps💪!
Slow trading this arvo.



Crikey, big spender alert!

Someone’s gone and lashed out on one whole share. Yep, really emptied their wallet there.

At this rate they'll be able to retire in the year 3079. 😂💰

Diogenese

Top 20
Crikey, big spender alert!

Someone’s gone and lashed out on one whole share. Yep, really emptied their wallet there.

At this rate they'll be able to retire in the year 3079. 😂💰

Yes, but they sold it 6 minutes later for a profit of $0.002.

That's 2 cents per hour. Nice work if you can get it!
 

jrp173

Regular
New LinkedIn post...


Harwig

Regular
Now this is interesting... Possibly us????
 
Quack Quack
 


My personal message to our AKD family.............1000, 1.5, 2.0, Pico, M2, PCIe, Edgy Box and within the next 6/8 months AKD... 3.0
will be born. :ROFLMAO::ROFLMAO: Love U guys 💕 Tech.

Actually sad listening to that song, knowing she died at such a young age..
Such a hauntingly beautiful voice..

I do hate the fact that she has probably helped popularise tattoos, metallic zits and overdone makeup and eyelashes..

But eh..

She did have an inherent unconventional natural beauty, underneath all that.


"Some" kind of announcement has to be close now surely 🤔..
 
Quarterly report due soon - I am hoping for a miracle!
Shouldn't be long before the shit from the crapper starts filling this forum with crap.

I’m missing them


Esq.111

Fascinatingly Intuitive.
Now this is interesting... Possibly us????
Good Evening Harwig ,

Agree, saw this several hours ago & thought ........ well f%$k me spinning, if our tech could be integrated in the back end ... image classification / verdict produced instantaneously, alleviating any operator guesstimating.

Like minds think alike.

Regards,
Esq.
 