BRN Discussion Ongoing

Rach2512

Regular
Thoughts?

 

Fiendish

Regular
Thoughts?


Because the market has heard it all before, and no longer believes the bullshit from BRN.

The market wants hard evidence, not just "watch us" bullshit...
Oh wow, you're upset.
 

TECH

Regular
What I don't get, Tech, is that all this information collected by all these amazing people here and on the other place is truly incredible, yet we have so little to show for it.
Very few company releases, very poor information from BrainChip, and a very low share price.

One has to be slightly crazy to stay invested like I am, up to my armpits in this company, thinking that one day it might actually be a good investment.

It gets incredibly frustrating.
Just saying

I hear you Food, yes, it's frustrating, and any holders of 5/10 years or beyond would be lying if they didn't admit to
some level of frustration; some may even use different words or expressions. But keep focusing on the end game:
we have progressed a tremendous amount over the last 3.5 years.

No one can deny that we are structured better, or that we are engaged with some super big players, who
are the real gatekeepers of how their business models will unfold, their timelines, answering to their clients and
stakeholders. To me it's like a big pack of dominoes (not the pizza variety) :ROFLMAO:, all interconnected at
different points in time; it takes many tech companies coming together to deliver the ultimate product, well,
in most cases, from my understanding.

Can you imagine the level of excitement that will explode from within the walls of our US headquarters
when we really hit solid paydirt? All our scientists, PhDs, top-class engineers and office admin staff
will feel the overwhelming joy of real success, and I think they will have quietly earned it. Try for a
minute to appreciate what they have achieved to date; it's a lot more than many give our staff credit for.

Yes, we want real revenue streams, and we want them yesterday. Don't you think our staff want the same thing?
Of course they do.

Hang tight is my suggestion, our tech is brilliant.

💘 AKD
 

manny100

Top 20
Brainchip's future looks 'baked in'.
Exploring the Future of Neuromorphic Computing at the Edge
" According to Gartner, traditional computing technologies will hit a digital wall in 2025 and force a shift to new strategies, including those involving neuromorphic computing."
I do not think that the above would surprise anyone.
From the Gartner link above.

"Traditional technology tanks​

By 2025, traditional computing technologies will hit a digital wall, forcing the shift to new computing paradigms such as neuromorphic computing."
 

manny100

Top 20
TECH said: "I hear you Food, yes, it's frustrating … Hang tight is my suggestion, our tech is brilliant." (quoted in full above)
Agree, the DOD is looking good.
"Neuromorphic processing power will unlock new capabilities in pattern recognition and autonomous tasking."
Apart from what Bascom, RTX/USAFRL and Parsons are doing, the DOD will also be huge in the space arena, an area in which we are already proven.
"The center is tasked with improving the Navy’s autonomous systems in space and is investigating applications in this technology — what’s called neuromorphic processors. The technology would offer major opportunities for AI and allow the Defense Department to advance pattern recognition, autonomous tasking, and reduction in energy and latency, all of which are critical for robotic applications where there is a need for a robot to immediately respond to constant, real-world change."
"“We [the satellite community] are interested in using AI to control robots and other autonomous systems,” Steven Meier, the center’s director, told GovCIO Media and Research on GovCast. “This is a challenge, though, because AI and [machine learning] types of systems tend to be very brittle. They break, often catastrophically, when encountering situations it wasn’t designed for.”"
 
Has anyone heard of AIM Future?



"ACL Digital, an ALTEN group company and a global leader in design-led digital transformation, enterprise modernization, and product engineering services, today announced a strategic partnership with AIM Future, a leader in neuromorphic AI processors. The collaboration aims to develop and deliver cutting-edge AI solutions across diverse industries."

"ACL Digital and AIM Future will co-develop advanced use cases, including:

Intelligent surveillance and security systems
Real-time industrial automation and predictive maintenance

Smart healthcare devices with on-device diagnostics

Autonomous mobility solutions for transportation and logistics

Energy-efficient consumer and industrial IoT applications"
 

Frangipani

Top 20
A new strategic partner: Montreal-based HaiLa Technologies (https://www.haila.io/), an innovator in ultra-low power wireless connectivity.



That strategic collaboration was likely supposed to be revealed later today, on the first day of Sensors Converge, which will be taking place at the Santa Clara Convention Center (24-26 June) - audioexpress.com appears to have been a tad too early in publishing the news… 😊


BrainChip and HaiLa Demonstrate Ultra-Low Power Edge AI for Connected Sensor Applications

June 24 2025, 05:10

BrainChip Holdings, a pioneer in ultra-low power neuromorphic processors, announced a strategic collaboration with HaiLa Technologies, an innovator in ultra-low power wireless connectivity. The partnership aims to deliver power-optimized, smart edge AI devices with integrated sensor intelligence, connectivity, and computation. BrainChip and HaiLa are working with leading OEMs and ecosystem partners to bring their combined capabilities to multiple markets.

Together, the companies will demonstrate how BrainChip’s Akida neuromorphic technology pairs seamlessly with HaiLa’s BSC2000 radio frequency integrated circuit (RFIC) to enable breakthrough power efficiency for connected sensor applications in different markets.

The combined technologies produce an ultra-efficient architecture that paves the way for continuously connected battery-operated devices that can last the entire life of the product on a single coin cell battery. This joint demonstration leverages HaiLa’s hyper power-efficient passive backscatter wireless communication over standard Wi-Fi infrastructure with BrainChip’s Akida AKD1500 event-based AI processor. The integration provides a unique platform for anomaly detection, condition monitoring, and other sensor-intelligence tasks while operating on just microwatts of power.

BrainChip and HaiLa are teaming up to deliver smarter, ultra-low-power solutions for intelligent connected edge devices, making it easier to run AI at the edge without draining battery life. HaiLa’s BSC2000 is a Wi-Fi-compatible connectivity RFIC designed to showcase extreme power savings in IoT environments. When paired with Akida’s energy-efficient, event-driven AI compute, the result is a uniquely optimized approach.



“As a pioneer in neuromorphic computing, we are excited to partner with HaiLa to demonstrate how advanced low-power AI processing can work in tandem with ultra-efficient wireless connectivity,” says Steve Brightfield, CMO at BrainChip. “By combining our Akida technology with HaiLa’s innovative RF platform, we’re making intelligent, battery-powered edge sensors a practical reality.”

“Our collaboration with BrainChip brings together two power-conscious technologies that redefine what is possible at the edge,” adds Patricia Bower, Vice President of Product Management at HaiLa. “With backscatter Wi-Fi and neuromorphic AI operating on microwatts, developers can create continuously monitored, intelligent sensors that last for years without battery replacement. This is transformative for anomaly detection, predictive maintenance, and other real-time sensing applications.”

Founded in 2019, HaiLa is a fabless semiconductor and software company developing low-power multi-protocol radio communication for connected devices. Originally conceptualized at Stanford University, HaiLa enables pervasive edge AI and the scaling of battery-free IoT by offering extremely power-efficient wireless connectivity on industry standard wireless protocols like Wi-Fi, Bluetooth, and cellular. HaiLa solutions deliver unparalleled cost savings in battery maintenance, the ability to leverage existing wireless infrastructure, and support for efficient communication of locally inferenced AI/ML data directly on the device. Applications are focused on consumer electronics, smart buildings and logistics spaces, along with mobile, manufacturing, transportation, and medical markets.

BrainChip is a leading developer of Edge AI on-chip processing and learning. The company’s Akida processor is the first commercial, fully digital, event-based AI solution that mimics the way the brain analyzes data — processing only essential inputs with unmatched efficiency and speed. Akida supports Edge learning directly on the chip, without the need for cloud connectivity, providing significant advantages in latency, privacy, and energy consumption. Akida IP is ideal for integration into SoCs used in a wide range of real-world applications, from connected vehicles and consumer electronics to industrial automation and IoT sensors.

BrainChip (booth #919) and HaiLa (booth #738) are exhibiting at Sensors Converge 2025, from June 24-26 in Santa Clara, California.
www.haila.io | www.brainchip.com





HaiLa’s Founder and Chief Innovation Officer:




I wonder if our initial contact may have been through their partner UW (University of Washington) in Seattle. The UW Department of Electrical and Computer Engineering has been happily sending us summer interns for the past few years, and some of those decided to stay for longer: Both FNU Sidharth (who recently left our company to pursue a PhD at the University of Michigan) and Ritik Shrivastava (who is still working for us as an ML Solutions Architect) are UW alumni.

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-459653


Another highly entertaining eejournal.com article featuring BrainChip by Max Maxfield:



October 9, 2025

Bodacious Buzz on the Brain-Boggling Neuromorphic Brain Chip Battlefront

by Max Maxfield




December 2, 2025

AI at the Edge? Meet Wi-Fi in Microwatts

by Max Maxfield

I’ve just been introduced to a technology that has the potential to revolutionize battery- and ambient-powered AI-enabled sensors using Bluetooth or WiFi while consuming only 1/1000th the power of regular Bluetooth and WiFi implementations. Intrigued? Read on!

Before we plunge into the fray with gusto and abandon, indulge me, if you will. I read a lot of science fiction. One of the books I read recently stuck in my mind more than most. I’m talking about The Last Human by Zack Jordan. This is packed with myriad mindboggling ideas, all competing for the reader’s attention. The story is set in a far future dominated by “The Network.” This is a galaxy-spanning collective of millions of alien species connected through implants, sharing knowledge, culture, and peaceful coexistence.

The Network is an always-on super-AI-based information and governance layer that connects trillions of beings—biological and artificial—into a shared ecosystem. Every species and individual has an associated “tier” (a rating of intelligence/capability).
  • Tier 1 = barely sapient
  • Tier 2 & 3 = normal biological species
  • Tier 4 = superintelligences / AIs
  • Tier 5 = near-godlike entities
This tier system isn’t a neat numerical scale (although words like “exponential” and “logarithmic” do spring to mind), but rather a set of huge qualitative leaps in cognition, where each tier represents an entirely new class of mind rather than a simple multiplier.

Tier 4 beings and above—superintelligent artificial minds—don’t just use the Network; they live inside it. They link through high-bandwidth direct interfaces, shared computation layers, distributed processing, and collective awareness zones. Tier 4 minds can jointly reason across vast distances, spawn sub-processes, merge or split, run simulations, and influence the lower tiers indirectly through “guidance.”

Their thinking is faster than biological time, so from their perspective, the whole galaxy is a sort of real-time distributed AI environment. The AIs essentially govern the galaxy, though always under the Network’s overarching rules, and even they are constrained by the Network’s original safety protocols.

Biological species like our hero (the last human) do not think at the speed required to interact directly with the Network. They must use implants. For most species, this implant is installed early in life. It acts as a gateway between slow biological minds and the high-speed realm of the Network. Every citizen has a brain implant (or equivalent biological connector) that provides identity (the user’s presence on the Network authenticated at the neurological level), sensory overlays, translation, messaging, augmented reality (AR) guidance, behavioral nudges, and safety protocols.

The implant also includes a local, personal companion AI that converses with its user. This AI filters incoming Network data so you aren’t overwhelmed, prioritizes messages, translates context into your species’ cognition, and prevents dangerous sensory overload. It’s a bit like a hyper-advanced Siri/Cortana. It helps you recall memories, schedule things, guide your movement through AR, and “nudges” you when you’re about to do something suboptimal. It also behaves like what we currently think of as agentic AI, in that you can ask it to perform a task, such as locating specific information, and it will dispatch AI agents to perform your bidding.

Now, this is where things get interesting (and where we make a desperate attempt to connect to the core of this column). In the Network, everything that’s sentient enough to make a decision counts as a “mind,” no matter how tiny. This includes things such as doors, toilets, trash chutes, cleaning bots, ventilation controllers, safety monitors, light switches, and minor repair drones.

These micro-intelligences are often Tier 0.1 to Tier 0.2, which is barely above instinct. Think of a simple reflex loop with a personality veneer. Most are roughly equivalent to a loyal dog, a Roomba, or a VERY obedient child. They have enough cognition to perform their function, detect anomalies, follow Network safety laws, recognize a being’s tier, communicate danger, and ask for help when overwhelmed, but not enough to have ambitions or complex emotions.

However, because the Network demands universal safety and predictability, even these micro-intelligences get a unique identity, a sliver of cognition, a connection to the Network, obligations to follow rules, and the ability to “complain” if they feel they’re being mistreated. All this results in a universe where everything is watching, reporting, and quietly thinking.

I can easily envisage living in a world surrounded by intelligent sensors. In many cases, I’d welcome it (I’m thinking about all the water leaks, HVAC issues, and humidity-in-crawlspace problems we’ve been fighting in our house recently).
I think we are heading toward such a world, one shuffling step at a time. It seems like every day we hear about advances in software space (where no one can hear you scream), such as large language models (LLMs), generative AI, and agentic AI. We also hear about developments on the hardware side, including incredibly low-power inference engines that can be deployed in battery- or ambient-powered nodes.

I’m thinking about technologies like BrainChip’s Akida, which is a digital, event-based, spiking neural network (SNN) neuromorphic processor (see Bodacious Buzz on the Brain-Boggling Neuromorphic Brain Chip Battlefront), and POLYN’s Neuromorphic Analog Signal Processing (NASP) technology, which is an analog non-spiking neuromorphic processor (see Analog Neuromorphic Processors for ASICs/SoCs Offer Microwatt Edge AI).

What we are talking about here is the ability to perform inference operations at the edge with only microwatts of power. These sensors and inference engines employ all sorts of tricks, like spending most of their time in sleep mode, waking only occasionally to take a reading and report back to “head office” via a local router.

This all sounds great if you speak quickly and gesticulate furiously. What I’ve never really considered is the power budget associated with the wireless communication component. A quick “back of the envelope” ponder reveals that if we assume an AI inference in the 10–100 microwatt (10⁻⁵ to 10⁻⁴ W) range, then a Bluetooth radio transmission of 10–50 milliwatts (10⁻² to 5×10⁻² W) consumes roughly 1,000X to 5,000X more power than a 10-microwatt on-sensor AI. Similarly, a Wi-Fi transmission of 200 to 1,000+ milliwatts (2×10⁻¹ to 1 W) consumes roughly 20,000X to 100,000X more power than the on-sensor AI.

This power gap is ridiculous. It makes my eyes water just to think about it (I’m not crying; I just have something in my eye).
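
If you want to redo my envelope math, here is a minimal sketch in Python. The wattage ranges are the assumptions stated above (not measurements), with 10 microwatts taken as the AI reference point; the exact multipliers shift depending on which endpoints you pair.

```python
# Back-of-the-envelope check of the radio-vs-AI power gap.
# All figures are assumptions from the text above, not measurements.

AI_REF_W = 10e-6                   # 10 uW reference for on-sensor inference
RADIOS_W = {
    "Bluetooth": (10e-3, 50e-3),   # 10-50 mW per transmission
    "Wi-Fi":     (200e-3, 1.0),    # 200-1000+ mW per transmission
}

for name, (lo, hi) in RADIOS_W.items():
    print(f"{name}: {lo / AI_REF_W:,.0f}x to {hi / AI_REF_W:,.0f}x the AI power")
# -> Bluetooth: 1,000x to 5,000x
# -> Wi-Fi: 20,000x to 100,000x
```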

All of which leads us to the point of this column (yes, of course there’s a point; did you ever doubt me?). I was just chatting with Patricia Bower, VP of Product Management at HaiLa Technologies.

After the customary pleasantries (everyone seems to love my Hawaiian shirts), the first thing Patricia told me was, “We’re a radio semiconductor company.” The second thing she told me is that the real barrier to ubiquitous IoT isn’t sensors, processors, or even the AI—it’s the cost of pushing bits through the air. Although she didn’t use these exact words, the gist was that “Radios are the long pole in the IoT tent, and HaiLa’s mission is to shrink that pole down to the size of a cocktail toothpick.”

HaiLa’s secret sauce is a clever combination of bi-static passive backscattering and ultra-low-power active transmit, applied not to exotic new protocols but to the existing global wireless workhorses: Bluetooth Low Energy (BLE) and Wi-Fi. These technologies are everywhere—phones, laptops, access points—so instead of reinventing the wireless stack, HaiLa found a way to “ride on top of it” while using almost no energy.

The best analogy—and the one HaiLa themselves use—is the laser-and-mirror story. Generating a laser beam is expensive in terms of power, but holding up a mirror and wiggling it to modulate the beam as it passes by is practically free. Traditional radios are the “lasers” in our analogy. They generate power-hungry RF waveforms. HaiLa’s devices behave more like “mirrors,” reflecting and subtly modulating existing Wi-Fi or BLE signals as they zip by.


This is the essence of passive backscatter: the HaiLa device doesn’t synthesize its own carrier; it simply changes the impedance of its antenna so that the reflected signal carries new information. It’s the same principle used in RFID, except HaiLa adapts it to mainstream protocols like Wi-Fi and BLE, which is a giant leap forward because RFID requires special readers, whereas Wi-Fi and BLE devices are already everywhere.



HaiLa passive backscattering on Wi-Fi (Source: HaiLa)

HaiLa’s flagship technique is bi-static passive backscattering on Wi-Fi. One Wi-Fi device transmits a “blank” frame, and another Wi-Fi device receives the reflected frame. The HaiLa-equipped sensor simply detects a particular waveform from the 802.11 spec (specifically the old 1–2 Mbps DSSS format that still exists in all modern access points), embeds sensor data into the frame, and reflects it toward a different receiver.

Because the reflected signal is extremely weak, HaiLa uses a clever 50 MHz channel shift to move the backscattered frame away from the original downlink channel. This provides the receiver with sufficient adjacent-channel rejection to detect the whisper-quiet reflection. None of this requires HaiLa to generate its own RF carrier, which is why power consumption is microscopic.
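
The “wiggling mirror with a frequency offset” idea can be sketched numerically. The toy model below is mine, not HaiLa’s: it uses scaled-down frequencies, an ideal ±1 reflection coefficient, and none of the real PHY details. It only illustrates why toggling the antenna impedance at f_shift relocates the reflected energy to f0 ± f_shift, away from the original channel.

```python
# Toy baseband model: the tag synthesizes no carrier of its own; it just
# multiplies the incident wave by a reflection coefficient toggled at
# f_shift, which moves the reflected energy to f0 +/- f_shift.
import numpy as np

fs = 400.0                         # sample rate in MHz (scaled-down model)
t = np.arange(0, 50, 1 / fs)       # 20,000 samples
f0 = 20.0                          # incident carrier (stand-in for Wi-Fi)
f_shift = 50.0                     # tag's impedance-toggle rate

incident = np.cos(2 * np.pi * f0 * t)                    # the "blank" frame
mirror = np.where(np.cos(2 * np.pi * f_shift * t) >= 0, 1.0, -1.0)
backscatter = incident * mirror    # reflection off the wiggling mirror

spectrum = np.abs(np.fft.rfft(backscatter))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
top2 = sorted(float(f) for f in freqs[np.argsort(spectrum)[-2:]])
print(top2)                        # [30.0, 70.0]: f0 +/- f_shift, not f0
```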

With their current evaluation silicon, HaiLa can run a Wi-Fi-connected temperature-and-humidity sensor at ~5µW average power. Their production chip (taping out on GlobalFoundries’ 22FDX node) is expected to achieve 5–10µW average, which is low enough to run a sensor for 15–20 years on a single CR2032 battery cell, or even operate battery-free using harvested RF energy.
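
That lifetime claim survives a quick plausibility check, assuming a typical CR2032 datasheet rating of about 225 mAh at a nominal 3 V (my assumption, not a figure from HaiLa) and ignoring self-discharge and capacity derating:

```python
# Does a single CR2032 coin cell really last 15-20 years at 5-10 uW?
CAPACITY_AH = 0.225                        # ~225 mAh, typical datasheet value
VOLTAGE_V = 3.0                            # nominal cell voltage
energy_j = CAPACITY_AH * VOLTAGE_V * 3600  # ~2430 J stored

YEAR_S = 3600 * 24 * 365
for draw_w in (5e-6, 10e-6):
    years = energy_j / draw_w / YEAR_S
    print(f"{draw_w * 1e6:.0f} uW average draw -> {years:.1f} years")
# -> 5 uW average draw -> 15.4 years
# -> 10 uW average draw -> 7.7 years
```

So the 5 µW end of the range supports the 15-to-20-year figure; at 10 µW the same cell works out closer to eight years before real-world losses.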


Although the backscattering technique is awesome, some applications—think security sensors or event-driven alerts—need the device to speak first, not wait for a poll. For these devices, HaiLa has developed an extremely low-power active transmit mode. This still uses their custom PHY and protocol optimizations but generates a real RF signal, enabling asynchronous uplink for mere microwatts rather than milliwatts.

HaiLa’s production chip includes a RISC-V MCU, a hardware crypto accelerator (AES-128/256) for secure payloads, and flexible interfaces (SPI, I²C, analog). They’re also contributing heavily to the next IEEE 802.11 task group for Ambient IoT, which will eventually make these capabilities native to future Wi-Fi standards.

When radios no longer dominate the power budget, we will finally be in a position to deploy billions of sensors that run for decades—or forever—without battery replacement. Smart buildings, wearables, medical patches, smart cars, asset tags, and entirely new sensing categories will become feasible.

HaiLa isn’t just shrinking power consumption; they’re shrinking the cost of connecting the physical world. I think I’m ready for my implant now. How about you?
 

Frangipani

Top 20
A paper titled “A Bio-Inspired Hierarchical Temporal Defense for Securing Spiking Neural Networks Against Physical and Adversarial Perturbations” by London-based AI and Space expert Sylvester Kaczmarek (affiliated with Imperial College London’s Department of Computing; he is also the Founder & CEO of stealth-mode startup OrbiSky) was accepted for the NeurIPS 2025 workshop “Machine Learning and the Physical Sciences”, which will take place later this week (https://ml4physicalsciences.github.io/2025/).

The author used Akida hardware for inference.

His stealth-mode startup OrbiSky’s mission is “Transforming Aerospace with Autonomous AI”.


 

Frangipani

Top 20
Lyes Khacef’s former colleague Andreas Ziegler, who was a research scientist intern at SonyAI Switzerland from November 2023 to March 2024, has verifiably got first-hand experience with AKD1000 - remember that video of a Tübingen University robot playing table tennis? Andreas Ziegler (who is doing a PhD in Robotics and Computer Vision at Uni Tübingen) was most likely the video’s speaker (given his distinct Swiss German accent in English) as well as the accompanying research paper’s first author (published March 15, 2024):


https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-416900


A video going along with that paper was uploaded to YouTube yesterday:



Both paper and video relate to another paper and video published by the same Uni Tübingen authors earlier this year. At a cursory glance, at least the videos (posted about six months apart) appear to be VERY similar:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-416900




Now compare the slides to those in the video uploaded October 3:


In fact, when I just tried to cursorily compare the new paper to the March 15 paper that @Fullmoonfever had linked at the time (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-416313), I discovered that the link he had posted then now connects directly to this new paper, published on September 16, so it seems to be an updated version of the previous paper.

I did notice the addition of another co-author, though: Sebastian Otte, who used to be a PhD student and postdoc at Uni Tübingen (2013-2023) and became Professor at Uni Lübeck’s Institute for Robotics and Cognitive Systems just over a year ago, where he heads the Adaptive AI research group.




To put into perspective the finding that our competitors’ neuromorphic offerings fared worse than Akida in the benchmarking tests:
In all fairness, it should be highlighted that Akida’s superiority was at least partly due to the fact that AKD1000 is available as a PCIe board, whereas SynSense’s DynapCNN was connected to the PC via USB and - as the excerpt Gazzafish already posted shows - the researchers did not have direct access to a Loihi 2 edge device, but only to a virtual machine provided by Intel via their Neuromorphic Research Cloud. The benchmarking would obviously yield more comparable results if the actual hardware used were of a similar form factor:

“Our results show that the better a neuromorphic edge device is connected to the main compute unit, e.g., as a PCIe card, the better the overall run-time.”


Anyway, Akida undoubtedly impressed the researchers, and as a result they are considering further experiments: “(…) future work could involve evaluating the system with an additional Akida PCIe card.”
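
A crude timing model shows why form factor mattered so much. The link latencies below are my own illustrative assumptions about typical interconnect round-trips, not numbers from the paper; the point is simply that once on-chip compute is around a millisecond per frame, the connection to the host dominates the total.

```python
# Per-frame time = link round-trip + on-chip compute. With comparable
# compute across devices, the interconnect decides the ranking.
# All latencies are illustrative assumptions, not measured values.
LINK_RTT_S = {
    "PCIe card (like AKD1000)": 5e-6,    # DMA over PCIe: microseconds
    "USB peripheral":           1e-3,    # USB stack: around a millisecond
    "remote cloud VM":          50e-3,   # network round-trip: tens of ms
}
COMPUTE_S = 1e-3   # assume ~1 ms of neuromorphic compute per frame

for link, rtt in LINK_RTT_S.items():
    total = rtt + COMPUTE_S
    print(f"{link:26s} ~{total * 1e3:5.2f} ms/frame ({rtt / total:5.1%} spent on I/O)")
# The PCIe device is compute-bound; the USB and cloud paths are I/O-bound.
```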




In an earlier post (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-426404), I had already mentioned that the paper’s first author, Andreas Ziegler, who is doing a PhD in robotics and computer vision at Uni Tübingen, has meanwhile completed his internship at Sony AI in Switzerland (that - as we know - partially funded the paper’s research):



Fun fact: One of his co-authors, Karl Vetter, however, is no longer with Uni Tübingen’s Cognitive Systems Lab, but has since moved to France, where he has been working as a research engineer for…

🥁 🥁 🥁 Neurobus for the past three months!
It’s a small world, isn’t it?! 😉




We’ve known since March 2024 that Sony AI (Switzerland) partially funded research done at Uni Tübingen that involved event cameras and neuromorphic processors. Remember the video with the table tennis-playing robot arm?

The associated paper titled “Detection of Fast-Moving Objects with Neuromorphic Hardware” - first published in March 2024 and revised in September 2024 - described experiments with an event-based camera and three neuromorphic processors: Akida, DynapCNN (SynSense) and Loihi.

The paper’s first author was Andreas Ziegler, a PhD Candidate in Robotics and Computer Vision in collaboration with the Cognitive Systems Group at the University of Tübingen and Sony AI. From November 2023 to March 2024 he was a research intern at Sony AI’s Europe office in Schlieren near Zurich.

Paper and video can be found here:


Akida came out on top in the benchmarking at the time - however, the Sony AI researchers emphasised that the neuromorphic processors’ different form factors had also influenced the results. I commented on this last October:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-437994





So does the fact that Akida fared best in the above benchmarking mean that Sony now only has eyes for BrainChip with regard to neuromorphic computing and is no longer interested in what our competitors have to offer?

Not at all.

Sony researchers based in Switzerland and Japan have just published two papers about a prototype and a proof-of-concept utilising neuromorphic hardware developed by SynSense (“Realizing Fully-Integrated, Low-Power, Event-Based Pupil Tracking with Neuromorphic Hardware”, based on the commercially available Speck SoC) and by Intel (“Privacy-preserving fall detection at the edge using Sony IMX636 event-based vision sensor and Intel Loihi 2 neuromorphic processor”), respectively:





Even Alf Kuchenbuch gave Lyes Khacef’s post a 👍🏻 (Khacef has in turn liked a number of BrainChip posts over the years).

Also note the comment by Mike Davies.



It would be interesting to know what Mr Lewis is referring to when he mentions “LLM on your wristwatch” in his latest blog.
 
It would be interesting to know what Mr Lewis is referring to when he mentions “LLM on your wristwatch” in his latest blog.
Maybe it's this...

FF
https://www.embedded.com/ultra-low-...er-energy-efficient-intelligence-at-the-edge/


“Both technologies are designed for standard CMOS processes, enabling cost-effective integration into System-on-Chip (SoC) designs. Early test chips have already demonstrated sub-100-microwatt operation in real-world conditions.”

I would think no genuine shareholder or investor would discount that HaiLa and Brainchip are indeed currently undertaking the engineering necessary to tape out a zero-energy device for smart wearables.
 

GazDix

Regular
When times were better, I was able to afford the Premium X subscription but never used it to research Brainchip, and my subscription has nearly run out, so I thought I might as well use SuperGrok while I have it (please ignore this if you don't like AI-generated stuff).

I asked it what companies could be linked with our latest patent that has just been published. Here is what it came up with:

Company Type: Defense & Intelligence Contractors
Relevance to Patent: High—Patent enables low-power, cloud-independent AI for tactical edge devices, like radar/signal processing and adaptive decision-making in contested environments. BrainChip's Akida is embedded in mission-critical systems for real-time learning.
Example Companies: Parsons Corporation (NYSE: PSN, via subsidiary Blue Ridge Envisioneering); Raytheon Technologies (RTX); Information Systems Laboratories (ISL)
Notes: Parsons' 2025 multi-year deal integrates AKD1500 for defense edge-AI platforms; Raytheon collaborates on Air Force radar contracts; ISL develops drone-to-large-system radar tech using Akida's neural adaptability.

Company Type: Aerospace & Space Agencies
Relevance to Patent: Medium-High—Supports on-chip neural adaptation for sensors in harsh, disconnected settings (e.g., space probes, avionics). Patent's dynamic libraries align with neuromorphic radar and autonomous systems.
Example Companies: NASA; Front Grade Geyser (space SoC firm); Collins Aerospace (RTX subsidiary)
Notes: NASA is a marquee customer for Akida pilots; Front Grade licenses BrainChip IP for space hardware; Collins' 2025 patent cites Akida for aerospace neuromorphic processing.

Company Type: Automotive & ADAS Suppliers
Relevance to Patent: Medium—Enhances event-based vision/audio processing for in-vehicle AI, where low-power neural reconfiguration reduces latency and energy use in self-driving tech.
Example Companies: Mercedes-Benz; Valeo
Notes: Both integrate Akida for ADAS (advanced driver-assistance systems); Valeo uses it for automotive sensors, leveraging dynamic functions for real-time adaptation.

Company Type: Semiconductor & IP Licensing Firms
Relevance to Patent: High—Patent protects core Akida architecture for SoC integrations, enabling licensees to embed dynamic neural libraries in custom chips for edge devices.
Example Companies: Renesas Electronics; Arm Holdings
Notes: Renesas licenses Akida IP for MCU/SoC lines; Arm partners on event-based compute for IoT/connected cars, amplifying the patent's reconfigurability.

Company Type: Robotics & Industrial Automation
Relevance to Patent: Medium—Facilitates autonomous learning in robots for manufacturing/logistics, where patent tech allows on-device neural updates without retraining.
Example Companies: MYWAI
Notes: MYWAI embeds Akida in AI-driven robots for healthcare/industrial tasks, directly benefiting from adaptive neural functions.

Company Type: IoT & Distribution/Development Platforms
Relevance to Patent: Low-Medium—Broadens access to patent-protected tech for prototyping edge AI apps in consumer/IoT, via dev kits that demo dynamic libraries.
Example Companies: DigiKey Electronics
Notes: Major distributor of Akida boards/kits to developers, educators, and makers—accelerates adoption of the patent's features in custom IoT solutions.

Company Type: Healthcare & Wearables Innovators
Relevance to Patent: Low—Patent's efficiency suits battery-constrained devices for bio-signal analysis, like seizure detection via adaptive neural processing.
Example Companies: Ensor
Notes: Collaborates on AI glasses for epilepsy monitoring, using Akida's on-chip learning to dynamically tune neural functions.

Most of the results are already partners, which makes sense.

Reading the language used, like NASA being a "marquee" customer, really reminds me that we have very little investor promotion or marketing in place.

I asked it, now that this latest patent has been published, what the usual process is for IP contracts and the like.

Answer:

Why This Is a "Usual" Process

In the AI/semiconductor space, patent publications serve as milestones that:
  • Attract Partners/Customers: They demonstrate defensible innovation (e.g., Akida's on-chip adaptive learning), appealing to risk-averse sectors like defense and automotive. For instance, BrainChip's 2023 US patent grant (US 11,709,148) on similar reconfigurability tech triggered an 11% stock surge and preceded a wave of integrations.
  • Enable Licensing/Royalties: Patents like this one protect core Akida features, facilitating IP deals that convert to revenue-sharing contracts.
  • Boost Visibility: Publications often coincide with events (e.g., BrainChip's AKD1500 launch at Embedded World North America on November 4, 2025), where demos lead to follow-on agreements.
  • Historical Precedent: Across tech firms (e.g., Nvidia's GPU patents correlating with hyperscaler deals), ~60-70% of IP expansions precede partnership announcements within 3-6 months, per industry analyses. For BrainChip, this has held true—e.g., early patents (2019-2023) paved the way for marquee ties like NASA and Renesas.
However, it's not always linear: Market timing, execution, and broader AI hype (e.g., edge vs. cloud shifts) play roles. BrainChip's focus on low-power edge AI positions it well amid rising demand for sustainable tech.

MYWAI and Ensor seem like longer term plays:


Company: Mywai (EU-based AIoT platform for edge intelligence in robotics/industrial sectors)
Link to BrainChip: Strategic partnership for next-gen edge AI solutions.
Details: Integrates Akida with Mywai's Equipment-as-a-Service (EaaS) platform for real-time AI in manufacturing, logistics, energy, and healthcare. Enables cloud-independent processing of multimodal sensor data (e.g., vision, audio) with MLOps and blockchain for EU-compliant trustworthy AI. Aimed at robotic fleets and predictive maintenance. No financial terms disclosed; focuses on joint customer deployments.
Timeline & Potential Event Tie: Announced January 16, 2024. No specific event mentioned—likely direct negotiation. Mywai's partners (e.g., Mitsubishi Electric, Hitachi Rail) overlap with BrainChip's industrial ecosystem but no tripartite event.

Company: Ensor (UK-based developer of AI-powered glasses for epilepsy monitoring)
Link to BrainChip: Collaboration for epileptic seizure-detecting wearables.
Details: Uses Akida for on-device bio-signal analysis (e.g., EEG/audio) in smart glasses, enabling low-power, adaptive neural processing for real-time seizure alerts without cloud dependency. Part of BrainChip's MedTech pilots, emphasizing battery efficiency for wearables.
Timeline & Potential Event Tie: Ongoing since ~2024 (exact announcement date not specified in records). BrainChip's site highlights it as a "client challenge" success. No event link; appears R&D-driven. Ensor's healthcare focus aligns thematically with Mywai's MedTech angle but no crossover.
 

7für7

Top 20
(Awkward Cricket GIF)
 