BRN Discussion Ongoing

Frangipani

Top 20
First BRN press release I am aware of that was written by someone from Bospar Communications, the Public Relations & Marketing Agency that was recently hired:


Launch of BrainChip Developer Hub Accelerates Event-Based AI Innovation on Akida™ Platform with Release of MetaTF 2.13

NEWS PROVIDED BY
Brainchip
June 19, 2025, 06:21 GMT



BrainChip announces new Developer Hub and MetaTF toolkit, enabling seamless development and deployment of machine learning models on its Akida™ platform.

LAGUNA HILLS, CA, UNITED STATES, June 19, 2025 /EINPresswire.com/ --

BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, event-based brain-inspired AI, today announced the release of MetaTF 2.13 on its newly launched Developer Hub, a comprehensive portal designed to accelerate AI development on the Akida™ platform. The BrainChip Developer Hub serves as a centralized resource for developers building intelligent edge applications, providing access to tools, pre-trained models, technical documentation, and the company’s MetaTF toolkit. MetaTF 2.13 features seamless conversion, quantization, and deployment of machine learning models on Akida. It is compatible with leading frameworks including Keras and ONNX with support for Jupyter Notebooks, enabling rapid prototyping and optimization.
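For readers who haven't touched MetaTF before, here is a minimal sketch of that convert/quantize/deploy flow, assuming a trained Keras model. The package names (quantizeml, cnn2snn, akida) follow BrainChip's publicly documented tooling, but exact function signatures and defaults may differ between releases, so treat it as illustrative rather than copy-paste.

```python
# Illustrative sketch only: package names follow the MetaTF tooling named in the
# press release (QuantizeML, CNN2SNN, Akida); exact signatures may vary by version.
from tensorflow import keras
from quantizeml.models import quantize, QuantizationParams  # quantization toolkit
from cnn2snn import convert                                 # Keras/ONNX -> Akida converter
import akida

# 1. Start from a trained Keras model (hypothetical file name).
model = keras.models.load_model("gesture_model.h5")

# 2. Quantize weights/activations to Akida-compatible bit widths.
qparams = QuantizationParams(weight_bits=4, activation_bits=4)
model_q = quantize(model, qparams=qparams)

# 3. Convert the quantized model into an event-based Akida model.
model_akida = convert(model_q)
model_akida.summary()

# 4. Map onto hardware if an Akida device (e.g. AKD1000) is present;
#    otherwise inference runs in the software backend.
devices = akida.devices()
if devices:
    model_akida.map(devices[0])

# 5. Inference on uint8 input tensors of shape (N, H, W, C):
# predictions = model_akida.predict(x_test)
```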

“We created the Developer Hub to streamline the experience for edge AI developers and give them the tools to move from concept to deployment quickly,” said Sean Hehir, CEO of BrainChip. “With our Akida processor, highly intuitive software stack, and world-class models, we’re delivering solutions that are both high-performing and energy-efficient.”

As part of this launch, BrainChip introduced two high-efficiency models optimized for edge performance. The eye-tracking model is ideal for smart glasses and wearable devices, delivering over 99% accuracy. Built on BrainChip’s proprietary Temporal Event-based Neural Networks (TENNs), it offers real-time gaze detection while dramatically reducing power consumption by processing only motion-relevant data.

The gesture recognition model is designed for embedded applications in consumer electronics, robotics, and IoT and achieves 97% accuracy. By leveraging Akida’s event-based processing and high-speed vision sensors from event-based cameras, it enables ultra-low latency gesture interfaces without sacrificing precision.

These models demonstrate the power of Akida’s event-based architecture across a wide array of real-world applications including autonomous vehicles, industrial automation, AR/VR and spatial computing, smart environments and IoT, and security and surveillance.

BrainChip’s new Developer Hub and AI models underscore the company’s commitment to making edge AI more accessible and scalable. With Akida, developers can build responsive, privacy-aware applications that operate at ultra-low power—ideal for battery-constrained and latency-sensitive environments.

Developers can access the models and tools today by visiting: https://developer.brainchip.com

About BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY)
BrainChip is the global leader in Edge AI on-chip processing and learning. The company’s first-to-market, fully digital, event-based AI processor, Akida™, uses neuromorphic principles to mimic the human brain. By analyzing only the essential sensor inputs at the point of acquisition, Akida delivers data processing with unmatched efficiency, precision, and energy savings. Integrated into SoCs on any digital process technology, Akida Neural Processor IP has demonstrated significant advantages across today's workloads and networks. It provides a platform for developers to build, fine-tune, and run their models using standard AI tools such as TensorFlow and Keras.

BrainChip’s Temporal Event-based Neural Networks (TENNs) build on state space models (SSMs) by introducing a time-sensitive, event-driven processing framework that enhances efficiency and makes them ideal for real-time, streaming Edge applications. By enabling efficient computation with optimized models and hardware execution, BrainChip makes real-time streaming Edge AI universally deployable across industries such as aerospace, autonomous vehicles, robotics, mobile, consumer electronics, and wearable technology. BrainChip is leading the way toward a future where ultra-low power, on-chip AI near the sensor not only transforms products but also benefits the planet. Learn more at www.brainchip.com.

Follow BrainChip on Twitter: @BrainChip_inc
Follow BrainChip on LinkedIn: BrainChip LinkedIn

Madeline Coe
Bospar Communications
+1 224-433-9056

maddie@bospar.com

Legal Disclaimer:
EIN Presswire provides this news content "as is" without warranty of any kind. We do not accept any responsibility or liability for the accuracy, content, images, videos, licenses, completeness, legality, or reliability of the information contained in this article. If you have any complaints or copyright issues related to this article, kindly contact the author above.
 
  • Like
  • Fire
  • Love
Reactions: 27 users
I'm out if there's another cap raise

I don’t think we’ve finished raising funds from the last one, and at the current SP it’s probably going to fall well short of what the company hoped for, I guess. So another raise will be due, with absolutely no sign of the decent announcement we all hoped for.
 
  • Like
  • Sad
Reactions: 6 users

Frangipani

Top 20
On a brighter note, BrainChip will be attending four large-scale events later this month:

View attachment 86433

We’d already found out about three of the four events prior to the newsletter release. Here is some more info about the fourth one, the Living Planet Symposium in Vienna, organised by ESA.

Douglas McLelland and Gilles Bézard will be representing BrainChip with a talk titled “Event-driven computation and sparse neural network activity deliver low power AI” as part of the session Orbital Intelligence for Earth Observation applications: The edge of AI In Space, chaired by Gabriele Meoni from ESA, who has first-hand experience with AKD1000 (cf. https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-462334).


View attachment 86435

View attachment 86434 View attachment 86436 View attachment 86437 View attachment 86438 View attachment 86439 View attachment 86440

We’d already found out that Douglas McLelland and Gilles Bézard will be representing BrainChip at the upcoming Living Planet Symposium in Vienna (organised by ESA) thanks to a reference to that conference in the latest BRN newsletter (see my 8 June post for more details 👆🏻).

That event in Austria’s capital now also shows up on our website under “What’s New”:


 
  • Like
  • Love
  • Fire
Reactions: 16 users

DK6161

Regular
You were led to believe in the company?

By whom? (Be specific.)

When you invest in a company, you generally do some kind of research on it before you invest.

Did YOU do this? Then, when YOU decided to throw money at it, did YOU continue to do research and keep updated on what's going on?

You say you no longer believe that the company is going to succeed. So obviously YOU have seen something that has brought YOU to that conclusion.

Notice how I emphasised the word YOU there. That implies that it is YOU who made the decision to firstly research what the company does, then YOU decided to invest, then YOU continued to research and follow the progress of the company, and now YOU find yourself no longer believing in what the company is trying to achieve, for reasons XYZ.

To be honest with you, if I were in your shoes I know exactly what I would do next. I'm not going to tell you what that is, because that is a decision that ONLY YOU can make.

For the record, we all make bad decisions, and anyone who says otherwise is flat out lying. I've sold shares in two companies this year: one because I bought the hype and got my ass handed to me, and the other, funnily enough, I should have held on to, as it has now done a 3x from my original buy price. I sold it at a 150% profit.

I'm still invested in Brainchip because I still firmly believe that they will be successful. That is MY BELIEF. Taking a bit longer than I expected, but hey, I can't control that.

1. We all know the number 1 fan boy who made the company look like a "no brainer" to invest in and like it was about to go to the moon.
2. Look at previous AGMs where the CEO said there was going to be an explosion of revenue.

But yes, I made the decision to invest based on the hype.
Thanks for pointing that out.
 
  • Like
  • Sad
  • Fire
Reactions: 5 users

DK6161

Regular
I'm out if there's another cap raise
Like @AusEire said.
Don't blame the company, mate. It is your fault for throwing money at it.
 
  • Like
  • Haha
  • Fire
Reactions: 4 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Back in January, after listening to a podcast featuring Steven Brightfield, I mentioned that I had the impression he may have been hinting at Meta as one of the companies involved in discussions around smart glasses (as above).

So, it’s with great interest that I’ve seen reports today about a new collaboration between Meta and Oakley on a sports-focused line of smart glasses, with an official announcement reportedly scheduled for Friday, 20 June.

The partnership builds on the existing relationship between Oakley and Ray-Ban under their parent company, EssilorLuxottica, which already works closely with Meta on the Ray-Ban Meta smart glasses.

I’ll definitely be tuning into this announcement to learn more. Early reports suggest the glasses are being optimised for sports and athletic use, with a focus on hands-free video recording capabilities. The announcement will also likely reveal specific features, pricing, and availability for the glasses.

Over the past few days, I’ve noticed a recurring theme across industry commentary - battery life continues to be a major challenge for smart glasses, particularly when video capture is involved, which significantly increases power consumption.

Thanks to our collaboration with Onsor, we know BrainChip’s Akida technology can support all-day battery life without the need for external battery packs or cloud connectivity. That’s a capability that most players in this space are still chasing.

If Meta plans to push harder into high-performance, video-centric smart glasses, it seems likely they would need a low-power solution like ours to get them across the line. Edit: As pointed out to me by @Diogenese, monitoring for early signs of a seizure, as in the Onsor glasses, is largely a passive function, similar to wake-word detection. In contrast, continuous video processing and classification is a significantly more active workload for any processor. So, while Akida with TENNs could theoretically help extend battery life, we’d need to wait for real-world performance data before drawing conclusions about its impact in such demanding use cases.

And - whether it’s meaningful or not - we’ve seen a few likes from Meta employees on LinkedIn.

Oh, and then there's also the small fact that Sean recently confirmed that the company manufacturing the glasses for Onsor is the same one making them for Meta: EssilorLuxottica.

So, I welcome you to draw your own conclusions.



Video start time: 14:26, when Sean mentions the Onsor frames are made by EssilorLuxottica, who also make the Meta frames.

View attachment 87214

An EXTRACT from this evening's press release:

Hopefully...🤞
 
Last edited:
  • Like
  • Fire
  • Thinking
Reactions: 15 users

manny100

Top 20
It's interesting how AKIDA and TENNs coexist. Bravo's post above concerning Onsor's epilepsy detectors is a good example: AKIDA, the hardware, processes the signals, and the TENNs model (with perhaps software designed by Onsor?) does the thinking and tells AKIDA what response to provide, e.g. "seizure within an hour".
They are complementary: TENNs are the brains, and Akida is the body.
Another example: if I were hiking in bush I'd never been to before, I could use a drone for forward navigation.
If for some reason I wanted to communicate with the drone via hand signals (e.g. a raised arm means "return"), a camera or other suitable sensor would pick up my arm motions.
AKIDA provides the cloud-free, event-based, low-power, on-chip learning platform.
TENNs do the 'thinking'.
Akida collects and pre-processes that event data in real time, say, my arm moves upward.
It then runs the TENNs model internally to interpret what that movement means, perhaps my gesture for "come back and hover".
Then Akida enables the appropriate response: the drone returns and hovers.
It's all in real time at low power. Traditional edge AI would run out of power in no time.
That is, very basically, how AKIDA and TENNs run together.
The key is that AKIDA is the hardware, and it needs software, e.g. a TENNs model, to do the 'thinking'.
Easy to see why tiny Akida Pico and TENNs are a good pair.
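Purely for illustration, here's a loose sketch of that drone gesture loop in Python. The event-camera and drone interfaces and the gesture labels are entirely hypothetical, invented for the example; only the division of labour (an Akida device running a TENNs gesture model at the edge, with no cloud round-trip) comes from the post above.

```python
# A loose sketch of the drone/hand-signal example above, not a real BrainChip
# workflow. The event-camera and drone interfaces and the gesture labels are
# hypothetical; only the idea of Akida hardware running a TENNs gesture model
# at the edge comes from the post.
import numpy as np

GESTURES = {0: "come_back_and_hover", 1: "land", 2: "follow_me"}  # hypothetical labels

def run_gesture_loop(event_camera, akida_model, drone):
    """Poll the event camera, classify gestures on-device, act on the result."""
    while drone.is_flying():                      # hypothetical drone API
        frame = event_camera.next_frame()         # sparse, motion-only events (hypothetical API)
        batch = np.expand_dims(frame, axis=0)     # add batch dimension
        potentials = akida_model.predict(batch)   # TENNs gesture model mapped onto Akida
        gesture = GESTURES.get(int(np.argmax(potentials)))
        if gesture == "come_back_and_hover":
            drone.return_and_hover()
        elif gesture == "land":
            drone.land()
        # Unknown or "follow_me" gestures: keep the current behaviour
```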
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 9 users

FiveBucks

Regular
I'm out if there's another cap raise
It is inevitable.

They said at the AGM they are aiming for $9 mill revenue this year. Our cash burn is significantly higher.
 
  • Like
  • Sad
  • Haha
Reactions: 7 users
New GitHub update 20 hours ago on Akida/CNN2SNN, including a TENNs release, modules and models etc. by the looks of it.

@Diogenese will probably know if there's anything unusual or new tucked in there.



Release 2.13.0-doc-1 (commit d8435c2) by ktsiknos-brainchip, 20 hours ago (Latest)

Upgrade to QuantizeML 0.16.0, Akida/CNN2SNN 2.13.0 and Akida models 1.7.0


Update QuantizeML to version 0.16.0

New features​

  • Added a bunch of sanitizing steps targeting native hardware compatibility:
    • Handle first convolution that cannot be a split layer
    • Added support for "Add > ReLU > GAP" pattern
    • Added identity layers when no merge layers are present after skip connections
    • BatchNormalisation layers are now properly folded in ConvTranspose nodes
    • Added identity layers to enforce layers to have 2 outbounds only
    • Handled Concatenate node with a duplicated input
  • Added support for TENNs ONNX models, which include sanitizing, converting to inference mode and quantizing
  • Set explicit ONNXScript requirement to 0.2.5 to prevent later versions that use numpy 2.x

Bug fixes​

  • Fixed an issue where calling sanitize twice (or sanitize then quantize) would lead to invalid ONNX graphs
  • Fixed an issue where sanitizing could lead to invalid shapes for ONNX Matmul/GEMM quantization

Update Akida and CNN2SNN to version 2.13.0

Aligned with FPGA-1679(2-nodes)/1678(6-nodes)​

New features​

  • [cnn2snn] Updated requirement to QuantizeML 0.16.0
  • [cnn2snn] Added support for ONNX QuantizedBufferTempConv and QuantizedDepthwiseBufferTempConv conversion to Akida
  • [akida] Full support for TNP-B in hardware, including partial reconfiguration with a constraint that TNP-B cannot be the first layer of a pass
  • [akida] Full support of Concatenate layers in hardware, feature set aligned on Add layers
  • [akida] Prevented the mapping of models with both TNP-B and skip connections
  • [akida] Renamed akida.NP.Mapping to akida.NP.Component
  • [akida] Improved model summary for skip connections and TNP-B layers. The summary now shows the number of required SkipDMA channels and the number of components by type.
  • [akida] Updated mapping details retrieval: model summary now contains information on external memory used. For that purpose, some C++/Python binding was updated and cleaned. The NP objects in the API have external members for memory.
  • [akida] Renamed existing virtual devices and added SixNodesIPv2 and TwoNodesIPv2 devices
  • [akida] Introduced create_device helper to build custom virtual devices
  • [akida] Mesh now needs an IP version to be built
  • [akida] Simplified model statistics API and enriched with inference and program clocks when available
  • [akida] Dropped the deprecated evaluate_sparsity tool
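Out of curiosity, here's a minimal sketch of how the virtual-device items above might be used. The device name TwoNodesIPv2 and the richer summary/statistics come from the notes themselves, but the constructor, file name and exact attributes are my assumptions, not a confirmed 2.13.0 API.

```python
# Sketch only: device and method names are taken from the 2.13.0 notes above,
# but constructors, paths and attributes here are assumptions, not verified API.
import akida

# Map a previously converted model onto one of the new virtual devices to check
# hardware compatibility without physical silicon.
device = akida.TwoNodesIPv2()          # assumed zero-argument helper
model = akida.Model("my_model.fbz")    # hypothetical path to a converted Akida model
model.map(device)

# Per the notes, the summary now reports SkipDMA channels, component counts by
# type and external memory usage.
model.summary()

# Simplified statistics API, enriched with inference/program clocks when available.
print(model.statistics)
```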

Update Akida models to 1.7.0

  • Updated QuantizeML dependency to 0.16.0 and CNN2SNN to 2.13.0
  • Sparsity tool name updated. It now returns Python objects instead of simply displaying data, and supports models with skip connections
  • Introduced tenn_spatiotemporal submodule that contains model definition and training pipelines for DVS128, EyeTracking and Jester TENNs models
  • Added creation and training/evaluation CLI entry points for TENNs

Introducing TENNs modules 0.1.0

  • First release of the package that aims at providing modules for BrainChip TENNs
  • Contains blocks of layers for model definition: SpatialBlock, TemporalBlock, SpatioTemporalBlock that come with compatibility checks and custom padding for Akida
  • The TemporalBlock can optionally be defined as a PleiadesLayer following https://arxiv.org/abs/2405.12179
  • An export_to_onnx helper is provided for convenience
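And a very rough sketch of what the new tenns_modules package might look like in use. SpatioTemporalBlock and export_to_onnx are named in the notes; the framework (I'm assuming PyTorch, given the ONNX export helper), the constructor arguments and the tensor layout are guesses on my part, not a documented API.

```python
# Sketch only: SpatioTemporalBlock and export_to_onnx are named in the release
# notes, but the framework (assumed PyTorch), constructor arguments and tensor
# layout below are all assumptions.
import torch
from tenns_modules import SpatioTemporalBlock, export_to_onnx

class TinyTenn(torch.nn.Module):
    """Toy spatiotemporal classifier for event-camera clips (assumed N,C,T,H,W input)."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Argument names below are illustrative placeholders only.
        self.block = SpatioTemporalBlock(in_channels=2, out_channels=16)
        self.pool = torch.nn.AdaptiveAvgPool3d(1)   # collapse T, H, W
        self.head = torch.nn.Linear(16, num_classes)

    def forward(self, x):
        x = self.block(x)
        x = self.pool(x).flatten(1)
        return self.head(x)

model = TinyTenn()
# Export for the QuantizeML -> CNN2SNN -> Akida path discussed earlier in the thread.
export_to_onnx(model, "tiny_tenn.onnx")  # assumed (module, path) signature
```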

Documentation update

  • Added documentation for TENNs APIs, including tenns_modules package
  • Introduced two spatiotemporal TENNs tutorials
  • Updated model zoo page with mAP50, removed 'example' column and added TENNs
  • Added automatic checks for broken external links and fixed a few
  • Cosmetic changes: updated main logo and copyright to 2025
 
  • Like
  • Fire
  • Love
Reactions: 8 users

Diogenese

Top 20
Hmmmm...

Intel is overhauling its engineering leadership team in an effort to help with its AI comeback. New hires, including executives such as Jean-Didier Allegrucci (AI SoC) and Shailendra Desai (AI architecture), signal a pivot toward AI-first development, with neuromorphic computing potentially in the spotlight.



EXTRACT 1
View attachment 87300


EXTRACT 2
View attachment 87301







EXTRACT

View attachment 87302
In the news today: apparently Altman does not believe in the "once bitten, twice shy" adage.

When he first pledged $50M to Rain AI, their proposed product, confusing complexity with chaos, was a spaghetti bowl of "self-organizing" nanowires which were supposed to autonomously establish analog neuronal connexions.

A couple of years down the track, they abandoned this and switched to digital neurons. Now they appear to be working on a hybrid digital/analog approach with ADCs/DACs.

Still, as they say, another $150M is neither here nor there.

https://finance.yahoo.com/news/sam-...TPcs9cWc0tQ7adDpJ0Knav5r36EtB13T-ts0zmG9uIl0Z

Benzinga, 2025-05-24

Sam Altman's $150M AI Chip Bet Crashes: Rain AI Faces Sale As OpenAI, Nvidia, And Microsoft Circle The Wreckage

 
Last edited:
  • Like
  • Wow
  • Fire
Reactions: 4 users

Diogenese

Top 20
An EXTRACT from this evening's press release:

View attachment 87322

Hopefully...🤞

View attachment 87323
View attachment 87324
It will be interesting to see some power usage figures for the glasses.
 
  • Like
  • Fire
Reactions: 4 users

Frangipani

Top 20
The 40th Space Symposium in Colorado Springs is in full swing.

Josef Aschbacher, the European Space Agency’s Director General, reminisces about a warm summer night in his childhood that kindled the flame of curiosity in him and laid the foundation for his life-long fascination with space.

In his post, he also shares a link to ESA’s Strategy 2040, of which you’ll find a summary below:


View attachment 81805



(For the Strategy 2040 In Focus and Strategy 2040 In Depth versions, click here)

View attachment 81800 View attachment 81801 View attachment 81802 View attachment 81803 View attachment 81804

On Day 2 of the Paris Air Show 2025 at Le Bourget, ESA released Technology 2040, their vision for the next 15 years and beyond, which “fully supports and enhances” ESA’s Strategy 2040, published earlier this year 👆🏻.

Laurent Hili is named as one of the contact persons for the subtopic “AI-DRIVEN INNOVATIONS IN SPACE TECHNOLOGIES”.


Some excerpts (screenshots attached):
 

  • Like
  • Love
Reactions: 6 users