BRN Discussion Ongoing

That went straight on ignore.
Thought you meant me for a second, then I realized I wouldn’t be able to read this


[GIF attachment]
 
  • Haha
  • Like
Reactions: 4 users

AusEire

Founding Member.
1. We all know the No. 1 fanboy that made the company look like it was about to go to the moon.
2. Look at previous AGMs where the CEO said there was going to be an explosion of revenue.

But yes, I made the decision to invest based on the hype.
Thanks for pointing that out.
Wait! So you've just admitted to the world that you invested in a company because essentially one person, or in your words a "fan boy", hyped it up online 🤔

And to your second point. Eh no. The CEO (LDN, Peter or Sean) at no stage DURING ANY AGM in the time I've been invested said that "there was going to be an explosion of revenue". Obviously they want revenue to increase but they never said that.

Given the conservative approach to announcing shit, this simply never happened at an AGM. The only time I can remember a statement being made with regard to sales at an AGM was by Antonio, when he said that one contract could make Brainchip profitable overnight, and that was at the 2024 AGM.

What you are referring to is an interview PVDM did with Commsec where he stated that he was expecting an explosion of sales to happen in time. That quote might not be exact but it's roughly what was said.
 
  • Like
  • Thinking
  • Love
Reactions: 8 users

Diogenese

Top 20
New GitHub update 20 hrs ago on Akida/CNN2SNN, including the TENNs release, modules and models etc., by the looks of it.

@Diogenese will probs know more of anything unusual or new tucked in there.



20 hours ago · ktsiknos-brainchip · tag 2.13.0-doc-1 · commit d8435c2

Upgrade to Quantizeml 0.16.0, Akida/CNN2SNN 2.13.0 and Akida models 1.7.0 (Latest)

Update QuantizeML to version 0.16.0

New features

  • Added a bunch of sanitizing steps targeting native hardware compatibility:
    • Handle first convolution that cannot be a split layer
    • Added support for "Add > ReLU > GAP" pattern
    • Added identity layers when no merge layers are present after skip connections
    • BatchNormalisation layers are now properly folded in ConvTranspose nodes
    • Added identity layers to enforce layers to have 2 outbounds only
    • Handled Concatenate node with a duplicated input
  • Added support for TENNs ONNX models, which include sanitizing, converting to inference mode and quantizing
  • Set explicit ONNXScript requirement to 0.2.5 to prevent later versions that use numpy 2.x

Bug fixes

  • Fixed an issue where calling sanitize twice (or sanitize then quantize) would lead to invalid ONNX graphs
  • Fixed an issue where sanitizing could lead to invalid shapes for ONNX Matmul/GEMM quantization
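
For anyone wanting to poke at the new QuantizeML release, here is a minimal sketch of the usual quantize flow. The toy model and the bit widths are my own example values, not something taken from the notes above; check the QuantizeML docs for real settings:

```python
# Minimal sketch of the QuantizeML quantize flow (my own illustration, not from the release notes).
import tensorflow as tf
from quantizeml.models import quantize, QuantizationParams

# A toy float model purely for illustration.
float_model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(16, 3, padding="same"),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.ReLU(),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])

# Assumed example bit widths -- not taken from the notes above.
qparams = QuantizationParams(input_weight_bits=8, weight_bits=4, activation_bits=4)
quantized_model = quantize(float_model, qparams=qparams)
quantized_model.summary()
```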

Update Akida and CNN2SNN to version 2.13.0

Aligned with FPGA-1679 (2-nodes) / 1678 (6-nodes)

New features

  • [cnn2snn] Updated requirement to QuantizeML 0.16.0
  • [cnn2snn] Added support for ONNX QuantizedBufferTempConv and QuantizedDepthwiseBufferTempConv conversion to Akida
  • [akida] Full support for TNP-B in hardware, including partial reconfiguration with a constraint that TNP-B cannot be the first layer of a pass
  • [akida] Full support of Concatenate layers in hardware, feature set aligned on Add layers
  • [akida] Prevented the mapping of models with both TNP-B and skip connections
  • [akida] Renamed akida.NP.Mapping to akida.NP.Component
  • [akida] Improved model summary for skip connections and TNP-B layers. The summary now shows the number of required SkipDMA channels and the number of components by type.
  • [akida] Updated mapping details retrieval: model summary now contains information on external memory used. For that purpose, some C++/Python binding was updated and cleaned. The NP objects in the API have external members for memory.
  • [akida] Renamed existing virtual devices and added SixNodesIPv2 and TwoNodesIPv2 devices
  • [akida] Introduced create_device helper to build custom virtual devices
  • [akida] Mesh now needs an IP version to be built
  • [akida] Simplified model statistics API and enriched with inference and program clocks when available
  • [akida] Dropped the deprecated evaluate_sparsity tool
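
To make the device-side items above a bit more concrete, a rough sketch of converting a quantized model and mapping it onto one of the new virtual devices. TwoNodesIPv2 is named in the notes, but I'm assuming it is instantiated the same way as the older virtual devices, so treat this as illustrative only:

```python
# Rough sketch only: continues from the QuantizeML example further up.
# The TwoNodesIPv2() call is an assumption based on how earlier virtual devices were created.
import akida
from cnn2snn import convert

akida_model = convert(quantized_model)   # quantized_model from the QuantizeML sketch above

device = akida.TwoNodesIPv2()            # new virtual device named in the release notes
akida_model.map(device)                  # map the layers onto the device's NPs

# Per the notes, the summary now reports SkipDMA channels, component counts
# and external memory use for the mapped model.
akida_model.summary()
```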

Update Akida models to 1.7.0

  • Updated QuantizeML dependency to 0.16.0 and CNN2SNN to 2.13.0
  • Sparsity tool name updated. It now returns Python objects instead of simply displaying data, and supports models with skip connections
  • Introduced tenn_spatiotemporal submodule that contains model definition and training pipelines for DVS128, EyeTracking and Jester TENNs models
  • Added creation and training/evaluation CLI entry points for TENNs

Introducing TENNs modules 0.1.0

  • First release of the package that aims at providing modules for Brainchip TENNs
  • Contains blocks of layers for model definition: SpatialBlock, TemporalBlock, SpatioTemporalBlock that come with compatibility checks and custom padding for Akida
  • The TemporalBlock can optionally be defined as a PleiadesLayer following https://arxiv.org/abs/2405.12179
  • An export_to_onnx helper is provided for convenience

Documentation update

  • Added documentation for TENNs APIs, including tenns_modules package
  • Introduced two spatiotemporal TENNs tutorials
  • Updated model zoo page with mAP50, removed 'example' column and added TENNs
  • Added automatic checks for broken external links and fixed a few
  • Cosmetic changes: updated main logo and copyright to 2025
The reference to ReLU is interesting:

"
  • Added support for "Add > ReLU > GAP" pattern
"

There was a recent BRN talk (AL/JT ? maybe the Roadmap*?) which I understood to mean that ReLUs were replaced by LookUp Tables (LUTs) because ReLUs were very power hungry.


* Roadmap @ 7:30: https://brainchip.com/brainchip-technology-roadmap/ "activation function" in Akida 2
https://en.wikipedia.org/wiki/Activation_function
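
For anyone wondering what that "Add > ReLU > GAP" pattern actually is, it's just the tail of a residual block: the skip connection is added back in, passed through a ReLU and then globally average-pooled. A toy Keras version (my own sketch, not BrainChip code):

```python
# Toy illustration of the "Add > ReLU > GAP" pattern the sanitizer now handles.
import tensorflow as tf
from tensorflow.keras import layers

inp = layers.Input((32, 32, 3))
x = layers.Conv2D(16, 3, padding="same")(inp)
skip = x                                    # skip connection
x = layers.Conv2D(16, 3, padding="same")(x)
x = layers.Add()([x, skip])                 # Add ...
x = layers.ReLU()(x)                        # > ReLU ...
x = layers.GlobalAveragePooling2D()(x)      # > GAP
out = layers.Dense(10)(x)
model = tf.keras.Model(inp, out)
```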
 
Last edited:
  • Like
  • Fire
Reactions: 8 users

Cardpro

Regular
Wow, 99%! The only thing more accurate is our past guessing rate, where we've been proven (so far) to be nearly 100% wrong - Amazon's Alexa, Merc's MBOS, Valeo's Scala 3, Qualcomm Snapdragon, Nintendo Switch 2, and other tech with edge AI capabilities/low power consumption... lol

Hopefully we are finally in one!!!

Imo only dyor
An EXTRACT from this evening's press release.


[attached press release extract]

Hopefully...🤞

[attached screenshots]
 
  • Like
  • Haha
Reactions: 5 users
The reference to ReLU is interesting:

"
  • Added support for "Add > ReLU > GAP" pattern
"

There was a recent BRN talk (AL/JT ? maybe the Roadmap*?) which I understood to mean that ReLUs were replaced by LookUp Tables (LUTs) because ReLUs were very power hungry.


* Roadmap @ 7:30: https://brainchip.com/brainchip-technology-roadmap/ "activation function" in Akida 2
https://en.wikipedia.org/wiki/Activation_function
So given ReLU is power hungry & we replaced (as standard?) with LUT then is it possible someone specifically wants ReLU availability hence them still being listed in Git and docs?
 
  • Like
Reactions: 2 users

jrp173

Regular
Wait! So you've just admitted to the world that you invested in a company because essentially one person, or in your words a "fan boy", hyped it up online 🤔

And to your second point. Eh no. The CEO (LDN, Peter or Sean) at no stage DURING ANY AGM in the time I've been invested said that "there was going to be an explosion of revenue". Obviously they want revenue to increase but they never said that.

Given the conservative approach to announcing shit, this simply never happened at an AGM. The only time I can remember a statement being made with regard to sales at an AGM was by Antonio, when he said that one contract could make Brainchip profitable overnight, and that was at the 2024 AGM.

What you are referring to is an interview PVDM did with Commsec where he stated that he was expecting an explosion of sales to happen in time. That quote might not be exact but it's roughly what was said.

[attached screenshots]
 
  • Like
  • Love
  • Sad
Reactions: 10 users

Doz

Regular
The reference to ReLU is interesting:

"
  • Added support for "Add > ReLU > GAP" pattern
"

There was a recent BRN talk (AL/JT ? maybe the Roadmap*?) which I understood to mean that ReLUs were replaced by LookUp Tables (LUTs) because ReLUs were very power hungry.


* Roadmap @ 7:30: https://brainchip.com/brainchip-technology-roadmap/ "activation function" in Akida 2
https://en.wikipedia.org/wiki/Activation_function



ReLU strikes back …..


[attached screenshot]

 
  • Fire
  • Like
  • Love
Reactions: 5 users

Diogenese

Top 20
So given ReLU is power hungry & we replaced (as standard?) with LUT then is it possible someone specifically wants ReLU availability hence them still being listed in Git and docs?
The inner workings of Akida software are beyond my pay grade. However, the LUTs were introduced in Akida 2, probably as part of the 8 times efficiency improvement over Akida 1. If Akida 1 uses ReLU, this may be a tweak.

Further efficiency is provided in the secret sauce of the memory data movement patent applications which were in preparation at the time of the Roadmap broadcast.

My understanding* of the ReLU activation function comes from the Wiki page. The neuron output is modified by multiplying by the activation function. It could be a crude step function between -1 and +1, or the smooth S-shaped sigmoid transition, or some other comparable function:

https://en.wikipedia.org/wiki/Activation_function

* "understanding" is an exaggeration. "information about" is more accurate.
 
  • Like
  • Love
Reactions: 5 users

AusEire

Founding Member.
You've gone on a major-league highlighting rampage.

Saying "They are looking forward to" is a bit different to "there will be an explosion of sales".

Saying that "there will be" implies that it will happen.
 
  • Haha
Reactions: 2 users

Diogenese

Top 20
ReLU strikes back …..


[attached screenshot]
Hi Doz,

Great research,

According to the abstract, GELU and SiLU are more processor intensive.

The abstract compares ReLU with lemons, but we are using cherries.

If anything, this abstract presents a stronger case for LUTs compared with alternatives.
 
  • Like
  • Fire
Reactions: 4 users

Diogenese

Top 20
Share price brings me back to Coleridge again:
"Like a painted ship upon a painted ocean."
 
  • Haha
  • Like
Reactions: 3 users

Luppo71

Founding Member
  • Like
  • Haha
Reactions: 4 users

Diogenese

Top 20
So given ReLU is power hungry & we replaced (as standard?) with LUT then is it possible someone specifically wants ReLU availability hence them still being listed in Git and docs?
This is not necessarily the correct explanation, but I imagine they would be better off with a tailor-made LUT, one-and-done.

The ReLU can be represented as a graphical plot (X and Y axes), with unique Y values for each X value determined by the ReLU formula.

The issue with ReLU is that, each time it is used, it has to do the calculation from scratch, whereas the ReLU response can be mapped to a LUT once and for all, with no further calculation required.
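
That "compute once, look up forever" point is easy to sketch for a quantized input range. Assuming 8-bit activations (a toy example of the general idea, nothing to do with Akida's actual internals):

```python
# Toy illustration: replace a per-call activation computation with a
# precomputed lookup table over an 8-bit input range.
import numpy as np

def act(x):
    return np.maximum(0, x)                  # any activation would do; ReLU shown here

# Build the LUT once: one entry per possible int8 input value.
inputs = np.arange(-128, 128, dtype=np.int16)
lut = act(inputs).astype(np.int16)           # 256 precomputed outputs

# At inference time the activation becomes a table lookup, no arithmetic.
x = np.array([-5, 0, 7, 120], dtype=np.int8)
y = lut[x.astype(np.int16) + 128]            # shift index so -128 maps to slot 0
print(y)                                     # -> [0 0 7 120]
```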
 
  • Like
  • Love
Reactions: 3 users

neuramot

Emerged
I kept quiet about being scammed because I was embarrassed. But silence only made it worse. AptRecoup made it okay to speak up. They never judged me or minimized what happened. Instead, they gave me direction, support, and a voice. Their strength comes from experience—they’ve been through the same nightmare. That shared connection is what makes their work so powerful. If you’re feeling trapped by shame or doubt, I urge you to email s u p p o r t @ a p t r e c o u p . c o m . You don’t need to suffer in silence. They’re here, and they get it.
 

Esq.111

Fascinatingly Intuitive.
Evening neuromut,

Fuck off.

Esq.
 
  • Haha
Reactions: 6 users

miaeffect

Oat latte lover
I kept quiet about being scammed because I was embarrassed. But silence only made it worse. AptRecoup made it okay to speak up. They never judged me or minimized what happened. Instead, they gave me direction, support, and a voice. Their strength comes from experience—they’ve been through the same nightmare. That shared connection is what makes their work so powerful. If you’re feeling trapped by shame or doubt, I urge you to email s u p p o r t @ a p t r e c o u p . c o m . You don’t need to suffer in silence. They’re here, and they get it.
[attached screenshot]
 
  • Fire
  • Like
  • Love
Reactions: 6 users

Diogenese

Top 20
  • Haha
Reactions: 2 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
I kept quiet about being scammed because I was embarrassed. But silence only made it worse. AptRecoup made it okay to speak up. They never judged me or minimized what happened. Instead, they gave me direction, support, and a voice. Their strength comes from experience—they’ve been through the same nightmare. That shared connection is what makes their work so powerful. If you’re feeling trapped by shame or doubt, I urge you to email s u p p o r t @ a p t r e c o u p . c o m . You don’t need to suffer in silence. They’re here, and they get it.

Dear Numnut,

I would love to buy solar panels from you!

After receiving 459,567 phone calls from solar-panel-peddlers this year alone, I can happily report that your post on this forum has really captured my attention.

Something about you reeks of genuinity and I now regard you as someone I can really trust. Is it too early for me to call you my friend? Perhaps we will find something mutually beneficial in our burgeoning relationship, as I too am an entrepreneur.

Can you please divulge to me your mobile phone number as I’d dearly like, nay love❤️, to have a long conversation with you about solar panels and…. toilet paper, which is the topic closest to my heart and my wallet, since it is what my core business is about. And this is where I need you.

I have a stockpile of toilet paper being held in quarantine at the Docklands wharf. If you can guarantee me a winning deal on your solar panels, I will promise you a lifetime's supply of premium quality, 1-ply shit tickets.🚽

Offer ends soon.

This message will self destruct in 10 seconds.

Bravo 🧻
 
  • Haha
Reactions: 2 users