Pom down under
Top 20
That went straight on ignore.
Thought you meant me for a second, then I realized I wouldn’t be able to read this.
Wait! So you've just admitted to the world that you invested in a company because essentially one person, or in your words a "fan boy", hyped it up online.
1. We all know the no. 1 fan boy that made the company look like it was about to go to the moon.
2. Look at previous AGMs where the CEO said there was going to be an explosion of revenue.
But yes, I made the decision to invest based on the hype.
Thanks for pointing that out.
New GitHub update 20 hrs ago on Akida/CNN2SNN, including TENNs release, modules and models etc. by the looks.
@Diogenese will probs know more about anything unusual or new tucked in there.
Releases · Brainchip-Inc/akida_examples
Brainchip Akida Neuromorphic System-on-Chip examples and documentation. - Brainchip-Inc/akida_examples (github.com)
20 hours ago
ktsiknos-brainchip
2.13.0-doc-1
d8435c2
Upgrade to Quantizeml 0.16.0, Akida/CNN2SNN 2.13.0 and Akida models 1.7.0
Update QuantizeML to version 0.16.0
New features
- Added a bunch of sanitizing steps targeting native hardware compatibility:
- Handle first convolution that cannot be a split layer
- Added support for "Add > ReLU > GAP" pattern
- Added identity layers when no merge layers are present after skip connections
- BatchNormalisation layers are now properly folded in ConvTranspose nodes
- Added identity layers to enforce layers to have 2 outbounds only
- Handled Concatenate node with a duplicated input
- Added support for TENNs ONNX models, which include sanitizing, converting to inference mode and quantizing
- Set explicit ONNXScript requirement to 0.2.5 to prevent later versions that use numpy 2.x
Bug fixes
- Fixed an issue where calling sanitize twice (or sanitize then quantize) would lead to invalid ONNX graphs
- Fixed an issue where sanitizing could lead to invalid shapes for ONNX Matmul/GEMM quantization
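For anyone wondering what the "Add > ReLU > GAP" pattern in those sanitizing steps actually looks like, here's a minimal generic Keras sketch of that shape (my own illustration, not BrainChip code): a skip connection merged with an Add, followed by a ReLU and a global average pooling.

```python
# Generic Keras sketch (not BrainChip code) of the "Add > ReLU > GAP"
# pattern the QuantizeML 0.16.0 sanitizer now handles: a skip connection
# merged with Add, followed by ReLU, then global average pooling.
import tensorflow as tf
from tensorflow.keras import layers

inputs = tf.keras.Input(shape=(32, 32, 3))
x = layers.Conv2D(16, 3, padding="same")(inputs)
skip = x                                  # skip-connection branch
x = layers.Conv2D(16, 3, padding="same")(x)
x = layers.Add()([x, skip])               # Add: merge the skip connection
x = layers.ReLU()(x)                      # ReLU after the merge
x = layers.GlobalAveragePooling2D()(x)    # GAP
outputs = layers.Dense(10)(x)
model = tf.keras.Model(inputs, outputs)
model.summary()
```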
Update Akida and CNN2SNN to version 2.13.0
Aligned with FPGA-1679(2-nodes)/1678(6-nodes)
New features
- [cnn2snn] Updated requirement to QuantizeML 0.16.0
- [cnn2snn] Added support for ONNX QuantizedBufferTempConv and QuantizedDepthwiseBufferTempConv conversion to Akida
- [akida] Full support for TNP-B in hardware, including partial reconfiguration with a constraint that TNP-B cannot be the first layer of a pass
- [akida] Full support of Concatenate layers in hardware, feature set aligned on Add layers
- [akida] Prevented the mapping of models with both TNP-B and skip connections
- [akida] Renamed akida.NP.Mapping to akida.NP.Component
- [akida] Improved model summary for skip connections and TNP-B layers. The summary now shows the number of required SkipDMA channels and the number of components by type.
- [akida] Updated mapping details retrieval: model summary now contains information on external memory used. For that purpose, some C++/Python binding was updated and cleaned. The NP objects in the API have external members for memory.
- [akida] Renamed existing virtual devices and added SixNodesIPv2 and TwoNodesIPv2 devices
- [akida] Introduced create_device helper to build custom virtual devices
- [akida] Mesh now needs an IP version to be built
- [akida] Simplified model statistics API and enriched with inference and program clocks when available
- [akida] Dropped the deprecated evaluate_sparsity tool
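To put the CNN2SNN and device bullets in context, the usual MetaTF flow is: quantize the float model with QuantizeML, convert it with CNN2SNN, then map it onto a (virtual) device. A rough sketch is below. The quantize/convert calls follow BrainChip's published examples, but the TwoNodesIPv2 device name is lifted straight from these notes and its exact constructor is an assumption on my part, so treat this as a sketch only.

```python
# Rough sketch of the MetaTF flow these notes touch on: quantize with
# QuantizeML, convert with CNN2SNN, then map onto an Akida (virtual) device.
# Import paths follow BrainChip's published examples; the TwoNodesIPv2
# device below is named in the release notes but its exact API is assumed.
import tensorflow as tf
import akida
from cnn2snn import convert
from quantizeml.models import quantize, QuantizationParams

# Stand-in float model (in practice, a trained model or one from akida_models).
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, strides=2, input_shape=(32, 32, 3)),
    tf.keras.layers.ReLU(),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])

qparams = QuantizationParams(input_weight_bits=8, weight_bits=4,
                             activation_bits=4)
quantized = quantize(model, qparams=qparams)   # QuantizeML 0.16.0

akida_model = convert(quantized)               # CNN2SNN 2.13.0

device = akida.TwoNodesIPv2()                  # name per the notes; API assumed
akida_model.map(device)
akida_model.summary()   # per the notes, the 2.13.0 summary reports SkipDMA
                        # channels, component counts and external memory use
```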
Update Akida models to 1.7.0
- Updated QuantizeML dependency to 0.16.0 and CNN2SNN to 2.13.0
- Sparsity tool name updated. Now returns Python objects instead of simply displaying data, and supports models with skip connections
- Introduced tenn_spatiotemporal submodule that contains model definition and training pipelines for DVS128, EyeTracking and Jester TENNs models
- Added creation and training/evaluation CLI entry points for TENNs
Introducing TENNs modules 0.1.0
- First release of the package that aims at providing modules for BrainChip TENNs
- Contains blocks of layers for model definition: SpatialBlock, TemporalBlock, SpatioTemporalBlock that come with compatibility checks and custom padding for Akida
- The TemporalBlock can optionally be defined as a PleiadesLayer following https://arxiv.org/abs/2405.12179
- An export_to_onnx helper is provided for convenience
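Out of curiosity, here's what using the new tenns_modules package might look like. Big caveat: the class and helper names (SpatioTemporalBlock, export_to_onnx) come straight from the notes above, but the constructor arguments, the tensor layout and even the assumption that it's PyTorch-based are my guesses, so this is illustrative only.

```python
# Illustrative only: SpatioTemporalBlock and export_to_onnx are named in the
# release notes above, but the constructor arguments, tensor layout and the
# assumption that the package is PyTorch-based are guesses, not documented API.
import torch
from tenns_modules import SpatioTemporalBlock, export_to_onnx

# Hypothetical: one spatiotemporal block over an event-style input of shape
# (batch, channels, time, height, width).
block = SpatioTemporalBlock(in_channels=2, out_channels=16)   # args assumed
x = torch.randn(1, 2, 10, 64, 64)
y = block(x)
print(y.shape)

# The notes mention an export_to_onnx convenience helper.
export_to_onnx(block, x, "tenn_block.onnx")                   # signature assumed
```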
Documentation update
- Added documentation for TENNs APIs, including tenns_modules package
- Introduced two spatiotemporal TENNs tutorials
- Updated model zoo page with mAP50, removed 'example' column and added TENNs
- Added automatic checks for broken external links and fixed a few
- Cosmetic changes: updated main logo and copyright to 2025
An EXTRACT from this evening's press release.
View attachment 87322
Launch of BrainChip Developer Hub Accelerates Event-Based AI Innovation on Akida™ Platform with Release of MetaTF 2.13
BrainChip announces new Developer Hub and MetaTF toolkit, enabling seamless development and deployment of machine learning models on its Akida™ platform. (www.einpresswire.com)
Hopefully...
View attachment 87323
View attachment 87324
The reference to reLU is interesting:
- Added support for "Add > ReLU > GAP" pattern
There was a recent BRN talk (AL/JT ? maybe the Roadmap*?) which I understood to mean that ReLUs were replaced by LookUp Tables (LUTs) because ReLUs were very power hungry.
* Roadmap @ 7:30: https://brainchip.com/brainchip-technology-roadmap/ "activation function" in Akida 2
https://en.wikipedia.org/wiki/Activation_function
So given ReLU is power hungry & we replaced (as standard?) with LUT, then is it possible someone specifically wants ReLU availability, hence them still being listed in Git and docs?
Wait! So You've just admitted to the world that you invested in a company because essentially one person or in your words "fan boy" hyped it up online
And to your second point. Eh no. The CEO (LDN, Peter or Sean) at no stage DURING ANY AGM in the time I've been invested said that "there was going to be an explosion of revenue". Obviously they want revenue to increase but they never said that.
Given the conservative approach on announcing shit, this simply never happened at an AGM. The only time that I can remember a statement being made with regards to sales at an AGM was by Antonio, when he said that one contract could make Brainchip profitable overnight, and that was said at the 2024 AGM.
What you are referring to is an interview PVDM did with Commsec where he stated that he was expecting an explosion of sales to happen in time. That quote might not be exact but it's roughly what was said.
The reference to reLU is interesting:
- Added support for "Add > ReLU > GAP" pattern
There was a recent BRN talk (AL/JT ? maybe the Roadmap*?) which I understood to mean that ReLUs were replaced by LookUp Tables (LUTs) because ReLUs were very power hungry.
* Roadmap @ 7:30: https://brainchip.com/brainchip-technology-roadmap/ "activation function" in Akida 2
https://en.wikipedia.org/wiki/Activation_function
So given ReLU is power hungry & we replaced (as standard?) with LUT, then is it possible someone specifically wants ReLU availability, hence them still being listed in Git and docs?
The inner workings of Akida software are beyond my pay grade. However, the LUTs were introduced in Akida 2, probably as part of the 8 times efficiency improvement over Akida 1. If Akida 1 uses ReLU, this may be a tweak.
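For what it's worth, the appeal of a LUT is that any activation (ReLU, sigmoid, something learned) collapses into the same cheap operation on quantised values: index a small precomputed table instead of evaluating the function each time. A toy numpy illustration of the idea, nothing to do with Akida's actual implementation:

```python
# Toy numpy illustration of why a lookup table (LUT) can stand in for an
# activation like ReLU on quantized hardware. Not Akida's implementation -
# it just shows the idea: precompute the activation for every representable
# input level, then replace per-element function evaluation with a table index.
import numpy as np

def build_lut(activation, n_levels=256, lo=-8.0, hi=8.0):
    """Precompute activation(x) for every representable input level."""
    grid = np.linspace(lo, hi, n_levels)
    return activation(grid).astype(np.float32)

relu_lut = build_lut(lambda x: np.maximum(x, 0.0))

def lut_activation(x, lut, lo=-8.0, hi=8.0):
    """Apply an activation by indexing the LUT instead of computing it."""
    idx = np.clip(((x - lo) / (hi - lo) * (len(lut) - 1)).round(),
                  0, len(lut) - 1).astype(np.int32)
    return lut[idx]

x = np.array([-3.0, -0.5, 0.0, 2.5], dtype=np.float32)
print(lut_activation(x, relu_lut))   # close to np.maximum(x, 0.0)
```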
Hi Doz, ReLU strikes back …
View attachment 87380
ReLU Strikes Back: Exploiting Activation Sparsity in Large Language...
Large Language Models (LLMs) with billions of parameters have drastically transformed AI applications. However, their demanding computation during inference has raised significant challenges for... (openreview.net)
World's first?
Don't believe it, we have a 3 year lead.
So given ReLU is power hungry & we replaced (as standard?) with LUT, then is it possible someone specifically wants ReLU availability, hence them still being listed in Git and docs?
This is not necessarily the correct explanation, but I imagine they would be better off with a tailor-made LUT, one-and-done.
I kept quiet about being scammed because I was embarrassed. But silence only made it worse. AptRecoup made it okay to speak up. They never judged me or minimized what happened. Instead, they gave me direction, support, and a voice. Their strength comes from experience—they’ve been through the same nightmare. That shared connection is what makes their work so powerful. If you’re feeling trapped by shame or doubt, I urge you to email s u p p o r t @ a p t r e c o u p . c o m . You don’t need to suffer in silence. They’re here, and they get it.
Another one on ignore, that's 2 today.
Whack-a-mole.