BRN Discussion Ongoing

HopalongPetrovski

I'm Spartacus!
“Neuromorphic compute like Akida saves on scarce resources like power while extending what can be done in line with Symphony's capabilities.
On the IBM question, I can't speak to IBM's product plans. The market typically drives that conversation.”

Why can’t people just commit and call it what it is?
If he says “like Akida,” but the thing he’s clearly convinced by (and the results he keeps pointing to) is Akida, then… why not say Akida?

Especially because, realistically, there isn’t a true “Akida-equivalent” out there in the same sense. So when someone writes “like Akida,” it raises a fair question:

What other systems does he actually mean?
What has he achieved similar results with, if not Akida?
Is this just hedging language, and if so… why?

No hate at all …. I’m genuinely trying to understand the wording. If the claim is basically “right now only Akida can do this (at least in his experience),” then saying “like Akida” sounds unnecessarily vague.
So yeah: I’m just asking … what are people afraid of when they use “like Akida” instead of simply saying “Akida”?

Hi Fur.
I think this guy has been pretty clear and emphatic in both his naming of Akida and his endorsement.
Why would you be wanting to cast any doubt on the matter?
 

manny100

Top 20
Hi Fur.
I think this guy has been pretty clear and emphatic in both his naming of Akida and his endorsement.
Why would you be wanting to cast any doubt on the matter?
Agree, Kevin D. Johnson is pumping up his own tyres for being early on neuromorphic AKIDA.
As Sean used to say, no one wants to be the first to take the plunge.
You need 'cods' the size of watermelons to be early with never-before-seen tech.
He may be in demand when big players get on board with AKIDA.
Once we get a couple on board, the only danger is getting crushed in the rush.
He is doing a great marketing job for BrainChip.
He makes it sound like plug and play.
Keep it up Kev.
 

7für7

Top 20
Hi Fur.
I think this guy has been pretty clear and emphatic in both his naming of Akida and his endorsement.
Why would you be wanting to cast any doubt on the matter?

It’s a general disappointment at the moment… and the problem is… you can be clear the whole time and say how stunning something is… the only thing that counts is the last words you say…

Customer: “Oh wow, this Lamborghini is so fast… the speed is just incredible… unbelievable vehicle… I would buy a car like a Lambo”

Dealer: “You mean… you would buy a Lamborghini?”

Customer: “No no… I mean a car like a Lamborghini… thanks for the test drive”
 

HopalongPetrovski

I'm Spartacus!
It’s a general disappointment at the moment… and the problem is… you can be clear the whole time and say how stunning something is… the only thing that counts is the last words you say…

Customer: “Oh wow, this Lamborghini is so fast… the speed is just incredible… unbelievable vehicle… I would buy a car like a Lambo”

Dealer: “You mean… you would buy a Lamborghini?”

Customer: “No no… I mean a car like a Lamborghini… thanks for the test drive”
Maybe you're just bored.
I think you're trying to make something out of nothing. 🤣
 
Gearing up for the Raytheon AVC Challenge :)




Xplorer07/Bison_Vision (Public)

Autonomous UAV-UGV Collaboration System. A distributed ROS 2 architecture for "Operation Touchdown" (Raytheon AVC 2026). Features GPS-denied navigation (VIO), neuromorphic edge AI (BrainChip Akida) for obstacle mapping, and precision autonomous landing on a moving ground vehicle.

Bison Vision Training

A clean, professional YOLOv2 training pipeline targeting BrainChip Akida1000 neuromorphic processors, containerized with Docker for NVIDIA GPU acceleration.

Purpose

This repository provides a complete training pipeline for YOLOv2 object detection models optimized for deployment on BrainChip Akida1000 hardware. The pipeline includes training, evaluation, and conversion to Akida-compatible formats, all within a reproducible Docker environment.

Training Pipeline

Dataset → Training → Evaluation → Akida Conversion


  1. Dataset Preparation: Prepare your dataset in the expected format (see docs/training_guide.md)
  2. Training: Train YOLOv2 model on NVIDIA GPUs with configurable hyperparameters
  3. Evaluation: Validate model performance on test sets
  4. Akida Conversion: Convert the trained model to an Akida-compatible format (.fbz) for deployment (see the sketch below)
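
For a concrete sense of the final step, here is a minimal conversion sketch, assuming BrainChip's MetaTF cnn2snn API; quantized_model is a hypothetical placeholder for an already-quantized Keras model, not a name defined in this repository.

# Minimal conversion sketch (assumes MetaTF's cnn2snn converter;
# `quantized_model` is a hypothetical already-quantized Keras model)
from cnn2snn import convert

akida_model = convert(quantized_model)  # map the network to Akida layers
akida_model.summary()                   # inspect the resulting model
akida_model.save("/experiments/yolov2_akida.fbz")  # write the .fbz artifact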

Repository Structure

bison-vision-training/
├── README.md                   # This file
├── .gitignore                  # Git ignore rules
├── Dockerfile                  # Docker environment definition
├── scripts/
│   └── train_yolov2_akida.py   # Main training + evaluation + conversion script
├── configs/
│   └── yolov2_akida.yaml       # Training configuration (classes, hyperparameters)
├── docs/
│   └── training_guide.md       # Detailed training documentation
└── docker/
    └── README.md               # Docker build and usage instructions


Expected Local Directories

This repository expects the following directories to exist on your local machine (outside the repository):

  • ~/datasets/: Training and validation datasets
  • ~/experiments/: Training outputs, checkpoints, logs, and converted models
These directories will be mounted into the Docker container during training.

Quick Start

Prerequisites

  • Docker installed on your system
  • NVIDIA GPU with CUDA support
  • NVIDIA Container Toolkit installed (installation guide)
  • Dataset in COCO format (assumed to be in ~/datasets but you can adjust as needed)
  • Directory on host for training output and converted models (assumed to be ~/experiments but you can adjust as needed)

Load Docker Image

# Load the training image
docker load -i yolov2-training.tar

Run Container

# Run with GPU support
docker run --gpus all -it \
-v ~/datasets:/datasets \
-v ~/experiments:/experiments \
-v $(pwd):/workspace \
yolov2-training:latest bash

# Inside container, verify setup
python -c "import torch; print(f'PyTorch: {torch.__version__}')"
python -c "import torch; print(f'CUDA available: {torch.cuda.is_available()}')"

Run Training

Once inside the container, run training with your dataset:

# Basic usage - provide your dataset directory
python scripts/train_yolov2_akida.py --data-dir /datasets/your_dataset

# With absolute path
python scripts/train_yolov2_akida.py --data-dir /home/user/datasets/coco

# With relative path (if mounted differently)
python scripts/train_yolov2_akida.py --data-dir ../my_dataset

All data and outputs must be stored in external directories (e.g., ~/datasets, ~/experiments) and mounted into the container at runtime.

Documentation

  • Training Guide: Detailed explanation of dataset format, training process, evaluation metrics, and Akida conversion
  • Docker Guide: Instructions for building, loading, and running the Docker container

Hardware Requirements

  • Training: NVIDIA GPU with CUDA support (recommended: 8GB+ VRAM)
  • Deployment: BrainChip Akida1000 neuromorphic processor (see the runtime sketch below)
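
To illustrate the deployment side, a minimal runtime sketch follows, assuming BrainChip's akida Python package; the model path and input shape are hypothetical placeholders.

# Minimal inference sketch (assumes BrainChip's `akida` runtime;
# model path and input shape are hypothetical placeholders)
import akida
import numpy as np

devices = akida.devices()             # enumerate attached Akida hardware
model = akida.Model("/experiments/yolov2_akida.fbz")
model.map(devices[0])                 # map the model onto the first device
frames = np.zeros((1, 224, 224, 3), dtype=np.uint8)  # one dummy uint8 frame
outputs = model.forward(frames)       # raw output potentials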
 

jrp173

Regular
Hi Dio, I remember Tony Lewis said that with Gen 2 a bright student could invent something at home.
Kevin D. Johnson of IBM bought an AKIDA 1000 M.2.
It's interesting that he can apparently just plug into Symphony on a Monday and off it goes.
I am aware he is practically an AI genius, but I wonder whether he would have 'quietly' needed some prior advice from BRN?
The way it's come out in his LinkedIn post makes it sound so easy and is great marketing for both IBM and BrainChip.
Especially great marketing for BrainChip, as it cannot be seen as the company ramping.
I think that IBM will be very good for BrainChip deal-wise.
Why do you throw the old "ramping" bs out there?

It is purely an excuse from BrainChip for their appalling IR.

The old ramping excuse is getting long in the tooth!
 

Xray1

Regular
I think the share price fairly represents where we are in terms of revenue. If I were completely honest, given the dilution with the CR and LDA, it may even be construed as a little high.
Now the other part of your question - do I like it? Absolutely not, but I have clearly expressed where I hope it will be in the next two years, free from emotion, which I am in the lucky situation to be in. I understand many, perhaps most are not at the buy-in levels that I have. If it does drop further, which I sincerely hope it does not, I will buy more, based upon my earlier analysis.
I guess the bit that everybody focuses on is: does the share price suggest we should call for the removal of the CEO and the board? In my opinion, absolutely not. They cannot control the share price directly. What matters now is revenue, and relatively substantial revenue, in the tens of millions to start with. That will move the SP. As smoothsailing stated, a consumer customer or a substantial IP or chip contract will give us a significant re-rate. That is what the CEO, board etc. are and need to be focusing on, as well as maintaining our lead and opening up new markets through technological advancement. It's really a fairly simple equation.
Thanks for your question - loaded though it may have been.
Many thanks for your return here and for your open personal/informational posts about the current status of, and concerns surrounding, BRN.

On a side note:

I personally, however, as a shareholder, strongly believe that what we really need in due course is for Peter van der Made to personally attend the next AGM and address shareholders on how he perceives where things are at with his technology and the Co's management as a whole, as a means of lowering shareholder tensions and hopefully giving some informative validation of where things are actually heading. I think such a move on his behalf would have a far greater impact on the AGM and shareholder IR/PR relations than anything the CEO and Chairman could effectively communicate, if previous AGMs are anything to go by.
 

Bravo

Meow Meow 🐾
“Neuromorphic compute like Akida saves on scarce resources like power while extending what can be done in line with Symphony's capabilities.
On the IBM question, I can't speak to IBM's product plans. The market typically drives that conversation.”

Why can’t people just commit and call it what it is?
If he says “like Akida,” but the thing he’s clearly convinced by (and the results he keeps pointing to) is Akida, then… why not say Akida?

Especially because, realistically, there isn’t a true “Akida-equivalent” out there in the same sense. So when someone writes “like Akida,” it raises a fair question:

What other systems does he actually mean?
What has he achieved similar results with, if not Akida?
Is this just hedging language, and if so… why?

No hate at all …. I’m genuinely trying to understand the wording. If the claim is basically “right now only Akida can do this (at least in his experience),” then saying “like Akida” sounds unnecessarily vague.
So yeah: I’m just asking … what are people afraid of when they use “like Akida” instead of simply saying “Akida”?



Sounds like Kevin is being very careful not to imply certain things on IBM's behalf, which is understandable. If he says “only Akida” it shifts the conversation from architecture to vendor endorsement, which he seems to be deliberately avoiding.

Having said that, he’s very explicit and forthcoming about Akida’s performance and power numbers, and he repeatedly names Akida as the tool that proved right for this workload.

What strikes me most is what Kevin has to say about IBM's plans in terms of potential productisation. When he says “On the IBM question, I can't speak to IBM's product plans. The market typically drives that conversation”, I think he's making it pretty clear that IBM won’t commit to product plans until customers start demanding it.

And that’s where it seems like a bit of a stalemate to me. “Stalemate” is probably not quite the right word; rather, it highlights a real gap that now needs to be bridged.

BrainChip presumably can’t rely on a Principal Technical Specialist from IBM to sell its technology. Equally, IBM isn’t going to productise or promote it unless customers are already asking for it. And customers generally won’t ask for something they don’t clearly understand, can’t easily replicate, or haven’t seen deployed commercially elsewhere.

So, maybe the next step is for BrainChip to develop this into something repeatable and commercially legible under its own steam: document a clear reference architecture that others can reproduce, then start demoing and promoting it so customers know it exists, which in time might lead to paying customers and eventual revenue.

In my view, Kevin has already done BrainChip a significant favour by validating a real, enterprise-relevant use case inside a credible production environment, backed by impressive numbers.

From here, the outcome may depend far more on BrainChip’s follow-through than on IBM’s intent.
 

manny100

Top 20
Why do you throw the old "ramping" bs out there?

It is purely an excuse from BrainChip for their appalling IR.

The old ramping excuse is getting long in the tooth!
Posters complain no end about the lack of communication. If you bother reading the IR release, you will be clear on what is regarded as ramping. BRN cannot discuss client details that have not already been released to the public.
For example, Sean can say we have engagements underway but cannot provide more detail than that, e.g. mentioning names or business types is not acceptable.
It's quite clear that Kevin D. Johnson of IBM has been 'talking up' AKIDA for a number of days, which is effectively third-party ramping and is allowed under ASX guidelines. He is ramping himself, IBM and AKIDA, and that is good.
 

manny100

Top 20
Sounds like Kevin is being very careful not to imply certain things on IBM's behalf, which is understandable. If he says “only Akida” it shifts the conversation from architecture to vendor endorsement, which he seems to be deliberately avoiding.

Having said that, he’s very explicit and forthcoming about Akida’s performance and power numbers, and he repeatedly names Akida as the tool that proved right for this workload.

What strikes me most is what Kevin has to say about IBM's plans in terms of potential productisation. When he says “On the IBM question, I can't speak to IBM's product plans. The market typically drives that conversation”, I think he's making it pretty clear that IBM won’t commit to product plans until customers start demanding it.

And that’s where it seems like a bit of a stalemate to me. “Stalemate” is probably not quite the right word; rather, it highlights a real gap that now needs to be bridged.

BrainChip presumably can’t rely on a Principal Technical Specialist from IBM to sell its technology. Equally, IBM isn’t going to productise or promote it unless customers are already asking for it. And customers generally won’t ask for something they don’t clearly understand, can’t easily replicate, or haven’t seen deployed commercially elsewhere.

So, maybe the next step is for BrainChip to develop this into something repeatable and commercially legible under its own steam: document a clear reference architecture that others can reproduce, then start demoing and promoting it so customers know it exists, which in time might lead to paying customers and eventual revenue.

In my view, Kevin has already done BrainChip a significant favour by validating a real, enterprise-relevant use case inside a credible production environment, backed by impressive numbers.

From here, the outcome may depend far more on BrainChip’s follow-through than on IBM’s intent.
IBM sales and marketing will likely try to push to clients the improved performance, power savings and ROI that Kevin talks about.
Kevin makes it sound like plug and play.
Package delivered on Friday, plugged in on Monday.
I suspect it may not be that easy and that he may have had some support from BRN.
 