BRN Discussion Ongoing

Diogenese

Top 20
A little profile on Nikunj Kotecha.....our former Senior Solutions Architect until Nov 2023.......some little titbits in there on automotive and earables


But, apart from:

Akida,
TENNs,
AV models,
MetaTF,
benchmarking tools,
transformers,
Edge Impulse,
Edge Box,
IP licensing,
strategic partnerships,
in-cabin functionality,
model quantizing and fine-tuning,
workshop leadership,
application notes,

what else has Nikunj done for Brainchip?
 
Reactions: 30 users
Interesting article: did Nikunj just reveal a new partnership? Has Space Machines Company been mentioned before?

“He also contributed to BrainChip’s partnership with Space Machines Company where he assisted with model quantization and fine-tuning of the model architecture for improved efficiency in harsh environments.”

 
Reactions: 18 users
Think you'll find it is the Ant61 thing.

SC
 
Reactions: 5 users

TheDrooben

Pretty Pretty Pretty Pretty Good
Space Machines built the Optimus-1 spacecraft with ANT61 on board, which had Akida integrated into it.
 
Reactions: 3 users

TheDrooben

Pretty Pretty Pretty Pretty Good
Beat me to it @Space Cadet ....



Happy as Larry
 
Reactions: 4 users


7für7

Top 20
what else has Nikunj done for Brainchip?
HE TOOK SOMEONES JOOOOB

 
Reactions: 1 users

7für7

Top 20
Did someone post this already? Maybe I missed it! 🤔

 
Reactions: 4 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Our illustrious leader, Sean Hehir, participated in a video session with the Industrial Technology Research Institute (ITRI).

Intriguingly, an article published online about 16 hours ago describes how ITRI was at CES, showing off a new AI-driven trainer that uses image analysis and data inference to improve an athlete's performance.

I wonder if Sean got to polish up on his badminton skills? More importantly, I wonder if we're involved in this AI athlete training system in some way, shape or form?


AI taught me to be a (slightly) better badminton player at CES

The Industrial Technology Research Institute was in Las Vegas to show off a new AI-driven trainer that uses image analysis and data inference to improve an athlete's performance. The sport that served as our demonstration? Badminton.

US badminton Olympian Howard Shu plays with ITRI's AI Badminton Trainer at CES 2025. [Photo: ITRI]

By Chris Morris
I am not what you would call a finely tuned athletic machine. I am, if anything, an outdated lawnmower engine held together by duct tape and rust. So when I was offered the opportunity to let AI help make me a better athlete at CES, I jumped at the opportunity.
The offer came from the Industrial Technology Research Institute (ITRI), a non-profit that uses applied research to drive industrial development. They were showing off a new AI-driven trainer that uses image analysis and data inference to improve an athlete’s performance. The sport that was the focus of this particular demonstration? Badminton.
The upside, I thought, would be that no one at CES would be especially good at badminton—so it wouldn’t be as humbling as it would if the system was tracking, say, my golf swing. Then Howard Shu strolled by.
Shu is a member of the U. S. Olympic badminton team and has been playing the sport for 26 years. He’s tall, in remarkable shape, and knows how to make a shuttlecock do pretty much whatever he wants. He is, in other words, the antithesis of me. We’ll get back to him in a moment, though.
[Photo: ITRI]
To get a sense of my abilities, the training tool used a series of cameras to track my stance, swing, and other movements over five volleys. The data was fed into a Generative AI system, which instantly offered recommendations. At the same time, information like the speed of my volleys, the height with which they cleared the net, and where they landed on the court was captured and factored in as well.
I thought I was doing okay, honestly. I whacked the shuttlecock at 52 miles per hour, cleared the net by two feet, and made my invisible opponent chase it around the court. Then I walked over to see what the AI had to say.
“Ah, my friend!” it wrote. “It looks like we’ve got a bit of a situation here.”


Maybe I hadn’t done quite as well as I thought I had.
The AI (which called itself Bill) noted that I was standing too close to the shuttlecock, which limited my ability to reach for the shot. Also, I needed to work on my weight transfer and balance. My footwork was not exactly ideal either.
And while it had my attention, Bill noted that my grip on the racket “might not be ideal for controlling the shuttlecock effectively or generating power” with my shots. And my follow-through was “abrupt.”



Basically, the machine told me, I suck.
That’s when Shu took a turn. His speeds were closer to 80 mph—and he tightly grouped the shots. (He later told me he felt the system’s speed detection needed some calibrating as he normally hits faster than that.)
I gave the system one more try, with Shu suggesting I stand in a different spot on the court—and while my shots weren’t as powerful, they were much more tightly grouped. I won’t be threatening Shu’s spot on the Olympic team anytime soon, but I could be more of a beast at the local recreation center.
[Photo: ITRI]
This training tool is already used by the Taiwan Olympic badminton team, an ITRI representative told me. Shu said it was the first time he had had an opportunity to try it—adding that he expects a growing number of athletes will begin to incorporate AI into their training.
“It’s able to pick up things you’re not able to pick up with the naked eye,” he said. “I can tell you my smash is fast, but I’m not going to be able to tell you the exact speed. You’re able to dial in exact numbers and get data driven results. As high performing athletes, we’re always trying to find that 1% advantage.”
Bill, I should note, remained unimpressed with my performance on the court.
Shu might be looking for a 1% advantage. I’d settle for the AI being a bit less judgmental.

 
Reactions: 28 users
what else has Nikunj done for Brainchip?
Bit of a loss that one....would have been great if he'd stayed, imo.

Btw....I was looking at the M.2 update and curious on your thoughts, if you can oblige?

Do you think we have a stack of M.2s available from the first AKD1000 run, as well as the PCIe, and we just essentially haven't released it to market, but now is the time for added flexibility / requests from users / researchers?

Reason is I was just reading an old article (late 2020) on Anil's presso to an industry group, and the article includes the below image of our AKD1000 M.2 board, which implies we had it back in 2020?

[Image: Screen-Shot-2020-11-19-at-10.40.01-AM.png]



 
Reactions: 15 users

Diogenese

Top 20
Hi FMF,

That's a 20-node Akida, so maybe an FPGA test chip. Certainly not the Akida 1000.

https://www.eejournal.com/article/brainchip-debuts-neuromorphic-chip/
...
Brainchip’s test chip benchmarking results are impressive – claiming 1,100 fps with the $10 chip on CIFAR-10 with 82% accuracy, using less than 0.2 Watts – about 6K fps/watt. The company says this compares with 83% at 6K fps/watt from IBM’s “True North” (at a cost of around $1K) and 80% from Xilinx ZC709 at 6K fps/watt (also around $1K).
...
Brainchip’s development environment is available in Q3 2018, and should be accessible to an FPGA-based acceleration board in advance of availability of the Akida chip in 2019.

I'm constantly amazed at how BRN keep producing Akida 1000s out of their hat. I don't know where the Unigen, BH, VVDN, etc., chips will come from.

Pretty sure the December 2020 date was overoptimistic.
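As a quick sanity check on the article's numbers (taking the quoted 1,100 fps and the 0.2 W upper bound at face value):

```python
# Sanity check on the quoted Akida test-chip benchmark:
# ~1,100 fps at under 0.2 W should land near "about 6K fps/watt".
fps = 1100
watts = 0.2  # upper bound quoted in the article

efficiency = fps / watts  # frames per second per watt
print(f"{efficiency:.0f} fps/W")  # 5500 fps/W, which the article rounds to "about 6K"
```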
 
Last edited:
Reactions: 14 users

manny100

Top 20
Think of a timeline....12 months, 24 months, 36 months. With the way life and all things AI seem to be continually accelerating, where would you like to be positioning yourself: at the Data Centre or at the Edge?

Now, currently there's a place for both approaches, but one is based on an ageing architecture. Let's think of bottlenecks, let's think of power consumption, let's think of critical, life-changing scenarios where answers are needed in "real time", where latency becomes a major issue, or internet outages become a major issue, or bandwidth becomes a major issue....the companies that talk about the Edge, are they "really" treating the Edge (the future) with the respect it deserves?

We are clearly, in my opinion, so far out in front (maybe an exaggeration) that when we have Senior Executives like Steve Brightfield, who knows this industry backwards, saying he feels that within 5 years we, Brainchip, will be in every product, well, one has to stop and think: is that the salesman/marketer speaking, or is he really saying, listen up Nvidia and Co, we are the company that is going to be lighting up the future with all our connected devices, all our intelligence and continuous learning on device? And if there is going to be a future conduit to this massive technology revolution, well, we at Brainchip currently hold the "Key" (singular) to the Kingdom...come and let's talk; the only condition we ask is that you leave your egos at the reception area, thanks.

Go AKIDA...we all love your brilliance!

Tech.
Agree TECH, it's taken a lot of time and patience, but we are closing on deals and the Edge is getting closer to widespread adoption.
Ironically, just as we and the Edge are closing in on success, cloud-based NVIDIA is making record highs.
It just shows that big changes in IT move at a tortoise's pace. This is actually good for us: when we are 'flying', new tech will have to wait like we did while we are making record highs.
NVIDIA has 25 billion shares on issue and we have 2 billion. If NVIDIA had only 2 billion SOI it would be trading at circa $US1,735. Puts our LDA plans into perspective.
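For anyone who wants to check the arithmetic behind that figure, here's the sketch; note the ~$139 NVIDIA share price is my assumption, back-solved from the quoted ~$US1735, not a number from the post:

```python
# Back-of-envelope: spread NVIDIA's market cap over 2 billion shares
# instead of 25 billion and see what the per-share price becomes.
nvidia_price = 139.0   # assumed USD share price at the time (hypothetical input)
nvidia_shares = 25e9   # shares on issue, per the post
brn_shares = 2e9       # BRN shares on issue, per the post

adjusted_price = nvidia_price * (nvidia_shares / brn_shares)
print(f"${adjusted_price:,.0f}")  # about $1,738, close to the quoted ~$US1735
```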
 
Reactions: 20 users
Thanks for that.

20 nodes hey....wonder what the writer would think now of 80 haha
 
Reactions: 4 users

manny100

Top 20
Unfortunately all chips look the same to me.😁
 
Reactions: 3 users

7für7

Top 20
If I think about how Bill Eichen explained how easy it is to implement Akida into existing systems using their platform… (short version) I don't think we will need to wait too long any more… just my opinion (fingers crossed)
 
Reactions: 14 users

Diogenese

Top 20
Unfortunately all chips look the same to me.😁
Yes. I recall Sean had something to say about the naming convention - or maybe they just had a whole lot of Akida 1000 caps left over.
 
Reactions: 3 users

FuzM

Member
Seems like an old podcast from March 2, 2022.
Reactions: 3 users

7für7

Top 20
Seems like an old podcast from March 2, 2022.
I haven’t listened to it yet… but it would make sense if it’s older. I was wondering anyway why Rob would talk about BrainChip in a podcast… I’m just wondering why the date is current. Thanks for the correction.
 
Reactions: 1 users
Chetan Kadway’s Post

Chetan Kadway
Data Scientist | ML Researcher | Computer Vision | SpaceTech | Automotive | Edge AI | Neuromorphic Computing | Azure ML | ML-Ops

8mo Edited

Last week, I had the incredible opportunity to visit the Netherlands (with my colleague Sounak Dey) on a business trip that turned out to be much more than I expected.

The primary purpose of the visit was to present our research work (as invited speakers) at the Morpheus Edge AI & Neuromorphic Computing workshop (https://lnkd.in/g-z-dAxJ) organised by Laurent Hili at ESA/ESTEC (European Space Research & Technology Centre). It was a well-rounded workshop where we got to meet SpaceTech stakeholders that are innovating towards embedding intelligence onboard satellites (startups, SoC manufacturers, European university researchers, ESA project leaders, etc.). Excellent work by the workshop organiser and the presenting participants.

Unexpectedly, I also met Sir Ananth Krishnan (Tata Consultancy Services Ex-CTO) on his post-retirement holiday. The first thing that came to my mind was his retirement address at our IIT-KGP Research Park office: he motivated us to make products & services that are 1] Usable (should pass the Grandma test) 2] Trustworthy/Reliable 3] Frugal (space & time resource efficient, think of bits & bytes). His advice still guides us in our research work.

The research work we presented was done in collaboration with our friends at BrainChip (Gilles Bézard and Alf Kuchenbuch). A brief introduction to our work, which was deployed on the BrainChip AKIDA neuromorphic processor:

Currently, there is a delay of many hours or even days in drawing actionable insights from satellite imagery (due to the mismatch between the data volume acquired and limited comms bandwidth). We observed that end-users need either RAW images or analytics-ready meta-data as soon as possible. Therefore, embedding intelligence (Edge AI) onboard satellites can result in quicker business decision-making across business verticals that rely on geo-spatial data.

To address this, guided by the foresight of Dr. Arpan Pal, we built a bundle of tech capabilities that helps send RAW data & analytics-ready meta-data to the ground station as soon as possible. These capabilities include:
1) Cloud Cover Detection Model (high-accuracy, low-latency, low-power).
2) DL based Lossless Compression (around 45% compression ratio).
3) RL based Neural Architecture Search Algorithm (quickly search data+task+hardware specific optimal DL models).
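A rough sketch of how capabilities like these compound into downlink savings; apart from the "around 45% compression ratio" quoted above (read here as a 45% size reduction), every number below is an illustrative assumption, not a figure from the post:

```python
# Illustrative downlink estimate for onboard Edge AI on a satellite.
# Assumed inputs: raw data volume per pass and the fraction of scenes
# that onboard cloud-cover detection can discard before transmission.
raw_captured_gb = 100.0    # raw imagery captured per pass (assumed)
cloudy_fraction = 0.6      # scenes dropped by cloud-cover detection (assumed)
compression_saving = 0.45  # lossless compression, per the post

useful_gb = raw_captured_gb * (1 - cloudy_fraction)      # after cloud filtering
downlinked_gb = useful_gb * (1 - compression_saving)     # after compression
print(f"{downlinked_gb:.0f} GB downlinked instead of {raw_captured_gb:.0f} GB")
```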

We also had a chance to visit TCS Paceport in Amsterdam, hoping to showcase our research prototype there soon. Looking forward to more future collaborations with Edge AI/Neuromorphic hardware accelerator designers & space-grade SoC manufacturers.

Would like to thank Tata Consultancy Services - Research for such great opportunity to build future tech for future demand. Would also like to thank our Edge & Neuromorphic team: Arijit Mukherjee, Sounak Dey, Swarnava Dey, Abhishek Roy Choudhury, Shalini Mukhopadhyay, Syed Mujibul Islam, Sayan Kahali and our academic research advisor Dr. Manan Suri.

#spacetech #satellite #edgecomputing #orbital #AI #neuromorphic #SNN #AKIDA #TCS

https://www.linkedin.com/posts/chet...gecomputing-activity-7195083124312088577-SO7P

SC
 
Reactions: 56 users