BRN Discussion Ongoing

I feel very sorry for those who put BRN into their SMSF ....... :( :(
Yeah, I'm one of them. To do so, and then for Antonio to say we're going to the American market, would have cost me plenty, as my super fund doesn't deal with other markets.
 
I don’t know if it’s your first stock, but my experience is, if the share price is at that level and there is no turnover, you can scream as much as you want, virtually or live at the AGM… they couldn’t care less. I was invested in one company that was one of the biggest in its industry, if not the biggest. It went bankrupt because of bad management and fake numbers. People were crying at the AGM and talking with shaking voices. In the end a lot of people lost big money, some even their house and savings.

Former CEO committed suicide ….

Just saying … we are just puppets
Very good post
 

Rach2512

Regular
Interesting to see that Kevin reposted the SpaceX post.

Do you think there will come a point where companies will boast that Akida's inside?





[Screenshots attached]
 
Last edited:
  • Love
  • Like
  • Fire
Reactions: 11 users
Quick question for those on LinkedIn (as I am not): have any Brainchip staff commented on or liked any of the recent postings by Kevin D. Johnson?
💯 % they have and it's all fantastic on the low low.
 
  • Love
  • Like
Reactions: 2 users

Mt09

Regular
[Image attached]
  • Fire
  • Like
  • Love
Reactions: 7 users
💯 % they have and it's all fantastic on the low low.
Thank you SS and MT. Hopefully the low-key response from the company account, rather than individual employees, is a sign of things to come. Perhaps, given it's IBM and what’s potentially at stake, there's a directive to all staff for radio silence? Fingers crossed hey, god knows we need something to stick.
 
  • Like
Reactions: 3 users
New Tata patent just published.

Akida mentioned throughout when discussing their SNN invention.

Patent HERE


[Image attached]
 
  • Like
  • Fire
  • Love
Reactions: 21 users
Thank you SS and MT. Hopefully the low-key response from the company account, rather than individual employees, is a sign of things to come. Perhaps, given it's IBM and what’s potentially at stake, there's a directive to all staff for radio silence? Fingers crossed hey, god knows we need something to stick.
I hear you. Hang in there buttercup, we'll get there, deep breaths.
 
Last edited:
  • Like
Reactions: 3 users

manny100

Top 20
From the recent Steve Brightfield interview. His predictions:
"But do your question. I think we're trying to ride the neuromorphic computing and brain chip in particular is trying to ride the coattails of the overall market moving to the edge. And when we look at market research reports from companies, they're saying about 10% of these edge products embedded devices are running some AI software on them. But within the next four years, four to five years, 30 to 35% of those products will have AI on. And I think if we look out, the next five years, 90% of them will have it all embedded in it. And there will be a neuromorphic computing in probably half of those devices. "
My bold above.
 
  • Like
  • Fire
Reactions: 8 users

manny100

Top 20
Steve Brightfield provided an insight into what Brainchip is doing with earbuds/hearing aids.
" We're using neuromorphics for the input signaling because of the advantages of this far state, but when we get to some of the large language models, we don't use the neuromorphic algorithms, we use the state space models. And we're looking at combining those two into a single, you know, platform so that you get the enhancement of the signal going into the speech recognition, the speech recognition with the LM can predict what next word is being said and improve the accuracy of the recognition. And then you can have a local, large language model in your earpiece that could have maybe a very limited set of information in it, but it's what you need, right? One of the interesting use cases is a memory LLM for an old person. And you know, it would say, "Oh, your granddaughter's name is Shelley and she's four years old." And so that when you can, if you don't forget this stuff, boom, you have it, right? And you just need to some cues some time to get your memory back. And this is one of those interesting things that we're like the Nationalist to Health says, "This could be really good because that helps solve a lot of these issues with, if you can't hear well, you start having dementia and you start having, you know, problems, cognition problems, right? It's very important to have, you know, hearing insight to keep yourself, your brain healthy because your brain, that's what it's doing. It's constantly processing those signals. "
 
  • Like
  • Fire
Reactions: 2 users

manny100

Top 20
A little bit more from Steve Brightfield.
Radar gains new capabilities in recognising objects (not possible with traditional radar), this is transferable to robotics, and Brainchip is in discussions with robotics companies.
He introduces the concept of BrainTags (we have all heard of AirTags). I did not include the detail of that in the quote below as it was starting to get too long. Basically, the BrainTag is an AirTag with a tiny AI brain inside: it can listen, recognise sounds and understand events using an ultra‑low‑power neuromorphic chip, all without sending audio to the cloud. See the sketch after the quote for the detect-versus-classify idea.

" What's around the corner or over the horizon? One of the things that was interesting is we got a contract with Air Force research libraries to work on radar using these algorithms, right? And the results actually surprised us and they surprised the contracting agency and now we're expanding that. And we think that we can, you know, add capabilities to radar that weren't there before. Like, for example, radar can detect things, right? But it can't tell you what it is. Well, we can classify objects now with radar in addition to detecting them. We can improve the tracking and the latency of these radars. But we can also make them a lot smaller, right? So it's that size weight and power. Can I put a radar in a robot? So when it's hand has got a radar signal in it and it can basically navigate, you can paint the scene without a camera. You can use it like a camera to paint the scene and recognize and grasp things that a drone. You can fly it inside tunnels or buildings indoors. You can map out where you're going. We see this shrinking of the conventional radar technologies to really go into anything moving because it's all whether it works in the dark. And if it can replicate some of the things in vision, then, you know, you don't have to worry about rain and fog and all the issues that visual, you know, control of robots. Yeah. And are you working with robotic companies or is this still in the research room? It's still in the research. We're working with companies that are creating components or solutions that go to the robotics companies. We are in active conversations with robotic companies today. And they're in evaluation of this, right? But what we decided was to create reference platforms that demonstrate these more holy rather than having a, you know, here's the algorithm go figured out. We'll build a little prototype. So we're doing reference designs and radar. We're also going to do this in these wearables. And this is a quite interesting approach. You know, the air tag, right? Yep. What about having a brain tag? It's a smart air tag, right? It's got inferencing on it."
 
Last edited:
  • Like
  • Fire
Reactions: 4 users

manny100

Top 20
You can gather from the above posts that Steve Brightfield as CMO gets paid to spread the 'behind the scenes' news.
Sean as CEO in contrast will not disclose specifics of anything that has not already been publicly released.
 
  • Fire
Reactions: 2 users

Diogenese

Top 20
Steve Brightfield provided an insight into what Brainchip is doing with earbuds/hearing aids.
" We're using neuromorphics for the input signaling because of the advantages of this far state, but when we get to some of the large language models, we don't use the neuromorphic algorithms, we use the state space models. And we're looking at combining those two into a single, you know, platform so that you get the enhancement of the signal going into the speech recognition, the speech recognition with the LM can predict what next word is being said and improve the accuracy of the recognition. And then you can have a local, large language model in your earpiece that could have maybe a very limited set of information in it, but it's what you need, right? One of the interesting use cases is a memory LLM for an old person. And you know, it would say, "Oh, your granddaughter's name is Shelley and she's four years old." And so that when you can, if you don't forget this stuff, boom, you have it, right? And you just need to some cues some time to get your memory back. And this is one of those interesting things that we're like the Nationalist to Health says, "This could be really good because that helps solve a lot of these issues with, if you can't hear well, you start having dementia and you start having, you know, problems, cognition problems, right? It's very important to have, you know, hearing insight to keep yourself, your brain healthy because your brain, that's what it's doing. It's constantly processing those signals. "
Hi Manny,

Some months ago Tony Lewis made an offhand remark which indicated that the TENNs state space model was different from the Akida NN, and here is a clearer statement from Steve:

"We're using neuromorphics for the input signaling because of the advantages of this far state, but when we get to some of the large language models, we don't use the neuromorphic algorithms, we use the state space models. And we're looking at combining those two into a single, you know, platform so that you get the enhancement of the signal going into the speech recognition, the speech recognition with the LM can predict what next word is being said and improve the accuracy of the recognition."

To me, this clarifies that we have two different types of AI processors, and this is reflected in our projected product range: Akida 1, 1500 and 2 are the original SNN architecture, while Akida 2 and GenAI implement the TENNs SSM. According to Steve, BRN are developing a hybrid SNN/SSM device which takes advantage of the capabilities of both: the SNN for speech recognition and the SSM for speech processing.

No doubt there is a patent application in the offing.
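
For what it's worth, a minimal sketch of that two-stage idea, with everything invented for illustration and nothing taken from the actual Akida or TENNs implementations: an event-driven front end that only emits when the signal changes, feeding a simple linear state space recurrence x_t = A x_{t-1} + B u_t with readout y_t = C x_t.

Code:
import numpy as np

def spiking_frontend(signal, threshold=0.5):
    """Event-driven front end: emit an event only when the input changes enough.
    A steady or silent input produces no events, which is where the power saving comes from."""
    events = np.zeros_like(signal)
    last = 0.0
    for i, x in enumerate(signal):
        if abs(x - last) > threshold:
            events[i] = np.sign(x - last)
            last = x
    return events

def ssm_step(x, u, A, B):
    """One step of a linear state space recurrence: x_t = A x_{t-1} + B u_t."""
    return A @ x + B * u

rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.1 * rng.standard_normal(200)

# Stage 1: turn the raw waveform into a sparse event stream.
events = spiking_frontend(signal)

# Stage 2: run the event stream through a toy 2-state SSM and read out y_t = C x_t.
A = np.array([[0.9, 0.1], [0.0, 0.8]])  # state transition (made-up values)
B = np.array([1.0, 0.5])                # input projection
C = np.array([0.7, 0.3])                # readout
x = np.zeros(2)
outputs = []
for u in events:
    x = ssm_step(x, u, A, B)
    outputs.append(float(C @ x))

print(f"{int(np.count_nonzero(events))} events from {len(signal)} samples; "
      f"last SSM output {outputs[-1]:.3f}")

The only point of the sketch is the division of labour: the front end does cheap, sparse signal conditioning, and the state space model does the sequence processing on whatever the front end passes through.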
 
  • Like
  • Fire
Reactions: 2 users

White Horse

Regular
If the share price is at 14c at time of AGM the red headed rat rooter better have some bodyguards and a same day return trip to the silicon valley
You and quite a few others just don't get it.
Sean does not decide policy, he just implements what the board dictates.
Now Antonio, that's another question.
 
  • Like
Reactions: 1 users

AusEire

Founding Member.
If the share price is at 14c at time of AGM the red headed rat rooter better have some bodyguards and a same day return trip to the silicon valley
Is this a threat?
 
  • Thinking
  • Sad
  • Wow
Reactions: 3 users

Rach2512

Regular
?


 

The Pope

Regular
Interesting what Rudy has been doing after one year at Nvidia, and that Steven Thorne made a comment.

 
  • Fire
Reactions: 1 users