BRN Discussion Ongoing

Worth a listen imo. Not sure if previously posted, but it's a podcast from a couple of months ago talking to Tony, Jon, Rudy, Kris & Olivier.

A lot on TENNs, how it works and how it performs, plus some other snippets.



 

equanimous

Norse clairvoyant shapeshifter goddess
When are people's predictions for when we might see share price back up at the previous highs?

For this question it's best to ask Ra.

In 2026, the stars will weave a celestial tapestry above the Great Pyramids of Giza, aligning with their ancient geometry during the equinoxes on March 20 and September 22, when day and night balance perfectly. As the sun rises, kissing the pyramids’ edges, Orion’s Belt—Alnitak, Alnilam, and Mintaka—will hover in alignment with the three pyramids, mirroring their layout as seen by the ancients. Sirius, the brightest star, may trace a path echoing the pyramids’ southern shafts, while the equinox sun casts precise shadows, binding earth and cosmos in a timeless celestial embrace.
 
And before people say it's impossible to know and so on, I realise that. I am just wondering what people's expectations are? I seriously think it could be there in 2 or 3 years given the sector.


1747935546891.gif
 
And before people say it's impossible to know and so on, I realise that. I am just wondering what people's expectations are? I seriously think it could be there in 2 or 3 years given the sector.
In for the long haul and a US market listing.
 

DK6161

Regular
And before people say it's impossible to know and so on, I realise that. I am just wondering what people's expectations are? I seriously think it could be there in 2 or 3 years given the sector.
It is imminent. Watch the financials
 

Rach2512

Regular

Nice to see BrainChip next door to our new partner Andes. Check out the last photo.


Screenshot_20250523_070134_Samsung Internet.jpg
Screenshot_20250523_065635_Samsung Internet.jpg
Screenshot_20250523_065622_Samsung Internet.jpg
 
It is imminent. Watch the financials
In for the long haul and a US market listing.
And before people say it's impossible to know and so on, I realise that. I am just wondering what people's expectations are? I seriously think it could be there in 2 or 3 years given the sector.
Hands up if you think these 3 are over here from hot crapper to put all the negative posts up.

1747960981838.gif
 

7für7

Top 20

Bravo

If ARM was an arm, BRN would be its biceps💪!
fingers-crossed-jenny-hagel.gif


I am seriously beginning to wonder if this new AI device being developed by Sam Altman and Jony Ive could operate as an edge device, potentially incorporating neuromorphic processors like our very own Akida.

The upcoming AI device, expected to launch in 2026, is envisioned as a compact, screenless accessory designed to be unobtrusive and context-aware. It's intended to function alongside existing devices like smartphones and laptops, rather than replacing them. This suggests a focus on local processing capabilities. By processing data locally, the device could offer faster responses, enhanced privacy, and reduced reliance on cloud connectivity.

Have a listen to the video, which I set to play from where Sam Altman explains how we currently interact with ChatGPT; he says something along the lines that the new device "deserves something much better" than having to wait for it to go up to the cloud like traditional devices.




As we all know, Akida's architecture allows for on-device learning and real-time data processing, making it ideal for applications requiring immediate responses without cloud dependency.

The specific technical details about the device are still under wraps, but the emphasis on unobtrusive design, real-time context awareness, and potential for local processing sounds like it aligns very well with the capabilities of BrainChip's Akida IMO.

Maybe just wishful thinking on my behalf. Time will tell.



Screenshot 2025-05-23 at 10.50.08 am.png

 
Hands up if you think these 3 are over here from hot crapper to put all the negative posts up.

View attachment 85073
Nah mate, I'm being serious.
We never have announcements.
We're going to list in America and delist from the ASX.
We have to watch the financials because we hear nothing. Probably 2027 onwards for mine.
 

Think this link will work, but anyway, you have to agree that it's great to see Dr. Tony up and about promoting the true benefits that our company's technology has over the opposition... can anyone produce revenue figures from any other company (worldwide) which specializes in "real edge AI"? I'm sure we aren't alone, as bloody frustrating as it is!

TICKETY TOCKETY... keep flying the flag, Dr. Tony... I certainly appreciate it, thank you.

Tech.
Pity Tony Dawe doesn't respond to emails, probably trying to keep his job and saying yes to Sean more often
 
Nah mate, I'm being serious.
We never have announcements.
We're going to list in America and delist from the ASX.
We have to watch the financials because we hear nothing. Probably 2027 onwards for mine.
Well, 18 months isn't that far away unless you're a dog or cat 😂
 

Bravo

If ARM was an arm, BRN would be its biceps💪!
View attachment 85074

I am seriously beginning to wonder if this new AI device being developed by Sam Altman and Jony Ive could operate as an edge device, potentially incorporating neuromorphic processors like our very own Akida.

The upcoming AI device, expected to launch in 2026, is envisioned as a compact, screenless accessory designed to be unobtrusive and context-aware. It's intended to function alongside existing devices like smartphones and laptops, rather than replacing them. This suggests a focus on local processing capabilities. By processing data locally, the device could offer faster responses, enhanced privacy, and reduced reliance on cloud connectivity.

Have a listen to the video, which I set to play from where Sam Altman explains how we currently interact with ChatGPT; he says something along the lines that the new device "deserves something much better" than having to wait for it to go up to the cloud like traditional devices.




As we all know, Akida's architecture allows for on-device learning and real-time data processing, making it ideal for applications requiring immediate responses without cloud dependency.

The specific technical details about the device are still under wraps, but the emphasis on unobtrusive design, real-time context awareness, and potential for local processing sounds like it aligns very well with the capabilities of BrainChip's Akida IMO.

Maybe just wishful thinking on my behalf. Time will tell.



View attachment 85075



Seems ChatGPT agrees that we're in with a pretty good chance too. That is IF Altman and Ive’s new device is meant to operate continuously, privately, and intelligently at the edge without cloud dependence.

The final recommendation states "If the product blends all three functions — which seems likely — BrainChip Akida is the most versatile choice today due to its:
  • Commercial availability
  • On-chip learning
  • Ability to handle both audio and vision"

To be honest though, I think that if we were going to be a contender for this new device, I would have expected to sense much more of an excited and confident "vibe" emanating from the Board at the recent AGM. I mean, they're expecting to ship 100 million of these devices in 2026, if I'm not mistaken.



ChatGPT said:​

Yes, several other emerging technologies could potentially power a compact, privacy-focused, cloud-independent AI device like the one being developed by Sam Altman and Jony Ive, offering similar or complementary capabilities to BrainChip’s Akida neuromorphic processor. Here are the main contenders:

🔹 1. Mythic Analog Matrix Processors (AMP)

  • How it works: Uses analog computing-in-memory to perform neural network operations directly where data is stored, minimizing data movement.
  • Strengths:
    • Very low power consumption
    • High-speed processing suited for edge AI
    • Compact form factor
  • Use case: Ideal for smart sensors, AR/VR glasses, and real-time voice/image processing

🔹 2. GrAI Matter Labs – GrAI VIP

  • Type: Brain-inspired processor using sparsity-driven processing
  • Features:
    • Event-driven computation
    • Prioritizes only relevant data (like Akida)
    • Optimized for real-time AI at the edge
  • Strengths:
    • Ultra-low latency
    • Energy-efficient
  • Use case: Robotics, autonomous systems, wearables

🔹 3. Edge TPUs (Google Coral)

  • Architecture: Custom ASICs optimized for TensorFlow Lite models
  • Strengths:
    • Easy integration with Google AI ecosystem
    • Real-time inferencing with low power draw
  • Limitations:
    • More rigid; not capable of on-device learning (unlike Akida)
    • Better suited for inferencing than training

🔹 4. Syntiant NDP Series (Neural Decision Processors)

  • Type: Deep learning processors designed for always-on applications
  • Strengths:
    • Consumes microwatts of power
    • Highly efficient for audio and sensor processing
    • Tiny form factor (suitable for earbuds, wearables)
  • Use case: Wake-word detection, voice commands, health monitoring

🔹 5. IBM NorthPole

  • Type: Advanced neuromorphic architecture integrating memory and compute
  • Highlights:
    • Breaks von Neumann bottleneck
    • Designed for high-throughput, low-latency AI at edge
  • Status: Still in research/prototype stages

🔹 6. Intel Loihi 2 (Neuromorphic)

  • Advantages:
    • Second-gen neuromorphic chip
    • Supports learning on-chip
    • Better scalability and flexibility than Loihi 1
  • Use case: Real-time adaptation, robotics, industrial applications

🔹 7. Tenstorrent (RISC-V + AI)

  • Founded by Jim Keller, their chips combine traditional processing with AI accelerators.
  • Edge-ready options: Though mostly datacenter-focused now, Tenstorrent has roadmap items for edge deployment using RISC-V-based inference.

Summary Table:​

Technology | Type | On-Device Learning | Power Efficiency | Maturity | Suitable Use Case
BrainChip Akida | Neuromorphic | ✅ | ✅ | Commercial | Audio, vision, always-on inference
Mythic AMP | Analog compute | ❌ | ✅ | Early stage | Vision, industrial sensors
GrAI VIP | Sparse compute | ✅ | ✅ | Pre-commercial | Real-time robotics, AR
Google Edge TPU | ASIC | ❌ | ✅ | Commercial | Voice/image inference
Syntiant NDP | Neural DSP | ❌ | ✅✅ | Commercial | Audio, keyword spotting
Loihi 2 | Neuromorphic | ✅ | ✅ | Experimental | Cognitive tasks, adaptive systems
IBM NorthPole | Neuromorphic | ✅ | ✅ | Research | General edge AI


Final Thought:​

If Altman and Ive’s device is meant to operate continuously, privately, and intelligently at the edge without cloud dependence, then BrainChip Akida, GrAI VIP, or Loihi 2 would be top candidates—especially if adaptive, real-time learning is a core feature. If it’s more focused on inference-only tasks, Edge TPU or Syntiant NDPs could be practical alternatives.
Let me know if you'd like a recommendation based on a specific function (e.g. voice assistant, vision processing, real-time interaction).

You said:​

yes please

ChatGPT said:​

Great — here's a tailored recommendation based on three likely functions that Sam Altman and Jony Ive’s device might emphasize, considering their vision of a screenless, wearable, real-time AI companion:

🔍 1. Voice Assistant / Natural Interaction (Always-On, Privacy-Preserving)

✅ Best Choice: Syntiant NDP120

  • Why:
    • Ultra-low-power (microwatt-level) operation
    • Always-listening capability without draining battery
    • Supports multiple wake words and audio events
  • Ideal for: Continuous voice interaction, like a whisper-quiet Siri that lives in your lapel
  • Bonus: Commercially proven in smart earbuds and hearing aids

Alt Option: BrainChip Akida

  • Slightly higher power but supports on-device learning — ideal if the device needs to learn your voice over time

👁️ 2. Vision / Environmental Awareness (Object, Gesture, Emotion Recognition)

✅ Best Choice: GrAI VIP or BrainChip Akida

  • GrAI VIP:
    • Optimized for real-time video inference
    • Sparse data-driven — processes only what matters
  • BrainChip Akida:
    • Can detect gestures, objects, and visual patterns with low power
    • Supports event-based cameras (mimics human eyes)

Alt Option: Mythic AMP

  • Analog compute-in-memory gives efficient vision processing but is newer and more complex to integrate

🧠 3. Real-Time Learning and Adaptation (Personalization, Emotion AI, Cognitive AI)

✅ Best Choice: Intel Loihi 2

  • Why:
    • True on-device learning and decision-making
    • Designed for systems that evolve over time based on interaction
    • Best for adaptive systems that change how they work depending on who’s using them

Alt Option: BrainChip Akida

  • Supports incremental learning on-chip (no need for cloud updates)

Final Recommendation Based on Use Case Focus:​

Function | Top Tech Pick | Runner-Up
Voice assistant / audio AI | Syntiant NDP120 | BrainChip Akida
Environmental/visual awareness | GrAI VIP | BrainChip Akida / Mythic AMP
Personalized cognitive assistant | Intel Loihi 2 | BrainChip Akida

If the product blends all three functions — which seems likely — BrainChip Akida is the most versatile choice today due to its:
  • Commercial availability
  • On-chip learning
  • Ability to handle both audio and vision
 

Guzzi62

Regular
Worth a listen imo. Not sure if previously posted, but it's a podcast from a couple of months ago talking to Tony, Jon, Rudy, Kris & Olivier.

A lot on TENNs, how it works and how it performs, plus some other snippets.




Yes, it's worth listening to, but 307 views since it was aired 2 months ago doesn't seem like a lot of interest, sadly.

EE Times isn't very popular, it seems: just over 4k subscribers, likely computer freaks sitting in their mum's basement hacking whatever they can hack, LOL, and some BRN investors.

We need more exposure, much more.
 

IloveLamp

Top 20

1000006052.jpg
 
The Space and Defence market is much smaller than the commercial market.

It's not manipulation that's keeping the share price down, it's our board who refuse to make announcements through the ASX about our own company. The board who won't do anything to keep current (and potential) shareholders informed.

The board who says the share price will do what the share price will do.

The blame lies squarely at their feet.
I forgot these 2 as well 😂
 

Guzzi62

Regular
View attachment 85074

I am seriously beginning to wonder if this new AI device being developed by Sam Altman and Jony Ive could operate as an edge device, potentially incorporating neuromorphic processors like our very own Akida.

The upcoming AI device, expected to launch in 2026, is envisioned as a compact, screenless accessory designed to be unobtrusive and context-aware. It's intended to function alongside existing devices like smartphones and laptops, rather than replacing them. This suggests a focus on local processing capabilities. By processing data locally, the device could offer faster responses, enhanced privacy, and reduced reliance on cloud connectivity.

Have a listen to the video, which I set to play from where Sam Altman explains how we currently interact with ChatGPT; he says something along the lines that the new device "deserves something much better" than having to wait for it to go up to the cloud like traditional devices.




As we all know, Akida's architecture allows for on-device learning and real-time data processing, making it ideal for applications requiring immediate responses without cloud dependency.

The specific technical details about the device are still under wraps, but the emphasis on unobtrusive design, real-time context awareness, and potential for local processing sounds like it aligns very well with the capabilities of BrainChip's Akida IMO.

Maybe just wishful thinking on my behalf. Time will tell.



View attachment 85075

So he paid approx. 6.5 billion for the start-up io.

BRN is approx. 450 million (US?) market cap currently, which is around 14 times less.

Hmm: 22 cents x 14 ≈ $3 AUD
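
For anyone who wants to sanity-check that, here's a minimal back-of-envelope sketch in Python, assuming the rough figures quoted above (US$6.5 billion for io, ~US$450 million BRN market cap, 22c share price) and ignoring the USD/AUD mix, so treat it as illustration only:

io_price_usd = 6.5e9          # ~US$6.5 billion reportedly paid for io (approximate)
brn_market_cap_usd = 450e6    # ~US$450 million BRN market cap (approximate)

ratio = io_price_usd / brn_market_cap_usd
print(f"io price / BRN market cap ~ {ratio:.1f}x")        # prints ~14.4x

# Naive extrapolation of the ~22c share price by that ratio;
# pure back-of-envelope, no real valuation logic behind it.
share_price = 0.22
print(f"22c x {ratio:.1f} ~ ${share_price * ratio:.2f}")   # prints ~$3.18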

Hey Altman!! Altmaaaaaaan!!!!
 
Well, 18 months isn't that far away unless you're a dog or cat 😂
That's me being conservative, it could be longer. And I'm not on hot crapper.
 