BRN Discussion Ongoing

Rach2512

Regular
Thoughts?

 

Diogenese

Top 20
Thoughts?

Hi Rach,

I don't think the possum is in the tree Jesus is barking up.

He's labouring under a misconception that Akida's "spikes" are analog rather than ordinary binary digital bytes. The prototype Akida 1 used a single binary bit to represent a spike, but it was not an analog spike; it was a binary digital bit. The commercial version upped the "spike" to 4 binary digital bits. Akida does not need an ADC (analog-to-digital converter) to talk to associated digital equipment. In fact, Akida has a number of external digital interfaces adapted to utilize different digital communication protocols without using an ADC.

An ADC converts an analog voltage signal to digital format by producing a digital byte whose value is proportional to the size of the spike voltage. Manufacturing variability and temperature instability make this an unreliable process when thousands of ADC operations are involved.
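To make the contrast concrete, here's a toy sketch (my own illustration, not BrainChip code; the reference voltage and noise figures are assumptions):

```python
import numpy as np

# A 4-bit digital "spike" is just an integer in [0, 15]; it moves between
# digital blocks losslessly, with no conversion step needed.
digital_spikes = np.array([3, 15, 0, 7], dtype=np.uint8)

# An ADC, by contrast, must map a noisy analog voltage onto that 4-bit scale.
def adc_4bit(voltages, v_ref=1.0, noise_std=0.02):
    """Quantise analog voltages to 4-bit codes; the added noise stands in
    for manufacturing variability and temperature drift (figures assumed)."""
    noisy = voltages + np.random.normal(0.0, noise_std, np.shape(voltages))
    return np.clip(np.round(noisy / v_ref * 15), 0, 15).astype(np.uint8)

print(adc_4bit(np.array([0.2, 1.0, 0.0, 0.47])))  # codes can vary run to run
```

Run it a few times and the ADC codes jitter while the digital spikes never change; that's the unreliability at scale mentioned above.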

Akida 3/GenAI will move away from an internal packet transport mesh to a solid-state switched transport mesh, because this gets rid of the packet header overhead (to which Jesus has directed his efforts) and the associated bit-switching operations.
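A back-of-envelope look at why header overhead bites at spike granularity (both sizes below are my assumptions, not published Akida figures):

```python
HEADER_BITS = 16   # assumed packet header size, not a published figure
PAYLOAD_BITS = 4   # one 4-bit "spike"

overhead = HEADER_BITS / (HEADER_BITS + PAYLOAD_BITS)
print(f"{overhead:.0%} of each packet is header")  # 80% under these assumptions
```

The smaller the payload, the more the header dominates, which is why dropping packets altogether is attractive.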
 

GStocks123

Regular
First time I’ve read of a partner testing TENNs.

 

7für7

Top 20
First time I’ve read of a partner testing TENNs.

A bit older, but once again, like I mentioned earlier … kinda sh*t. They don’t name the thing for what it is, for whatever “reasons”…
Neither BrainChip nor Akida is mentioned anywhere, but the clues are pretty obvious in my opinion: TENNs, event-based processing, ultra-low-power vision, the PCIe/RPi edge setup, and even an ARM context between the lines.

Outsiders just read “neuromorphic hardware”… insiders recognize the fingerprint.

Braini… just my opinion.
 
The MulticoreWare blog from December mentions TENNs.

https://multicorewareinc.com/designing-ultra-low-power-vision-pipelines-on-neuromorphic-hardware/

Designing for Neuromorphic Hardware

The neuromorphic AI framework we had identified utilized a novel type of neural network: Temporal Event-based Neural Networks (TENNs). TENNs employ a state-space architecture that processes events dynamically rather than at fixed intervals, skipping idle periods to minimize energy and memory usage. This design enables real-time inference on milliwatts of power, making it ideal for edge deployments.
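To illustrate the "skipping idle periods" point, here's a minimal toy of an event-driven state-space update (my own sketch under assumed dynamics, not MulticoreWare's or BrainChip's code):

```python
# Toy linear state-space cell, updated only when events arrive.
# Assumptions mine: scalar state with exponential decay between events.
A_DECAY = 0.9  # per-unit-time state decay (assumed)
B_GAIN = 1.0   # input gain (assumed)

def run_event_driven(events):
    """events: list of (timestamp, value) pairs. The state decays
    analytically across each gap, so idle time costs no compute."""
    state, t_prev = 0.0, 0.0
    for t, x in events:
        state *= A_DECAY ** (t - t_prev)  # close the idle gap in one step
        state += B_GAIN * x               # integrate the event
        t_prev = t
    return state

# Three events spread over 1000 time units: only 3 updates, not 1000.
print(run_event_driven([(1.0, 0.5), (400.0, 1.0), (1000.0, -0.2)]))
```

A clocked network would have to tick through every one of those 1000 steps; the event-driven version does three updates.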

SC
 

TheDrooben

Pretty Pretty Pretty Pretty Good

Screenshot_20260217_074351_LinkedIn.jpg




Happy as Larry
 
Replying to Diogenese's post above: I couldn't help myself and sent this information to him, and his response is below.
 

Attachments

  • Screenshot_20260217_071049_LinkedIn.jpg

Labsy

Regular
My thoughts are that we are coming along nicely. Hang on, chippers. Nearly there... I'm heavily in the red, as my break-even is 46 cents, and I can't afford to wiggle out now, quite apart from the fact that I value our stock much higher than my break-even, let alone where it is sitting now. I'm hoping for a pleasant surprise this year and am more than willing to hold on tight for another 12 months. Let's see how we go. 🙏
$1 party later this year?? 🎉
 


curlednoodles

Keycloak is a security system, more specifically an Identity & Access Management (IAM) platform.

In practical terms, it’s the software that decides:

who you are (authentication — login, MFA, biometrics, etc.)

what you’re allowed to access (authorization — roles, permissions)

It’s widely used as the front door to applications, APIs, and enterprise systems — handling SSO, tokens (OAuth/OpenID), and zero-trust workflows. It’s maintained by Red Hat (part of IBM), but it’s used globally well beyond IBM environments.

So when Kevin says “voice authorization on Symphony/Akida with Keycloak,” he’s talking about plugging Akida into real authentication infrastructure — not just doing signal processing, but participating directly in security decisions.
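For anyone curious what "participating in security decisions" looks like in practice, here's a minimal sketch of verifying a Keycloak-issued token before honouring a request. The host, realm, client ID, and role name are hypothetical stand-ins, and PyJWT is just one common way to do it:

```python
import jwt  # PyJWT

# Keycloak publishes its signing keys at a JWKS endpoint; the host and
# realm here are hypothetical - substitute your own deployment's values.
JWKS_URL = "https://keycloak.example.com/realms/myrealm/protocol/openid-connect/certs"
jwks_client = jwt.PyJWKClient(JWKS_URL)

def authorize(token: str, required_role: str = "voice-user") -> bool:
    """Authenticate (valid signature, not expired) and authorize (role check)."""
    signing_key = jwks_client.get_signing_key_from_jwt(token)
    claims = jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience="my-client",  # hypothetical Keycloak client ID
    )
    # Keycloak places realm roles under realm_access.roles in the access token.
    roles = claims.get("realm_access", {}).get("roles", [])
    return required_role in roles
```

The point is that whatever produces the credential (voice, password, MFA) ends up gated by the same token checks.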
 

jtardif999

Regular
Since yesterday's post I've been doing a bit more research into Loihi 3 and discovered a very long and detailed blog on Loihi 3 from someone called Dr Shayan Erfanian (see link below).

The profile material I found presents Dr Erfanian as a technology strategist and software/cybersecurity entrepreneur, not obviously a primary neuromorphic researcher, which I think pushes the blog more toward forecasting than authoritative disclosure.

For what it's worth, Dr Erfanian says that we can expect to hear more about Loihi 3 at the Intel Developer Conference in Q2/3 2026.

In another excerpt he states "the 2-3 year horizon following Loihi 3's launch (2028-2029) will witness substantial industry restructuring as the energy efficiency benefits of neuromorphic computing become widely acknowledged and integrated."






Excerpt 1

View attachment 95145



Excerpt 2

View attachment 95146



All of these ruminations about the emergence of Loihi 3 would seem to align with Intel's job advertisement published 16 days ago for an AI Software Architect, Neuromorphic Computing.

The job ad says "Now, we're entering an exciting new chapter: transforming these breakthroughs into real-world products that will power the coming era of physical AI systems beyond the reach of GPUs and mainstream AI accelerators." It also states a key responsibility is to "Integrate neuromorphic software into leading robotics, IoT, and sensing frameworks to enable broad ecosystem adoption."



Intel's Job Ad

The other thing that struck me yesterday about Mike Davies' LinkedIn post on Loihi 3 from 8 months ago was how similar it sounded to Akida 2 with regard to features such as LLMs (see below).



View attachment 95147



I realise that the addressable market is big enough for more than one player, but Intel is a behemoth of a player and therefore a pretty significant threat.

I can’t deny feeling quite disappointed that what once looked like a meaningful lead has now narrowed.

In hindsight, relying solely on an IP-only strategy appears to have seriously limited commercial traction. The pivot to physical chips, the AKD1500 (and now the AKD2500), feels, at least to me, like an acknowledgment that licensing alone wasn't able to convert at the pace required.

Having said that, I can understand the appeal of the IP-only idea because it's capital-light (no wafer commitments, inventory risk, or hardware support burden). But in practice, IP-only seems to work best when there’s already a mature ecosystem, strong reference implementations and customers ready to integrate with confidence. Neuromorphic as a new and disruptive technology would make pure IP much harder to monetise.

The key question now isn’t whether the pivot was necessary but whether it was too late. The risk is that competitors with much deeper resources and broader ecosystems may be able to close the window further.

In the coming months I'll be keeping a very close eye on any primary sources from Intel announcing Loihi 3 specs.

Great post, but keep this in mind: Intel currently has only one neuromorphic patent. I find this surprising, and maybe @Diogenese can correct me there - AFAICT we are streets ahead in the patent department.
 