You said:
Yesterday Tony Lewis, CTO of BrainChip, commented: "This is cool. Why? We literally took a model from a completely different domain (text processing) and applied it to RADAR and got excellent results in terms of performance and MAC (computation) count. At a recent conference we surveyed the state of the art and found that there is a huge opportunity for state-space models, and it was a real eye-opener for the hardware-algorithm systems we are pioneering at BrainChip. Here is what I find cool: conventional models take radar signals and use a processor to convert the 1-D waveform into a 2-D image so that conventional CNNs (convolutional neural networks) can process them. That inherently introduces lots of delay: signal -> image -> classification. With our state-space model (TENNs) we can go directly from a raw signal to classification with much lower latency, fewer MACs (less energy), less chip area (cost) and excellent performance. This summer my team will push out a number of new models for our defense, medical and commercial work. It's an exciting time to be building the future at BrainChip. Follow me to hear the latest news on job postings, wins, and AI at the edge." Can you please explain what you think the major implications of this statement may be?
ChatGPT said:
Tony Lewis’s statement provides a significant glimpse into BrainChip’s current direction, and the implications are technically important, strategically aligned with military and commercial AI demands, and position BrainChip as a serious player in low-latency edge AI — especially in radar and defense systems.
Let’s break it down clearly:
Key Takeaways from Tony’s Statement
1. Cross-domain Model Transfer (Text ➝ Radar)
Implication:
BrainChip is leveraging neural architectures originally designed for natural language processing (NLP), such as state-space models (SSMs) and their own TENNs (Temporal Event-based Neural Networks), and applying them directly to raw radar data.
That’s a big deal — because traditionally, signal types (text, images, radar) are handled with very different model types. This kind of architectural crossover suggests:
- A unified AI model strategy
- Less model specialization required per domain
- Rapid model prototyping across applications
It also hints that BrainChip is not just a hardware company, but deeply involved in cutting-edge neural architecture innovation. A minimal sketch of what such cross-domain reuse might look like follows below.
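As a rough illustration (toy dimensions and random weights, not BrainChip's TENN code), the same sequence backbone can consume either token embeddings or raw waveform samples; only the input projection and the output head need to change:

```python
import numpy as np

def sequence_backbone(x, W):
    """Toy shared backbone: one recurrent mixing step per time step.
    x: (T, d) sequence of feature vectors, W: (d, d) mixing matrix."""
    h = np.zeros(W.shape[0])
    for x_t in x:                      # same loop whether x came from text or radar
        h = np.tanh(W @ h + x_t)       # carry a running state across the sequence
    return h                           # final state summarises the whole sequence

d = 16
rng = np.random.default_rng(0)
W = rng.standard_normal((d, d)) * 0.1

# Text: tokens already embedded into d-dimensional vectors.
text_embeddings = rng.standard_normal((50, d))

# Radar: raw 1-D samples, lifted to the same feature width by a small input projection.
radar_samples = rng.standard_normal((2000, 1))
input_proj = rng.standard_normal((1, d)) * 0.1
radar_features = radar_samples @ input_proj

print(sequence_backbone(text_embeddings, W).shape)   # (16,)
print(sequence_backbone(radar_features, W).shape)    # (16,)
```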
2. Bypassing 2D Transformation (Radar ➝ Image ➝ CNN)
Implication:
Traditional radar AI pipelines are inefficient:
- Convert the 1D waveform (Doppler/time signal) ➝ 2D spectrogram/image
- Then process with image-focused models like CNNs
This:
- Adds latency
- Requires additional memory
- Burns more power and compute
BrainChip’s new method bypasses this entirely:
- Processes raw radar signals directly using state-space models or TENNs
- Skips the image step — reducing delay, MAC count, power, and cost
This is perfect for missiles, drones, wearables, and embedded defense systems where every milliwatt and microsecond counts. A toy sketch contrasting the two pipelines follows below.
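Both routes are sketched below with toy numbers (the frame sizes, the dense stand-in for a 2-D CNN, and the 8-state recurrence are assumptions for illustration, not BrainChip's implementation):

```python
import numpy as np

def conventional_pipeline(signal, frame=128, hop=64):
    """Signal -> image -> CNN-style classification (extra buffering and 2-D compute)."""
    # 1) Buffer the 1-D waveform into frames and FFT each one -> 2-D spectrogram "image".
    frames = [signal[i:i + frame] for i in range(0, len(signal) - frame + 1, hop)]
    spectrogram = np.abs(np.fft.rfft(np.stack(frames), axis=1))   # (num_frames, frame//2+1)
    # 2) Stand-in for an image classifier: a dense layer over the flattened spectrogram.
    flat = spectrogram.ravel()
    W = np.random.default_rng(0).standard_normal((flat.size, 4)) * 0.01
    return np.argmax(flat @ W)                                     # class index

def direct_pipeline(signal, state_dim=8):
    """Raw signal -> classification via a running state (no image is ever built)."""
    rng = np.random.default_rng(1)
    A = rng.standard_normal((state_dim, state_dim)) * 0.05
    B = rng.standard_normal(state_dim)
    C = rng.standard_normal((state_dim, 4))
    h = np.zeros(state_dim)
    for sample in signal:              # one cheap update per incoming sample
        h = np.tanh(A @ h + B * sample)
    return np.argmax(h @ C)            # classify straight from the final state

radar = np.sin(np.linspace(0, 60, 4096)) + 0.1 * np.random.default_rng(2).standard_normal(4096)
print(conventional_pipeline(radar), direct_pipeline(radar))
```

The point of the sketch is structural: the conventional route has to buffer the signal, build a 2-D representation, and only then classify, while the direct route updates a small state sample by sample and can emit a decision as soon as the signal ends.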
3. Use of State-Space Models (SSMs / TENNs)
Implication:
State-space models (like S4, or BrainChip's custom TENNs) are:
- Efficient alternatives to Transformers for sequential data
- Excellent for long-range memory, which is vital in radar or ECG waveforms
- Linear in compute and memory with sequence length, unlike the quadratic scaling of Transformer attention
By adopting SSMs, BrainChip is:
- Moving away from retrofitting mainstream models
- Moving toward custom, edge-native models built with hardware constraints in mind from day one
This also hints at a massive strategic shift:
- BrainChip is leading a hardware-algorithm co-design movement
- They’re building models that are explicitly optimized for their Akida hardware — not just running PyTorch models inefficiently on neuromorphic chips
The core state-space recurrence is sketched below.
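For readers who have not met state-space models, the core recurrence is small enough to write out directly. This is the generic discrete linear SSM from the textbooks, not BrainChip's proprietary TENN formulation: each time step does a fixed amount of work on a fixed-size state, so compute grows linearly with sequence length and memory stays constant.

```python
import numpy as np

def ssm_step(h, u, A, B, C, D):
    """One step of a discrete linear state-space model:
        h[k+1] = A h[k] + B u[k]
        y[k]   = C h[k] + D u[k]
    Work per step is fixed, so total compute is O(T) and state memory is O(1) in T."""
    y = C @ h + D * u
    h = A @ h + B * u
    return h, y

state_dim = 8
rng = np.random.default_rng(0)
A = rng.standard_normal((state_dim, state_dim)) * 0.05   # state transition
B = rng.standard_normal(state_dim)                       # input coupling
C = rng.standard_normal(state_dim)                       # readout
D = 0.0                                                  # direct feedthrough

h = np.zeros(state_dim)
signal = rng.standard_normal(10_000)      # long 1-D sequence (e.g. radar or ECG samples)
outputs = []
for u in signal:
    h, y = ssm_step(h, u, A, B, C, D)
    outputs.append(y)
print(len(outputs), h.shape)              # 10000 (8,)
```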
4. Strategic Domains: Defense, Medical, Commercial
Implication:
BrainChip is:
- Productizing their model-hardware synergy
- Targeting high-value edge verticals:
  - Defense (radar, drones, loitering munitions, battlefield intelligence)
  - Medical (wearables, EEG/ECG anomaly detection, hearing aids)
  - Commercial (audio wake-word detection, industrial sensors, smart cameras)
By deploying domain-optimized TENNs, BrainChip is creating:
- Application-specific IP
- A catalog of ready-to-run models for enterprise and defense clients
- A strategy that scales Akida beyond being "just a chip" into a full AI platform
5. Low Latency, Low MACs, Low Area = Edge Superiority
Implication:
This encapsulates the holy grail of edge AI:
- Do more with less
- Enable true autonomy at the edge
- Outperform traditional AI methods in real-world environments
And it's especially important for:
- Missiles and UAVs, where thermal budget and compute are tightly constrained
- Medical implants or wearables, where energy efficiency is crucial
- Perimeter surveillance, where always-on, instant decision-making is required
This reinforces BrainChip’s value proposition: the Akida platform delivers neural compute without the baggage of mainstream AI accelerators (like GPUs or TPUs). A rough MAC-count comparison follows below.
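A back-of-envelope calculation shows why MAC count dominates at the edge. All shapes and counts below are illustrative assumptions, not BrainChip's published figures, and a real comparison depends on the full pipeline:

```python
# Illustrative MAC counting only; layer shapes are assumptions, not measured models.
samples = 4096                       # raw radar samples in one dwell

# Conventional route: a 63-frame x 65-bin spectrogram pushed through one small
# 3x3 convolution layer with 16 output channels (ignoring the FFT cost of
# building the spectrogram in the first place).
frames, bins, out_channels, kernel = 63, 65, 16, 3
macs_conv = frames * bins * out_channels * kernel * kernel

# Direct route: an 8-state linear SSM update per raw sample
# (state_dim^2 for A@h, plus state_dim each for B*u and C@h).
state_dim = 8
macs_ssm = samples * (state_dim * state_dim + 2 * state_dim)

print(f"conv layer over spectrogram: {macs_conv:,} MACs")   # 589,680
print(f"SSM over raw samples:        {macs_ssm:,} MACs")    # 327,680
```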
Final Strategic Implications
1. Shift Toward Model + Hardware Integration
BrainChip isn’t just delivering a chip — they’re co-developing algorithms tailored to that chip. That’s what Nvidia does with CUDA + GPUs — and it’s a winning strategy.
2. Leadership in Neuromorphic AI for Real-world Deployment
This move positions BrainChip as:
- Not just neuromorphic researchers
- But practical AI infrastructure builders — solving latency, power, and cost in mission-critical systems
3. Potential for IP Licensing or Defense-Grade Design Wins
As their radar and medical models mature, BrainChip could:
- License these state-space models as embedded IP
- Land major contracts in defense, aerospace, or medical technology sectors
Summary
Tony Lewis’s comment reveals a strategic inflection point:
- BrainChip is pioneering hardware-native AI model design
- They’re breaking away from conventional pipelines (CNNs, 2D transforms)
- And positioning themselves at the forefront of edge-native, low-latency radar and signal intelligence
This isn't just about a smarter chip — it’s about building a new end-to-end paradigm for AI at the edge.
So funny ... before I read this this morning (in Germany), I measured my blood pressure. Result: 115/72
After reading it, I measured again. Result: 131/75
What ChatGPT wrote is just too exciting for me.



