An easy and worthwhile read: this blog post by Innatera Strategic Business Development Manager Katarzyna ‘Kasia’ Kożdoń will surely resonate with our staff at BrainChip:
Lessons from the floor: What global events reveal about the future of AI sensing
Katarzyna (Kasia) Kożdoń
Strategic Business Development Manager
August 5, 2025
This year, we spoke with industry leaders at some of the most important tech events around the globe, including CES and Sensors Converge in the USA, Mobile World Congress in both Spain and mainland China, Embedded World in Germany, and Computex in Taiwan. Across continents and markets, one thing stood out: everyone is talking about edge AI.
But while the term is everywhere, what people mean by it – and what they need from it – varies wildly. As edge AI gains popularity, its definition is expanding. We’re starting to see powerful “edge AI” solutions – even GPU-based “edge” computers – that deliver tremendous compute at an equally tremendous power draw. These systems are impressive, but they miss the original point: bringing intelligence closer to the source without ballooning energy costs or form factors.
At Innatera, we remain focused on redefining the edge – not by chasing bigger chips, but by taking intelligence to places it simply couldn’t go before. We're talking about AI that fits in the smallest corners of your world: always-on, ultra-efficient, intuitive, and ambient. Think AI that’s woven into the environment – not sitting on a desk with a fan – and that adapts rapidly to application needs with minimal user input. It can anticipate which lights to turn on to suit your activity, enable headphones that instantly adapt to changing acoustic environments, and power smart bikes that provide real-time feedback on your posture and detect maintenance needs from subtle vibration patterns. It can even enable insect-scale autonomous drones that navigate and inspect infrastructure – no operator required.
The "move from cloud to edge" narrative has gained real traction. But what we heard on the floors of these events went deeper, revealing more specific needs and challenges at the device level.
The Edge AI Realms
Talking with engineers and product leads across events, a pattern emerged: the bottlenecks differ by use case, but one aspect remains consistent – traditional hardware isn’t enough.
The ARM Upgraders
"We're already on the edge, but we're compute-starved"
These teams are already using microcontrollers or small SoCs. They’ve done the work to build on low-power ARM systems – but they’ve hit a wall. The problem isn’t moving to the edge; it’s what you can do once you’re there.
Their ask: more intelligence, faster reaction times, and richer features – all while staying within tight power and area budgets.
The Pi Downsizers
"We have the compute, but can't scale down."
These are the Raspberry Pi and Jetson Nano users. They’re used to development boards with plenty of headroom. But now, they want to productize. And suddenly, the size, cost, and power footprint of their go-to platforms just don’t fit.
Their challenge: replicating the intelligence they’ve built on development kits – but in a much smaller, more efficient, and deployable form factor.
The Always-On Seekers
"We need something that never sleeps."
These teams are often running on chips like Qualcomm Snapdragon or other heavy-duty SoCs. But they only need those systems to wake when necessary – for a voice command, a gesture, or an unusual event.
Right now, they’re duty cycling or using inefficient wake-up triggers. What they need is true always-on intelligence that doesn’t cost them battery life or generate heat.
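To make that cost concrete, here's a rough back-of-the-envelope comparison in Python. Every number in it – currents, duty fractions, battery size – is an illustrative assumption, not a figure from Innatera; the point is simply how quickly a duty-cycled SoC loses to a low-power front-end that wakes the SoC only on real events.

```python
# Illustrative average-current comparison: duty-cycling a big SoC versus an
# always-on low-power front-end that wakes the SoC only on real events.
# All numbers are hypothetical, chosen only to show the shape of the trade-off.

def avg_current_ma(active_ma, sleep_ma, active_fraction):
    """Average current of a part that is active for a given fraction of the time."""
    return active_ma * active_fraction + sleep_ma * (1.0 - active_fraction)

BATTERY_MAH = 500.0  # hypothetical battery budget

# Strategy A: duty-cycle a big SoC, waking it 5% of the time to poll sensors.
duty_cycled = avg_current_ma(active_ma=300.0, sleep_ma=1.0, active_fraction=0.05)

# Strategy B: a sub-milliamp always-on front-end screens the sensor stream
# continuously, so the SoC wakes only for the ~0.5% of time when something
# genuinely interesting happens.
frontend_ma = 0.5
event_driven = frontend_ma + avg_current_ma(active_ma=300.0, sleep_ma=1.0,
                                            active_fraction=0.005)

for name, ma in [("duty-cycled SoC", duty_cycled),
                 ("always-on front-end + SoC", event_driven)]:
    print(f"{name}: {ma:.2f} mA avg -> ~{BATTERY_MAH / ma:.0f} h on {BATTERY_MAH:.0f} mAh")
```

Even with these generous assumptions, the duty-cycled budget is dominated by the polling wake-ups; the event-driven path stretches the same battery several times further.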
The Sensor Hub Pioneers
"We want to connect everything, but there's no good solution."
This group is one of the most exciting – and underserved. They’re building products that rely on multiple sensor modalities: audio, motion, vision, touch, and more. But there’s no great, power-efficient chip that can act as a unified sensor hub and provide smart preprocessing or inference.
Every time we explain how neuromorphic technology can support multiple sensors, the conversation shifts. Eyes light up.
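For illustration only, here's a minimal Python sketch of what the software side of such a hub could look like: several modalities normalized into one event stream, lightly preprocessed, and screened by a single inference step before the main processor is ever woken. All names, thresholds, and the toy data are hypothetical, not part of any Innatera API.

```python
# Minimal sensor-hub sketch: normalize multiple modalities into one event
# stream, preprocess, and screen with a single inference step. Hypothetical.

from dataclasses import dataclass
from typing import Callable

@dataclass
class SensorEvent:
    modality: str           # "audio", "imu", "touch", ...
    timestamp_ms: int
    features: list[float]   # preprocessed features, not raw samples

def preprocess(modality: str, raw: list[float]) -> list[float]:
    # Placeholder for per-modality feature extraction (filters, FFT bins, ...).
    mean = sum(raw) / len(raw)
    return [min(raw), max(raw), mean]

def interesting(event: SensorEvent) -> bool:
    # Placeholder for the hub's inference step; in a neuromorphic hub, this is
    # where an always-on network would score the fused event stream.
    return max(event.features) > 0.8

def run_hub(stream: list[tuple[str, int, list[float]]],
            wake: Callable[[SensorEvent], None]) -> None:
    for modality, ts, raw in stream:
        event = SensorEvent(modality, ts, preprocess(modality, raw))
        if interesting(event):
            wake(event)  # only now does the main processor get involved

# Toy usage: two quiet IMU windows, then a loud audio window triggers the wake.
samples = [
    ("imu", 0, [0.01, 0.02, 0.01]),
    ("imu", 20, [0.02, 0.01, 0.03]),
    ("audio", 25, [0.1, 0.9, 0.4]),
]
run_hub(samples, wake=lambda e: print(f"wake main SoC: {e.modality} @ {e.timestamp_ms} ms"))
```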
Why Neuromorphic Makes Sense – Finally
For years, neuromorphic computing was viewed as researchy, exotic, niche. That perception is shifting. These four user types are showing us exactly why neuromorphic is becoming practically relevant.
Here’s how we map:
- ARM Upgraders: We offer more compute, but within the same tight constraints.
- Pi Downsizers: We deliver significant improvements in power and area, with an acceptable trade-off in raw throughput.
- Always-On Seekers: We give you true always-on AI – not duty cycling hacks.
- Sensor Hub Pioneers: We provide the missing piece you didn’t know existed – smart sensor fusion on a single, ultra-efficient chip.
This is AI that starts with sensing. That reacts only when it needs to. That lives and breathes in the real world, not in data center specs.
The Ecosystem Reality Check
We also noticed a stark gap between the market hype and the actual challenges teams are facing:
- Many embedded teams lack deep AI experience. They know sensors and embedded systems, but they need approachable, adaptable AI.
- Data is hard to come by. Collecting and labeling real-world data for training takes time and resources most teams don’t have.
- Benchmarking novel hardware is non-trivial. Neuromorphic, Spiking Neural Networks (SNNs), non-von Neumann architectures – people are interested, but they don’t know how to compare them fairly (or how to integrate them into their current pipeline). One fair starting point is sketched after this list.
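As that starting point, here's a small Python sketch that reduces each device to task-level metrics – energy per inference, latency, accuracy – rather than comparing peak TOPS. The devices and all figures are made up purely for illustration.

```python
# Fairer cross-architecture comparison: same task-level metrics for every
# device, instead of peak-throughput specs. All figures below are invented.

candidates = {
    # name: (avg power in mW while inferring, latency per inference in ms, accuracy)
    "Cortex-M class MCU":    (50.0,   40.0, 0.91),
    "neuromorphic/SNN chip": (2.0,    15.0, 0.89),
    "small GPU module":      (5000.0,  2.0, 0.93),
}

print(f"{'device':<22}{'energy/inf (uJ)':>16}{'latency (ms)':>14}{'accuracy':>10}")
for name, (power_mw, latency_ms, acc) in candidates.items():
    energy_uj = power_mw * latency_ms  # mW * ms = microjoules per inference
    print(f"{name:<22}{energy_uj:>16.1f}{latency_ms:>14.1f}{acc:>10.2f}")
```

On a per-inference energy basis, the ranking can look very different from a raw-throughput comparison, which is exactly why a shared task-level yardstick matters.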
It is clear that innovation at the edge doesn’t happen in isolation. It requires collaboration across the entire stack. From sensor vendors, through solution providers, to infrastructure players – dataset providers, PCB makers, module builders, and distributors – each plays a vital role in bringing smarter edge systems to life.
At Innatera, we sit at the intersection of this network, enabling our partners to build intelligence where it previously wasn’t possible. Whether that means integrating neuromorphic compute into sensor hubs, powering always-on inference, or guiding teams through new AI workflows, we help close the gap between what’s needed and what’s technically achievable today. Together, we’re shaping the next wave of embedded intelligence – one practical step at a time.
Looking Ahead
The future of AI is ambient – systems that blend into their environment: efficient yet responsive, autonomous yet intuitive when engaged. There is so much potential in tapping into the data already around us. I’m excited to work with partners around the world to bring these kinds of solutions to market.
If you recognize your team in one of the categories above – if you’re compute-starved, power-burdened, or drowning in sensor complexity, or if you’re building the missing ecosystem pieces that could help solve these challenges – we should talk and explore what we can create together.