BRN Discussion Ongoing

Frangipani

Top 20
Akida is mentioned alongside other neuromorphic platforms in this preprint (not yet peer-reviewed), published yesterday and titled “Neuromorphic Modeling of Molecular Signatures in the Human Spine”, in which the US-based co-authors see huge potential for neuromorphic technology in their field of research:



Conclusions

Neuromorphic computing platforms, empowered by Recursive Temporal Attention architectures, offer a powerful paradigm for real-time, energy-efficient decoding of complex molecular events in spinal health. By closely emulating biological computation, these systems bridge the gap between high-resolution biosensing and adaptive inference, capturing temporal dynamics that traditional digital frameworks often overlook. We have demonstrated that such architectures not only improve sensitivity to transient molecular phenomena—such as cytokine fluxes and matrix remodeling kinetics—but also enable the modeling of cross-scale biological dependencies, including the epigenetic consequences of inflammatory surges.

These capabilities extend far beyond academic demonstrations. When embedded within clinical workflows, neuromorphic systems promise to enable predictive, minimally invasive diagnostics and to support closed-loop therapeutic modulation tailored to a patient’s molecular profile. The convergence of neuromorphic sensing, recursive temporal modeling, and biological pathway inference heralds a transformative shift in spinal diagnostics, with implications for preventive care, real-time intervention, and personalized treatment planning.

As neuromorphic platforms continue to evolve, their integration into implantable and wearable medical technologies will offer persistent, adaptive surveillance of spine health at unprecedented granularity. These developments position neuromorphic computing not only as a next-generation analytic tool but also as a foundational infrastructure for future clinical neuromolecular intelligence systems.





One of the references [5] is to another recently published paper by NZ researchers that I shared last week, which also mentions Akida:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-465565
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 30 users

Frangipani

Top 20

One sentence seems a little odd, though, given we do not know of any commercially available edge medical device that uses Akida:

What do the pre-print authors mean by “While the Akida chip is commercially used in edge medical devices…”? Would this info make it through a peer-review process?

Even the Onsor NEXA glasses are still undergoing clinical trials and - provided they’ll be given the green light - are slated for release next year.
 
  • Like
  • Thinking
  • Fire
Reactions: 21 users

Frangipani

Top 20
Our friends at Neurobus and Coros Space will be two of five ESA BIC France startups pitching their ideas (neuromorphic AI onboard spacecraft and in-orbit servicing, respectively) at the International Paris Air Show on 17 June:








While Coros Space Co-Founder and CTO Iván Rodríguez Ferrández is an expert on embedded GPUs for space, I strongly suspect Akida will also play a part in their vision of future In-Orbit Servers:



 
  • Like
  • Fire
  • Love
Reactions: 14 users

Labsy

Regular
  • Like
  • Thinking
  • Fire
Reactions: 7 users

Rach2512

Regular
 
  • Thinking
  • Fire
  • Like
Reactions: 4 users

IloveLamp

Top 20
  • Like
  • Fire
  • Love
Reactions: 15 users

Tony Coles

Regular
Will 0.21 still be there come the end of the week? I’m guessing not, and I wonder why.

Yeah, that 0.21 looks suspicious. I’m sick of the games being played, but I do have an order in at 0.205, and if it gets below 0.20 in the end-of-financial-year June sale, I will purchase again and that’s it. Then I’ll wait it out till the end of the year, and hopefully by then they will start using the ASX for real announcements - licences, or what our customers are using our chips/technology in. That connection with Bascom Hunter and Onsor should have been announced at the very least.
 
  • Like
  • Fire
  • Love
Reactions: 11 users

7für7

Top 20
But you know what? Maybe they are too busy closing deals, and that’s why there’s no news… and one day we will wake up and BOOOOM… the big announcement will pop up and everything will be forgiven! And some of us will get rich! 🙋🏻‍♂️

#marketingman #positivity #ipraytogodiwillnotloosemymoney #gobrainchip
 
  • Haha
  • Love
  • Like
Reactions: 10 users

7für7

Top 20
Sorry if already posted..


 
  • Like
  • Love
  • Fire
Reactions: 8 users

equanimous

Norse clairvoyant shapeshifter goddess
One sentence seems a little odd, though, given we do not know of any commercially available edge medical device that uses Akida:

What do the pre-print authors mean by “While the Akida chip is commercially used in edge medical devices…”? Would this info make it through a peer-review process?

Even the Onsor NEXA glasses are still undergoing clinical trials and - provided they’ll be given the green light - are slated for release next year.
Hi Frangipani,

Good chance it’s through Tata.








 
  • Like
  • Love
  • Fire
Reactions: 20 users

7für7

Top 20
Guys… something is brewing… it’s scary… I’ve never experienced this kind of long period without the slightest sign from the company… are they all relaxing on Palm Beach, or are they onto something big!??? Ask for the dean
 
  • Thinking
  • Like
  • Sad
Reactions: 3 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
I asked ChatGPT whether BrainChip’s AKIDA could be used for the applications outlined in Meta’s patent, as shown below.










ChatGPT said:​

Yes, a neuromorphic processor like BrainChip’s Akida could absolutely be used for many of the applications outlined in Meta’s patent — and in some ways, it may offer significant advantages over conventional processors.

🔬 Summary of Meta’s Patent Use Cases (US20230172468)​

Meta's patent outlines biosignal monitoring in AR/smart glasses, including:
  • Heart rate and heart rate variability
  • Blood pressure estimation
  • Electrocardiogram (ECG) waveform analysis
  • Stress or fatigue detection
  • User authentication via biosignals
These involve continuous, time-series data from PPG and/or ECG sensors — which are ideal for real-time analysis with minimal latency and energy overhead.

🧠 Why Akida Is a Good Fit​

Feature | Meta’s Needs | Akida’s Strength
Event-driven signal processing | Intermittent or subtle biosignal changes (e.g., arrhythmia, stress response) | Akida excels at analyzing sparse, time-dependent data like ECG/PPG without constantly polling a CPU
Ultra-low power | Critical for all-day wear in AR/smart glasses | Akida is designed for always-on processing with minimal power draw
Edge inference | Requires local, private on-device processing | Akida enables AI inference directly on-device without cloud reliance
Time-series pattern detection | Needed for stress, fatigue, HRV anomalies | Akida supports temporal pattern recognition and can learn from evolving signal patterns using spiking neural networks
Personalization | Biosignal baselines vary per user | Akida supports incremental on-device learning for personalized AI models
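
To make the “event-driven signal processing” row above more concrete, here is a minimal, purely conceptual sketch in plain NumPy (not BrainChip’s actual tooling; the signal values and threshold are invented for illustration) of delta/threshold event encoding: samples only become events when the reading moves more than a set amount, which is exactly the property that lets an event-driven processor stay idle between changes.

```python
import numpy as np

def delta_encode(signal, threshold=0.05):
    """Emit (index, +1/-1) events only when the signal has moved more than
    `threshold` away from its level at the last emitted event."""
    events = []
    last_level = signal[0]
    for i in range(1, len(signal)):
        diff = signal[i] - last_level
        if abs(diff) >= threshold:
            events.append((i, 1 if diff > 0 else -1))
            last_level = signal[i]
    return events

# Made-up "PPG-like" trace: a small resting pulse plus one brief transient.
t = np.linspace(0, 10, 1000)
ppg = 0.02 * np.sin(2 * np.pi * 1.2 * t)   # quiet baseline, below the threshold
ppg[600:620] += 0.3                        # short anomaly worth reacting to
events = delta_encode(ppg, threshold=0.05)

print(f"{len(events)} events from {len(ppg)} samples "
      f"({100 * len(events) / len(ppg):.1f}% of the stream needs any processing)")
```

On this made-up trace only a handful of samples turn into events; the rest of the time there is simply nothing to compute, which is where the power saving comes from.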


🧩 Example Scenarios Where Akida Could Enhance Meta’s Applications​

  1. Stress Monitoring
    → Akida could detect deviations in heart rate variability or galvanic skin response over time and classify them as stress patterns.
  2. Fatigue Detection for AR Use
    → By continuously analyzing subtle biosignal shifts, Akida could alert users (or apps) to attention lapses, which is crucial in AR/VR safety.
  3. Adaptive UI/UX
    → If a user’s biometric data shows increased heart rate or agitation, the UI could adapt — dim lighting, mute notifications, etc.
  4. On-device Personal Authentication
    → Akida could analyze a user’s unique ECG or PPG pattern to authenticate without needing facial recognition or passcodes.
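
As a toy illustration of the “Personalization” row and scenarios 1 and 4 above (explicitly not Akida’s actual on-chip learning, just a self-contained NumPy sketch with invented numbers), a wearable could keep a running per-user baseline of a biosignal feature, say inter-beat intervals, and flag readings that drift far from that personal baseline:

```python
import numpy as np

class PersonalBaseline:
    """Running per-user mean/variance of a biosignal feature (e.g. inter-beat
    interval), flagging readings far from that personal baseline."""

    def __init__(self, alpha=0.05, z_limit=3.0, warmup=30):
        self.alpha = alpha        # how quickly the baseline adapts to this user
        self.z_limit = z_limit    # how many std-devs away counts as anomalous
        self.warmup = warmup      # readings used to seed the baseline before flagging
        self.count = 0
        self.mean = None
        self.var = 1e-6

    def update(self, value):
        self.count += 1
        if self.mean is None:     # first reading seeds the baseline
            self.mean = value
            return False
        delta = value - self.mean
        z = abs(delta) / np.sqrt(self.var)
        anomalous = self.count > self.warmup and z > self.z_limit
        if not anomalous:         # only learn from readings that look normal
            self.mean += self.alpha * delta
            self.var = (1 - self.alpha) * self.var + self.alpha * delta ** 2
        return anomalous

# Invented inter-beat intervals (seconds): a resting user, then a stress episode.
rng = np.random.default_rng(0)
readings = np.concatenate([rng.normal(0.85, 0.02, 200),   # this user's normal
                           rng.normal(0.60, 0.02, 20)])   # sudden elevated heart rate
monitor = PersonalBaseline()
flags = [monitor.update(v) for v in readings]
print(f"flagged {sum(flags)} of {len(flags)} readings as off-baseline")
```

A spiking, on-chip-learning implementation would look very different internally, but the behaviour being claimed is the same: the model adapts to one wearer’s normal and reacts to departures from it, all on the device.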

⚠️ Limitations​

While Akida could support or enhance these use cases:
  • Sensor interface requirements would need to be adapted (Akida does not have native PPG/ECG analog frontend — that must be paired externally).
  • Meta may already have in-house AI pipelines or use Qualcomm’s NPU, making integration dependent on broader ecosystem decisions.

✅ Conclusion​

Yes, Akida is well-suited to handle many of the AI inference tasks outlined in Meta’s patent — especially anything involving continuous, energy-efficient biosignal analysis. If Meta values privacy, low power, and real-time responsiveness in their future smart glasses, a neuromorphic edge processor like Akida could be a compelling option.
Would you like a quick visual comparison of Akida vs typical ARM/Qualcomm NPUs for this type of workload?

You said:​

yes please

ChatGPT said:​

Akida vs Conventional NPU for Smart Glasses AI Workloads




The chart above compares BrainChip's Akida processor to a conventional NPU (e.g. Qualcomm's Hexagon) across six key criteria relevant to smart glasses and biosignal analysis:
  • Power Efficiency: Akida is exceptionally power-efficient due to its neuromorphic architecture, which is crucial for wearable devices with small batteries.
  • Real-Time Latency: Akida's event-driven design allows for faster response times in signal detection, outperforming frame-based NPUs.
  • Edge Inference Capability: Both Akida and NPUs can handle edge inference, but Akida’s architecture is optimized for on-device decision-making without preprocessing overhead.
  • Adaptive Learning: Akida supports on-device learning (spiking neural networks), something traditional NPUs generally do not, making it useful for personalized biosignal monitoring.
  • Privacy: Akida processes data locally, reducing the need for cloud offloading—essential for sensitive health or biometric data.
  • Time-Series Signal Analysis: Akida excels in analyzing continuous data streams such as biosignals, ideal for predicting seizures or recognizing user states in smart glasses.
In conclusion, Akida would be a strong fit for the kinds of biosignal-driven applications described in Meta’s patent and Onsor’s glasses.

 
  • Like
  • Love
  • Fire
Reactions: 37 users

gilti

Regular
Manipulation? No, never.
 
  • Like
  • Sad
Reactions: 12 users

7für7

Top 20
In conclusion, Akida would be a strong fit for the kinds of biosignal-driven applications described in Meta’s patent and Onsor’s glasses.


Could… Akida would fit in every device, if you will… but there are some people, who are not visible, who think it’s not a good idea… and I have the feeling they have a lot of money and influence……..a….lot….of….money…….and….a….lot….of….influence……


(If you don’t have a logical explanation for something, just drop some conspiracy theories lol)

Here are some theories from MY ChatGPT 😂

🕴️ Conspiracy Theory #1:

The Cloud Mafia Wants Your Data Brain

If devices like smartphones, VR headsets, or wearables become too smart — e.g. through Akida — they no longer need the cloud. No data uploads, no subscription models, no real-time user tracking.

💰 That’s bad for business for:
• Meta
• Google
• Amazon
• Apple (only half guilty 😇)

The theory: They are blocking the spread of real edge AI because their business models rely on centralized control.
Local intelligence = loss of control.



🕵️ Conspiracy Theory #2:

Governments Love Dumb Devices

Smart edge devices that learn locally and don’t upload data are a problem for surveillance systems.
Imagine glasses that recognize who you are but don’t send anything to the cloud — no backdoor, no access for third parties.

So these kinds of technologies are “accidentally”:
• not standardized
• not subsidized
• not built into mainstream hardware



🧬 Conspiracy Theory #3:

The Competition Fears the Biological

Akida works with spikes, like real neurons.
That’s fundamentally different from classic CPUs or GPUs.
Companies that have invested billions in conventional chips don’t want this paradigm shift — because it would mean rethinking everything from scratch.

Result: Lobbying against “exotic” architectures.
And yes – many of those decision-makers have:
A… lot… of… money… and a… lot… of… influence. 😏



🧘‍♂️ Realistic Perspective:

What’s needed:
• Brave companies (startups or independent OEMs)
• Open hardware initiatives
• Pressure from developers and users who want true edge AI
 
Last edited:
  • Like
  • Fire
  • Haha
Reactions: 8 users

7für7

Top 20
I love her 😂😂😂😂😂😂

 
Last edited:
  • Like
  • Haha
  • Fire
Reactions: 4 users

7für7

Top 20
ATTENTION!!!!! ONLY FOR LONG-TERM HOLDERS!!!!!

Based on ChatGPT



🚀 Current Status & Technology of BrainChip

1. Akida Platform
• Akida™ is a fully digital, event-based edge AI processor designed for ultra-low power consumption (sometimes under 1 mW) and supports on-chip learning.
• The current version (2nd generation, March 2023) supports 8-bit weights, Vision Transformers (ViT), and Temporal Event-Based Neural Networks (TENNs) – optimized for voice, radar, and vision tasks at the edge.

2. Form Factors & Products
• BrainChip already offers developer boards in M.2 format with the AKD1000 chip – plug-and-play for prototyping.
• Additional embedded products and Edge AI box ecosystems are being promoted, as seen in the “Edge AI Box Ecosystem Launch” release.
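
For anyone wondering what “plug-and-play for prototyping” on the M.2 AKD1000 board might look like in code, below is a rough sketch based on my reading of BrainChip’s MetaTF documentation. The exact package entry points (akida.devices(), Model, map(), predict()) and the model file name are my assumptions, not something verified against a board:

```python
# Rough sketch only: assumes BrainChip's MetaTF `akida` Python package exposes
# devices()/Model/map/predict roughly as described in its docs, and that a
# converted Akida model file ("my_model.fbz") already exists. All of that is
# an assumption on my part, not verified here.
import numpy as np
import akida

devices = akida.devices()            # assumed: enumerate attached AKD1000 hardware
print("Akida devices found:", devices)

model = akida.Model("my_model.fbz")  # hypothetical pre-converted model file
if devices:
    model.map(devices[0])            # assumed: run on the board; otherwise the
                                     # package falls back to software simulation

dummy = np.zeros((1, 28, 28, 1), dtype=np.uint8)  # input shape depends on the model
out = model.predict(dummy)           # assumed inference entry point
print("output shape:", out.shape)
```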

3. Strategic Partnerships
• Key partners include Frontgrade Gaisler (aerospace), MegaChips, Edge Impulse, Prophesee, SiFive, Renesas, and Intel Foundry Services, among others.
• Use cases span gesture recognition, epilepsy-monitoring smart glasses, radar-based ISR platforms, and other real-time edge applications.



🗓️ Technology Roadmap & Milestones
• April 22, 2025 (AGM): CDO Dr. Jonathan Tapson presented the updated roadmap:
• Focus on more configurable IP for diverse edge applications.
• Expansion of TENNs and ViT integration; platform-based approach for commercial use cases.
• Embedded Vision Summit (May 20–22, 2025): CTO Dr. Peter Lewis demonstrated how state-space models and TENNs outperform Transformers in extreme edge conditions.
• Next steps include broader IP licensing to SoC makers and OEMs across sectors like automotive, wearables, defense, and space.



📈 Financials & Market Overview
• Publicly traded on the ASX (BRN) and OTC markets (OTCQX: BRCHF/BCHPY).
• As of the last earnings report: moderate revenue growth, net loss around US$ 8.4M – typical for a scaling tech company, but underlines ongoing capital needs.
• Notably, major investor LDA Capital has exercised option tranches – showing institutional interest despite volatility.



🧭 Realistic Outlook for Long-Term Holders

Area | Potential & Risks
Technology | Event-based, ultra-low power, and on-chip learning position Akida strongly for the future of edge AI. TENNs and ViTs are a solid evolution.
Market & Use Cases | Strategic partnerships open diverse markets – but real-world adoption is still in early stages. The platform strategy could take time.
Financials | Losses are expected at this stage. Success depends on timely and broad IP licensing.
Timeline | Key developments expected over the next 12–24 months: licensing deals, OEM integrations, and potentially first mass-market products.




🎯 Final Verdict – Why This Has Long-Term Potential
1. Technological lead in low-power edge AI through event-driven neuromorphic architecture.
2. Strong strategic alliances across key growth industries.
3. Focused roadmap with clear licensing and platform scale-up strategy.
4. Still: capital requirements and long sales cycles remain risks.
 
  • Like
  • Fire
  • Love
Reactions: 14 users
In conclusion, Akida would be a strong fit for the kinds of biosignal-driven applications described in Meta’s patent and Onsor’s glasses.

Wouldn’t it be great to see us in such a product; instead, we see our competitors’ products shining. MAYBE we get a run off the bench later this year, if the stars line up that is. 🤔 vetting.
 
Last edited:
  • Like
  • Sad
Reactions: 5 users

7für7

Top 20
I don’t know about you guys, but I’ve come to a conclusion: as long as @Bravo isn’t going for a run, this stock is destined to drop… day by day. You think management is clueless? Think again. They read this stuff too. And they probably say to themselves…
‘Why should we announce anything… if she’s not even showing she wants the share price to rise?’

And honestly? I actually considered selling today because of that thought.
You think it sounds irrational… but mark my words: everything is connected…. EVERYTHING!!!! 🫨
 
  • Haha
Reactions: 3 users