GazDix
I found time to sit down and read the latest whitepaper. After finishing, one thing jumped out at me: the use-cases they mention for Akida are, I believe, already implemented. Here are some of the highlights:
After digging deep into the history of AI, it mentions: 'Neuromorphic edge AI silicon is already enabling people to seamlessly interact with smarter devices'. 'Already' is an interesting word choice for this sentence. I am imagining lots of testing of Akida happening. The use of the plural, broad term 'people' is deliberate. With my rose-tinted glasses on, it hints at consumers, but of course it could just as easily mean only three CTOs from start-ups.
It concluded: 'These include cars that personalize cabin settings for individual drivers, smart farms, automated factories and warehouses, advanced speech and facial recognition applications, and robots that use sophisticated sensors to see, hear, smell, touch, and taste'. Because of MB and the EQXX, BrainChip are happy to use the EQXX use-cases as examples, along with the other use-cases we already know about. Many other hypothetical use-cases have been thrown around, by them and by us, yet these specific ones were chosen. Why? I think because they are already in use, not still in the development stage.
This one was really specific: 'Field hospitals in disaster zones can deploy medical robots with advanced edge AI capabilities'. This intrigues me. Where? I wonder.
I am excited for the near future. In the latest podcast as well, Accenture's CTO and Hehir were saying Edge AI will take off in 2023/2024. That is now!
The quarterly could be out any time now (AXE just released theirs). I'm not expecting much. Our Annual Report should also be released in February (judging by previous years). I'm getting a little FOMO that these might be the last days we can buy at such low prices. Exciting times.
Cheers all,