Legend, thanks Tech. Good morning from another beautiful day in Perth... 30°C already at 8am.
I'm not too sure that the above quote is 100% accurate; "other companies' chips with our IP embedded" would be a lot more accurate.
We simply don't supply chips; IP in blocks is how I understand it to be moving forward. I also understand what she is implying, and maybe I'm being a little pedantic.
And for Santa's little helpers still shaking our Christmas Tree, the only thing falling off is fluff, which we don't deal in anymore.
Our tree will never fall over no matter how much shaking you give it. Why? Because our foundations are rock solid.
See you on the other side of CES 2023. I believe that my neighbour has a meeting arranged with the Brainchip team in Las Vegas to discuss the possibility of having her engineers in the South African mining industry work with our guys to develop Akida technology for underground mining in areas such as gas sensing, predictive maintenance and vibration analysis.
I'll ask her to take some photos if possible while at CES.
Tech x
OKAY, thanks for being there yet again, and yes, at least on my reading Transformers go hand in hand with LSTM. Remember that, like Sgt. Schultz, I know nothing about Transformers.
As I understand it:
Update akida_models to 1.1.8
- Transformers pretrained models updated to 4-bits
means the model libraries (speech was the initial application of Transformers) have been updated.
There are about 44 phonemes in English, but dialect and accent will multiply this.
The update was to convert the model libraries to 4-bit values (from ... 8 bits?) so they are compatible with Akida NPUs.
This is only changing some numbers in memory - no silicon was harmed in performing this update.
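To make that concrete, here is a minimal sketch of what "updated to 4-bits" amounts to, assuming a simple symmetric int8-to-int4 requantisation. The values, scale and rounding are purely illustrative; the real akida_models/MetaTF tooling will differ in detail.

```python
import numpy as np

# Hypothetical 8-bit signed weights as they might sit in a model library.
w8 = np.array([-120, -64, -3, 0, 7, 45, 127], dtype=np.int8)

# Symmetric requantisation to 4-bit signed integers (range -8..7).
# This really is just "changing some numbers in memory": scale, round, clip.
scale = 127 / 7
w4 = np.clip(np.round(w8 / scale), -8, 7).astype(np.int8)

# The matching scale factor travels with the weights so they can be
# approximately reconstructed at inference time.
w_reconstructed = w4 * scale

print(w8)              # original int8 weights
print(w4)              # their 4-bit counterparts
print(w_reconstructed) # approximate reconstruction of the originals
```

The point is simply that the stored weights shrink and move into a range the 4-bit Akida NPUs can consume; nothing about the hardware itself changes.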
Remembering that we are selling Akida IP, implementing Transformers in Akida may be as simple as updating the model library and designing the appropriate NPU configuration, with the configuration being implemented by the Arm Cortex-M processor as part of the setup procedure.
Now I suppose you do need LSTM for Transformers, so this will require some additional memory for previous words/sentences, so there may be some collateral damage to the silicon at this stage.
So I wonder if we have had to go back to the drawing board after implementing LSTM to update the silicon for Transformers, which are a very recent phenomenon for Akida.
We could sign a deal with the biggest company ever and the shorters would still short the crap out of it. With the names mentioned, this share shouldn't be at this price.
So if Intel need us, let's hope we're going to charge them the max. It's a very delicate situation.
Imagine your supervisors, bosses and investors spent billions to develop an inferior product!
You don't just go "oh, we were wrong, let's go this way." It takes a plan and time to steer that giant vessel in a different direction. You know how many jobs, manager roles and friends would have been booted with a sharp reaction. It's a big political game also.
I know Intel will make money and be in the chip race for edge AI. It's only a matter of time before the rest join.
I know we would love to hear them sing AKIDA, but they won't. Not yet.
This partnership was the last DD I needed to really feel confident in BRN succeeding, as Arm, Merc, Renesas, MegaChips and more weren't quite enough on their own.
This is a sign of how quickly others will adopt.
IMO
Merry Christmas and Happy Holidays, folks.
Great post FF, much appreciated by everyone here. One reason I can think of is that if Mike Davies/Intel says this loud and clear, it is a signal to every other company that they don't need Intel and can go straight to Brainchip.
At the moment they can continue to try to control the narrative that Loihi is the way to neuromorphic ascendancy, that Brainchip AKIDA IP is a niche chip for limited use cases, and that if you really want it they can supply it.
But Brainchip is building itself into as many ecosystems as it possibly can in much the same way as a virus moves through a community.
Renesas is tapping out at the low end while at the same time Edge Impulse is comparing AKIDA with GPUs.
Peter van der Made has stated that the market has no understanding of the significance of MegaChips to the future success of Brainchip.
Prophesee and the use cases it referenced for an AKIDA-Prophesee event-based intelligent sensor pick up many of the industry directions referenced in the report I posted above.
At the same time ARM, the chip supply monster, is promoting AKIDA across virtually every industrial use case.
NASA is clearly exploring the use of AKIDA as an essential element of deep space exploration, and DARPA is deeply embedding the AKIDA technology in radar and other use cases via ISL and others.
Then Biotome is one of the known medical research companies exploring the use of AKIDA for this industry.
Mercedes Benz is extolling the AKIDA advantage over all current technology options in the automotive industry.
Remember Alex the Rocket Scientist stating that he refers to autonomous vehicles as robots because that is what they are technically. Mercedes Benz has not simply extolled the benefits of using AKIDA for cars needing voice recognition but for every single robotic use case from drones to personal assistants.
Carnegie Mellon University and others are now teaching AKIDA Science Fiction to the next generation of technology entrepreneurs and innovators who will populate the research labs and offices of the technology giants.
Brainchip’s Board and Management are, brilliantly in my opinion, creating an environment where AKIDA is ubiquitous and, if you are not involved in some way, then you are not on the right technology page.
In my opinion Intel had no choice but to join, and will try to control an uncontrollable narrative which has Renesas offering AKIDA for making low-end MCUs smart and Edge Impulse describing AKIDA as a threat to GPUs and the stuff of Science Fiction.
The eventual release of AKIDA next gen into the established and growing ecosystem will be like hitting the nitrous switch on a dragster.
My opinion only DYOR
FF
AKIDA BALLISTA
Love any dot joining and happy to analyse and review new revelations... but... a link to a Japanese dog breed? Far-fetched, methinks. Isaac's sacrifice in Hebrew is 'Akedat Yitzchak'... as he was nearly 'spiked' in sacrifice. Surely not?
Keep pulling everyone. We’re nearly there.
Can’t wait till the earth gives way and we get our giant carrot!
Merry Christmas to all x
Hi @Fact Finder
Yes, the timetable was always for the IP to be released in the second half of 2022, and if you have a look at Sean Hehir's AGM presentation he states it is going to happen and that a reference chip is likely to be produced next year.
I do not want to prejudice your interpretation of his exact words, but they left me wondering if the IP release date was more flexible than "before 31.12.22", as from statements by Peter van der Made, Anil Mankar and Ken Scarince I had thought until then that it was a definite for 2022.
Certainly they have purchased and paid for all the third party IP they required, as this was reported in the half yearly report, and Ken Scarince clarified that the IP purchase covered the AKIDA 2.0 third party IP.
I know you know how to access the AGM presentation so I would be interested to hear your take on how to understand Sean Hehir on the release date for the IP.
If it is not released before 31.12.22, I would suspect it is a tactical decision related to a customer engagement, as the IP was ready at the end of last year and it is the engineering that has been taking place this year. Going by AKD1000, the engineering clearly was not finalised when the IP was released to select customers back in 2019, and when it was made more generally available they were still working on the FPGA to ensure they could get the production of AKD1000 right the first time, which, thanks to Anil Mankar's brilliance in this area, they did.
My opinion only DYOR
FF
AKIDA BALLISTA
Wish you all a Merry Christmas and a good start to the new year! May we see a better market situation next year than this year! Stay healthy and happy!
Best
7
200 million cameras at $10.00 each is a 2 billion dollar market, but it is highly likely much larger as some cameras will be much more expensive, and just one percent of a 2 billion dollar market is 20 million dollars. I found this article interesting; it talks about the market depth for our partnership with Prophesee and MagikEye (not directly).
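A quick back-of-the-envelope sketch of those numbers (the $10 per camera is just my floor assumption; the 197 million units and US$35 billion figures come from the ABI release below):

```python
# Rough market sizing using round numbers from the post and the ABI release.
cameras = 200_000_000        # ~197 million camera systems forecast by 2027, rounded
floor_price = 10.00          # conservative $10 per camera (assumption only)

floor_market = cameras * floor_price        # $2,000,000,000 at $10 each
one_pct_floor = 0.01 * floor_market         # $20,000,000 for a 1% share

abi_revenue = 35_000_000_000                # ABI's forecast camera-system revenue
one_pct_abi = 0.01 * abi_revenue            # $350,000,000 for a 1% share

print(f"Floor market ${floor_market:,.0f}, 1% = ${one_pct_floor:,.0f}")
print(f"ABI forecast ${abi_revenue:,.0f}, 1% = ${one_pct_abi:,.0f}")
```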
There is also a big list at the bottom of a multitude of fields Brainchip could be included in.
Enjoy
Smart Machine Vision Comes to the Edge, With Close to 200 Million Cameras to be Deployed by 2027
Smart machine vision is on the job in factories, warehouses, and shipping centers, and ripe for development in smart cities, smart healthcare, and smart transportation
22 Dec 2022
Machine Vision (MV) uses technology that enables industrial machines to “see” and analyze tasks and make rapid decisions based on what the system sees. MV is fast becoming one of the most central technologies in automation. Given that this technology is now merging with Machine Learning (ML) to lead the transition to Industry 4.0, the possibilities are enormous, especially at the edge. Global technology intelligence firm ABI Research forecasts that by 2027, total shipments of camera systems will reach 197 million, with revenue of US$35 billion.
“The shift from machines that can automate simple tasks to autonomous machines that can “see” to optimize elements for extended periods will drive new levels of industrial innovation. This is the innovation that ML offers to MV (also often known as computer vision). ML can augment classic machine vision algorithms by employing the range and reach of neural network models, thus expanding machine vision far beyond visual inspection and quality control, the locus classicus of good, old-fashioned computer vision,” explains David Lobina, Artificial Intelligence and Machine Learning Analyst at ABI Research.
Of all the trends in the ML market, ML at the edge has the most exciting applications and benefits – namely, in those devices that are part of embedded systems and the Internet of Things.
Smart manufacturing is perhaps the most straightforward case, where smart cameras, embedded sensors, and powerful computers can bring ML analyses to every process step. Smart machine vision is on the job in factories, warehouses, and shipping centers, aiding and assisting human workers by handling the more mundane tasks, freeing workers to use their expertise to focus on the essential parts.
The market is also ripe for development in smart cities, smart healthcare, and smart transportation, with ATOS (in cities), Arcturus (in healthcare), and Netradyne (in transportation) as some of the key vendors in these sectors.
As in other cases of edge ML applications, the best way for the technology to advance is through a combination of hardware and software solutions, employing information-rich data. It is through a holistic approach to how all these factors merge and combine that fruitful results will be achieved. Vendors are aware that they need to provide a competitive product. In cases involving sensitive or private data, such as healthcare, a whole package should provide hardware (cameras, chips, etc.), software, and an excellent way to analyze the data. The “whole package” approach is perhaps not the most common example in the market. Still, vendors must be increasingly aware of how their offerings can mesh with other solutions, often requiring hardware-agnostic software and software-agnostic data analysis. “This is a crucial point in the case of smart cities, healthcare, and transportation, especially regarding what machine vision can achieve in all these settings. For edge MV, software and hardware vendors, as well as service providers, will start taking an expansive view of the sector,” Lobina concludes.
These findings are from ABI Research’s Edge ML-Based Machine Vision Software and Services application analysis report. This report is part of the company’s Artificial Intelligence and Machine Learning research service, which includes research, data, and ABI Insights. Based on extensive primary interviews, Application Analysis reports present an in-depth analysis of key market trends and factors for a specific technology.
About ABI Research
ABI Research is a global technology intelligence firm delivering actionable research and strategic guidance to technology leaders, innovators, and decision makers around the world. Our research focuses on the transformative technologies that are dramatically reshaping industries, economies, and workforces today.
ABI Research delivers groundbreaking research and strategic guidance to help clients understand rapidly changing technology. Since 1990, we have partnered with hundreds of leading technology brands, cutting-edge companies, forward-thinking government agencies, and innovative trade groups around the world. We help our clients create real business results.
For more information about ABI Research’s services, contact us at +1.516.624.2500 in the Americas, +44.203.326.0140 in Europe, +65.6592.0290 in Asia-Pacific, or visit www.abiresearch.com.
RESEARCH SERVICES
5G
5G Core & Edge Networks
5G Devices, Smartphones & Wearables
5G Markets
5G & Mobile Network Infrastructure
CYBER & DIGITAL SECURITY
Citizen Digital Identity
Cybersecurity Applications
Digital Payment Technologies
Industrial Cybersecurity
IoT Cybersecurity
Telco Cybersecurity
Trusted Device Solutions
INDUSTRIAL & MANUFACTURING
Industrial, Collaborative & Commercial Robotics
Industrial & Manufacturing Markets
Industrial & Manufacturing Technologies
IOT
IoT Hardware & Devices
IoT Markets
IoT Networks & Services
AI & Machine Learning
Augmented & Virtual Reality
Consumer Technologies
Distributed & Edge Computing
Location Technologies
Metaverse Markets & Technologies
Satellite Communications
Smart Homes & Buildings
Smart Mobility & Automotive
Smart Urban Infrastructure
Supply Chain Management & Logistics
Sustainable Technologies
Wi-Fi, Bluetooth & Wireless Connectivity