BRN Discussion Ongoing

Reactions: 26 users

Foxdog

Regular
Good morning from another beautiful day in Perth... 30°C already at 8 am.

I'm not too sure the above quote is 100% accurate; "other companies' chips with our IP embedded" would be a lot more accurate.

We simply don't supply chips; IP in blocks is how I understand it to be moving forward. I also understand what she is implying, and maybe I'm being a little pedantic.

And for Santa's little helpers still shaking our Christmas Tree, the only thing falling off is fluff, which we don't deal in anymore.
Our tree will never fall over no matter how much shaking you give it. Why? Because our foundations are rock solid.

See you on the other side of CES 2023. I believe that my neighbour has a meeting arranged with the BrainChip team in Las Vegas to discuss the possibility of having her engineers in the South African mining industry work with our guys to develop Akida technology for underground mining in areas such as gas sensing, predictive maintenance and vibration analysis.

I'll ask her to take some photos if possible while at CES.

Tech x (y)🎅
Legend, thanks Tech.
Merry Christmas to you and all yours and to all Chippers and all theirs 🥰
 
Reactions: 16 users
Remember that, like Sgt. Schultz, I know nothing about Transformers.

As I understand it:

Update akida_models to 1.1.8

  • Transformers pretrained models updated to 4-bits


means the model libraries (speech was the initial application of Transformers) have been updated.

There are about 44 phonemes in English, but dialect and accent will multiply this.

The update was to convert the model weights to 4-bit precision (from ... 8 bits?) so they are compatible with Akida NPUs.

This is only changing some numbers in memory - no silicon was harmed in performing this update.
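To make that idea concrete, here is a minimal sketch of symmetric weight quantization in plain NumPy. To be clear, this is a generic illustration of the technique, not BrainChip's actual akida_models/MetaTF code; the function name, toy weights and single per-tensor scale are my own assumptions.

```python
import numpy as np

def quantize_symmetric(weights, bits=4):
    """Map float weights onto signed `bits`-bit integer codes.

    Returns (codes, scale); dequantizing is just codes * scale.
    """
    qmax = 2 ** (bits - 1) - 1            # 7 for 4-bit signed values
    scale = np.abs(weights).max() / qmax  # one shared scale for the tensor
    codes = np.clip(np.round(weights / scale), -qmax - 1, qmax).astype(np.int8)
    return codes, scale

# Toy "layer": eight float weights collapse to 4-bit codes.
w = np.array([0.9, -0.31, 0.07, -0.88, 0.5, 0.0, -0.12, 0.33])
codes, scale = quantize_symmetric(w, bits=4)
reconstructed = codes * scale  # lossy approximation used at inference time
```

The point is simply that quantization rescales and rounds numbers already sitting in memory; each 4-bit code plus a shared scale stands in for a full-precision float.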

Remembering that we are selling Akida IP, implementing Transformers in Akida may be as simple as updating the model library and designing the appropriate NPU configuration, with the configuration being implemented by the Arm Cortex-M CPU as part of the set-up procedure.

Now I suppose you do need LSTM for transformers, so this will require some additional memory for previous words/sentences, so there may be some collateral damage to the silicon at this stage.

So I wonder if we have had to go back to the drawing board after implementing LSTM, to update the silicon for Transformers, which are a very recent phenomenon for Akida.
OKAY, thanks for being there yet again, and yes, at least on my reading, Transformers go hand in hand with LSTM.

At the 2021 AI Field Day, Anil Mankar said words close to "we are working on a version with a little bit of LSTM", and as you have noted from time to time, the product development graphic showed LSTM as an additional feature for AKD1500.

Just like AKD500 for white goods, AKD1500 has not been disclosed, but there is no point in adding Transformers if you don't have anywhere to use them.

Akin to selling CDs before you release the CD player. It makes no sense.

So as the little pieces of information are released it is clear that under the surface a great deal is going on that will in due course benefit the company and shareholders.

“Like a duck on the pond. On the surface everything looks calm, but beneath the water those little feet are churning a mile a minute.”​

Gene Hackman

My opinion only DYOR
FF

AKIDA BALLISTA
 
Reactions: 35 users
Resistance is futile in the end :LOL:

 
Reactions: 42 users
Resistance is futile in the end :LOL:

We could sign a deal with the biggest company ever and the shorters would still short the crap out of it. Going by the names mentioned, this share shouldn't be at this price.
 
It's a very delicate situation.

Imagine your supervisors, bosses and investors having spent billions to develop an inferior product!

You don't just go "oh, we were wrong, let's go this way". It takes a plan and time to steer that giant vessel in a different direction. You know how many jobs, manager roles and friendships would have been lost with a sharp reaction. It's a big political game as well.

I know Intel will make money and be in the chip race for edge AI. It's only a matter of time before the rest join.

I know we would love to hear them sing AKIDA, but they won't, not yet.

This partnership was the last piece of DD I needed to really feel confident in BRN succeeding. As if Arm, Merc, Renesas, MegaChips and more weren't enough 🙄

This is a sign of how quickly others will adopt.
IMO

Merry Christmas and Happy Holidays, folks.
So if Intel need us, let's hope we're going to charge them the max.
 

Dhm

Regular
One reason I can think of: if Mike Davies/Intel said this loud and clear, it would be a signal to every other company that they don't need Intel and can go straight to BrainChip.

At the moment they can continue to try to control the narrative: that Loihi is the way to neuromorphic ascendancy, that BrainChip's AKIDA IP is a niche product for limited use cases, and that if you really want it they can supply it.

But Brainchip is building itself into as many ecosystems as it possibly can in much the same way as a virus moves through a community.

Renesas is tapping out at the low end while at the same time Edge Impulse is comparing AKIDA with GPUs.

Peter van der Made has stated that the market has no understanding of the significance of MegaChips to the future success of Brainchip.

Prophesee, and the use cases it referenced for an AKIDA-Prophesee event-based intelligent sensor, picks up many of the industry directions referenced in the report I posted above.

At the same time ARM the chip supply monster is promoting AKIDA across virtually every industrial use case.

NASA is clearly exploring the use of AKIDA as an essential element of deep space exploration, and DARPA is deeply embedding the AKIDA technology in radar and other use cases via ISL and others.

Then Biotome is one of the known medical research companies exploring the use of AKIDA for this industry.

Mercedes-Benz is extolling the AKIDA advantage over all current technology options in the automotive industry.

Remember Alex the Rocket Scientist stating that he refers to autonomous vehicles as robots, because that is what they technically are. Mercedes-Benz has not simply extolled the benefits of using AKIDA for cars needing voice recognition, but for every single robotic use case from drones to personal assistants.

Carnegie Mellon University and others are now teaching AKIDA Science Fiction to the next generation of technology entrepreneurs and innovators who will populate the research labs and offices of the technology giants.

Brainchip's Board and Management are, in my opinion, brilliantly creating an environment where AKIDA is ubiquitous, and if you are not involved in some way then you are not on the right technology page.

In my opinion Intel had no choice but to join, and will try to control an uncontrollable narrative which has Renesas offering AKIDA for making low-end MCUs smart and Edge Impulse describing AKIDA as a threat to GPUs and the stuff of Science Fiction.

The eventual release of AKIDA next gen into the established and growing ecosystem will be like hitting the nitrous switch on a dragster.

My opinion only DYOR
FF

AKIDA BALLISTA
Great post FF, much appreciated by everyone here.
 
Reactions: 16 users
A science related article that shareholders may find interesting:

To summarise: it was recently discovered that the spinal cord and brainstem process sensory data before it is sent to the brain. It was already known that other body parts, like the retina, do this, but it was always presumed that signals such as touch were passed from organs like the skin directly to the brain.

In my view, this has implications for companies wanting to most effectively mimic the neuromorphic behaviour of human bodies and achieve results with ultra low latencies and power. Essentially, more Akida chips may be required.
 
Reactions: 27 users

misslou

Founding Member


Keep pulling everyone. We’re nearly there.
Can’t wait till the earth gives way and we get our giant carrot!

Merry Christmas to all x
 
Reactions: 58 users

Adam

Regular
Love any dot joining and happy to analyse and review new revelations..but..a link to a Japanese dog breed? Farfetched, methinks. Isaac's sacrifice in Hebrew is 'Akedat Yitzchak'..as he was nearly 'spiked' in sacrifice.
Akeeda you not. Maybe tis a link to a new phone..we live in hope.
All the best for the new year,
Akeedat heart.
 
Reactions: 9 users

HopalongPetrovski

I'm Spartacus!
Reactions: 9 users

Boab

I wish I could paint like Vincent
Hi @Diogenese
or anyone else that may know. Have the pending patents (in bold print) been approved?
I remember the brains trust here highlighting the importance of getting the IP rights to the JAST learning rule.

Enjoy the break and Merry Christmas to all.

October 13, 2022 7:25 AM
BrainChip Fortifies Neuromorphic Patent Portfolio with New Awards and IP Acquisition
LAGUNA HILLS, CA / ACCESSWIRE / October 12, 2022 / BrainChip Holdings Ltd (ASX:BRN) (OTCQX:BRCHF) (ADR:BCHPY), the world's first commercial producer of ultra-low power neuromorphic AI IP, has extended the breadth and depth of its neuromorphic IP with two new patents granted by the US Patent and Trademark Office (USPTO), and the acquisition of previously licensed technology from Toulouse Tech Transfer (TTT). These latest additions of technical assets reinforce BrainChip's event-based processor differentiation for high-performance, ultra-low power AI inference and on-chip learning.

The latest patents awarded to BrainChip from the USPTO include:

US 11,468,299, "An Improved Spiking Neural Network," protects the learning function of BrainChip's digital neuron circuit implemented on a neuromorphic integrated circuit/system (e.g., Akida™).
US 11,429,857, "Secure Voice Communications System," protects a system to establish secure voice communications between a local and a remote neural network device. Information is encrypted by transmitting spike timing rather than original data, rendering it useless to anyone intercepting the transmission.
BrainChip also acquired full ownership of the IP rights related to the JAST learning rule and algorithms from French technology transfer company TTT, including issued patent EP3324344 and pending patents US2019/0286944 and EP3324343. The inventions related to the acquired IP rights include pattern detection algorithms that provide BrainChip with significant competitive advantages. The company held an exclusive license for the IP prior to the acquisition.

BrainChip considers patents to be valuable IP assets that help the company preserve its global competitive advantages. Its patent portfolio now comprises 10 US, 1 European and 1 Chinese issued patents. In addition, some 29 patent applications are pending in the US, Europe, Australia, Canada, Japan, Korea, India, Brazil, Russia, Mexico, and Israel.

"The foundational neuromorphic patents we've received for AI and ML strengthen our IP product and commercial differentiation," said Sean Hehir, Chief Executive Officer at BrainChip. "Our continued leadership in neuromorphic research and architectural design continues to be rewarded as our patent portfolio grows."
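An aside on the "Secure Voice Communications System" patent above: the release gives no implementation detail, but the general principle of sending spike timing rather than original data can be sketched with a toy latency code. This is purely my own illustration of temporal coding, not the patented scheme; the T_MAX window and function names are invented for the example.

```python
import numpy as np

T_MAX = 10.0  # arbitrary time window; sender and receiver must agree on it

def to_latencies(values, t_max=T_MAX):
    """Latency-code values in [0, 1]: a bigger value fires earlier."""
    return t_max * (1.0 - values)

def from_latencies(latencies, t_max=T_MAX):
    """A receiver sharing t_max recovers the original values."""
    return 1.0 - latencies / t_max

signal = np.array([0.2, 0.9, 0.5, 0.0])   # e.g. normalized audio samples
spikes = to_latencies(signal)             # only spike times cross the channel
recovered = from_latencies(spikes)        # meaningful only with the parameters
```

An eavesdropper sees only a list of timestamps; without the agreed encoding parameters (time window, ordering, channel mapping), the times alone say little about the underlying signal.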
 
Reactions: 13 users

McHale

Regular
Yes, the timetable was always for the IP to be released in the second half of 2022, and if you have a look at Sean Hehir's AGM presentation he states it is going to happen and that a reference chip is likely to be produced next year.

I do not want to prejudice your interpretation of his exact words, but they left me wondering if the IP release was more flexible than "before 31.12.22", as until then I had thought from statements by Peter van der Made, Anil Mankar and Ken Scarince that it was a definite for 2022.

Certainly they have purchased and paid for all the third-party IP they required, as this was reported in the half-yearly report, and Ken Scarince clarified that the IP purchase covered the AKIDA 2.0 third-party IP.

I know you know how to access the AGM presentation so I would be interested to hear your take on how to understand Sean Hehir on the release date for the IP.

If it is not released before 31.12.22, I would suspect it is a tactical decision related to a customer engagement, as the IP was ready at the end of last year and it is the engineering that has been taking place this year. Going by AKD1000, the engineering clearly was not finalised when the IP was released to select customers back in 2019, and when it was made more generally available they were still working on the FPGA to ensure they could get the production of AKD1000 right the first time. Which, thanks to Anil Mankar's brilliance in this area, they did.

My opinion only DYOR
FF

AKIDA BALLISTA
Hi @Fact Finder

"For the short term, you can expect another major release of next generation IP and another reference chip prior to the next AGM".
This is verbatim what Sean said in his presentation.

However that's all I could find in the AGM Pres re Akida 2, so from what I can ascertain there are no clear timelines, other than "for the short term" - another major release of next gen IP; and the other part being another reference chip prior to next AGM.

What is "the short term"? Maybe CES, since we are still in that short-term window; how long is a piece of string? But CES would be very nice given all the public reveals about to be made there, and by my measurement of the collective will and anticipation here, that outcome would be a suitable unfoldment-type scenario, and a happy New Year. Then we get to see the Akida 2 chip in the intervening space before May, if Sean delivers on his forecasts. Who would know; as the Zen folk say: it is, it is, it is, and so on.

I know it's taken me a while to get back with this, but I still run a business and have been flat out like a lizard (it is, it is, it is); I am positively izzed at the moment. My short bursts on here are selective skim readings, which I do regularly just to try and keep across what's going on in the BRNsphere.

Something I did glean from re-reading the AGM was a deeper appreciation of what Antonio Viana brings to the table. BRN have been making impressive hires; Antonio comes across really well to me.

So I wish all here a happy Christmas, and a great New Year, which will hopefully really deliver on the promise which is so proficiently and generously pieced together by the wonderful posters here; and of course the collective genius behind those at Brainchipinc. Thank you all so much.
mc.
 
Reactions: 64 users
Drunk Santa Claus GIF
Wishing you all a Merry Christmas and a good start to the new year! May next year bring a better market situation than this year! Stay healthy and happy.

Best
7


Yeah, looks like Santa couldn't deliver this year, and he's not too happy about it either...

But a New Year is almost upon us, and the fruits of the BrainChip team's labour will begin to be shown to all.

And the World will take notice.
 
Reactions: 26 users

Sirod69

bavarian girl ;-)
TVO-Drive: Development progress towards autonomous driving

Autonomous driving is getting closer


Sensors, radar, cameras: new cars are increasingly full of innovative technology. This trend is far from over. Driver assistance systems will continue to gain in importance. Autonomous driving is getting closer and closer. Bavaria 🥰😘🥳, as an automotive hub, is also researching the cars of tomorrow, for example at the French supplier Valeo.
In Kronach, the company operates an important center for the development and testing of autonomous driving.


 
Reactions: 23 users

Makeme 2020

Regular
Socionext promoting BRAINCHIP at the upcoming CES
Provider of industry-leading networking & imaging technologies & solutions

Socionext America, Inc.




Leading-Edge Automotive Custom SoC Solutions at CES 2023​

Featuring advanced technologies in automotive display, AI, and smart sensing​


Overview​

Socionext showcases its automotive custom SoC technologies at the CES 2023 Vehicle Tech and Advanced Mobility Zone, located in the Las Vegas Convention Center, North Hall, Booth 10654. CES runs from January 5-8, 2023.
Socionext’s advanced custom solutions are designed to help OEMs and tier-one automakers achieve complete ownership of key differentiating technologies with an added competitive edge.
These custom SoCs enable a wide range of applications, including ADAS sensors, central computing, networking, in-cabin monitoring, satellite connectivity, and infotainment.
Socionext’s Automotive Custom SoC Solutions and Services
Socionext’s Automotive Custom SoC Solutions and Services


Featured Technologies and Solutions​


Smart Display Controller for Automotive Remote Display​

Socionext's highly integrated ISO26262-certified SC1721/ SC1722/ SC1723 Series Graphics Display Controllers feature built-in safety mechanisms that enable error-free, safety-critical content to meet the safety standards required by today’s multi-display cockpit applications.


Low Power 60GHz RF CMOS Radar Sensors with Embedded Sensing Engine and Built-in Antenna for In-Cabin Monitoring​

Socionext has created a variety of smart RADAR solutions, including 24GHz and 60GHz, and has developed a roadmap showcasing future technologies.
The Socionext 60GHz RF CMOS sensor has features and capabilities to support multiple in-cabin uses, including seat occupancy monitoring, child presence detection, and theft prevention.


Advanced AI Solutions for Automotive​


Socionext has partnered with artificial intelligence provider BrainChip to develop optimized, intelligent sensor data solutions based on BrainChip's Akida® processor IP.
BrainChip’s flexible AI processing fabric IP delivers neuromorphic, event-based computation, enabling ultimate performance while minimizing silicon footprint and power consumption. Sensor data can be analyzed in real-time with distributed, high-performance and low-power edge inferencing, resulting in improved response time and reduced energy consumption.

 
Reactions: 55 users
I found this article interesting; it talks about the market depth for our partnerships with Prophesee and MagikEye (not directly).

There is also a big list at the bottom of a multitude of fields Brainchip could be included in.

Enjoy 😀

Smart Machine Vision Comes to the Edge, With Close to 200 Million Cameras to be Deployed by 2027​




Smart machine vision is on the job in factories, warehouses, and shipping centers, and ripe for development in smart cities, smart healthcare, and smart transportation

22 Dec 2022
Machine Vision (MV) uses technology that enables industrial machines to “see” and analyze tasks and make rapid decisions based on what the system sees. MV is fast becoming one of the most central technologies in automation. Given that this technology is now merging with Machine Learning (ML) to lead the transition to Industry 4.0, the possibilities are enormous, especially at the edge. Global technology intelligence firm ABI Research forecasts that by 2027, total shipments of camera systems will reach 197 million, with revenue of US$35 billion.

“The shift from machines that can automate simple tasks to autonomous machines that can “see” to optimize elements for extended periods will drive new levels of industrial innovation. This is the innovation that ML offers to MV (also often known as computer vision). ML can augment classic machine vision algorithms by employing the range and reach of neural network models, thus expanding machine vision far beyond visual inspection and quality control, the locus classicus of good, old-fashioned computer vision,” explains David Lobina, Artificial Intelligence and Machine Learning Analyst at ABI Research.
Of all the trends in the ML market, ML at the edge of computing has the most exciting applications and benefits – namely, in those devices that are part of embedded systems and the Internet of Things.

Smart manufacturing is perhaps the most straightforward case, where smart cameras, embedded sensors, and powerful computers can bring ML analyses to every process step. Smart machine vision is on the job in factories, warehouses, and shipping centers, aiding and assisting human workers by handling the more mundane tasks, freeing workers to use their expertise to focus on the essential parts.

The market is also ripe for development in smart cities, smart healthcare, and smart transportation, with ATOS (in cities), Arcturus (in healthcare), and Netradyne (in transportation) as some of the key vendors in these sectors.

As in other cases of edge ML applications, the best way for the technology to advance is through a combination of hardware and software solutions and employing information-rich data. It is through a holistic approach of how all these factors can merge and combine that will achieve fruitful results. Vendors are aware that they need to provide a competitive product. In cases involving sensitive or private data, such as healthcare, a whole package should provide hardware (cameras, chips, etc.), software, and an excellent way to analyze the data. The “whole package” approach is perhaps not the most common example in the market. Still, vendors must be increasingly aware of how their offerings can mesh with other solutions, often requiring hardware-agnostic software and software-agnostic data analysis. “This is a crucial point in the case of smart cities, healthcare, and transportation, especially regarding what machine vision can achieve in all these settings. For edge MV, software and hardware vendors, as well as service providers, will start taking an expansive view of the sector,” Lobina concludes.
These findings are from ABI Research’s Edge ML-Based Machine Vision Software and Services application analysis report. This report is part of the company’s Artificial Intelligence and Machine Learning research service, which includes research, data, and ABI Insights. Based on extensive primary interviews, Application Analysis reports present an in-depth analysis of key market trends and factors for a specific technology.
About ABI Research
ABI Research is a global technology intelligence firm delivering actionable research and strategic guidance to technology leaders, innovators, and decision makers around the world. Our research focuses on the transformative technologies that are dramatically reshaping industries, economies, and workforces today.
ABI Research delivers groundbreaking research and strategic guidance to help clients understand rapidly changing technologies. Since 1990, we have partnered with hundreds of leading technology brands, cutting-edge companies, forward-thinking government agencies, and innovative trade groups worldwide. We help our clients create real business outcomes.
For more information about ABI Research’s services, contact us at +1.516.624.2500 in the Americas, +44.203.326.0140 in Europe, +65.6592.0290 in Asia-Pacific, or visit www.abiresearch.com.


Related Research: Edge ML-Based Machine Vision Software and Services (Research Report | 4Q 2022 | AN-4957)
Related Service: AI & Machine Learning


RESEARCH SERVICES

5G​

5G Core & Edge Networks

5G Devices, Smartphones & Wearables

5G Markets

5G & Mobile Network Infrastructure

CYBER & DIGITAL SECURITY​

Citizen Digital Identity

Cybersecurity Applications

Digital Payment Technologies

Industrial Cybersecurity

IoT Cybersecurity

Telco Cybersecurity

Trusted Device Solutions

INDUSTRIAL & MANUFACTURING​

Industrial, Collaborative & Commercial Robotics

Industrial & Manufacturing Markets

Industrial & Manufacturing Technologies

IOT​

IoT Hardware & Devices

IoT Markets

IoT Networks & Services


AI & Machine Learning

Augmented & Virtual Reality

Consumer Technologies

Distributed & Edge Computing

Location Technologies

Metaverse Markets & Technologies

Satellite Communications

Smart Homes & Buildings

Smart Mobility & Automotive

Smart Urban Infrastructure

Supply Chain Management & Logistics

Sustainable Technologies

Wi-Fi, Bluetooth & Wireless Connectivity


 
Reactions: 29 users
The following paper is a good read, and while it does not mention the name of the spike provider, AKIDA is part of the SCaN program and is also linked to cognitive communications in other NASA proposals.

The BrainChip patent approved in the middle of this year, allowing for encrypted communications between AKIDA chips, also fits nicely into what is to be a closed, secure communications network. The decision not to announce this patent also fits with it being of interest to NASA.

Also for @MC🐠 there is a mention of his Amazon Web Services.

There are so many gossamer threads to the spider webs being revealed in all the connections that it seems inevitable that success has become entwined, captured and devoured by the AKIDA Science Fiction Beast:

“Applying the Cognitive Space Gateway to Swarm Topologies

Abstract—NASA’s future vision for interplanetary networking includes a lunar network, Cube Satellite (CubeSat) constellations, and deep space robotic missions, comprising what could be viewed as a network of networks. Delay-tolerant networking (DTN) architecture and protocols provide a standard network layer among these varying scenarios and mitigate many challenges of the space environment, such as long delays, unplanned service interruptions, and asymmetric links. The Cognitive Space Gateway (CSG) is a routing method in a DTN architecture that uses spiking neural networks as the learning element to optimize routing decisions in a complex environment.
This work aims to further develop cognitive networking technologies in several critical areas, including DTN, the CSG algorithm, SmallSat swarm topologies, and cloud services. The CSG algorithm is tested in a realistic scenario in which the emulated network topology is based on a SmallSat swarm. The emulation environment will be built upon a commercial cloud service, such as Amazon Web Services (AWS) Elastic Compute Cloud. This work investigates the ability of such a platform to enable a flexible, lower-maintenance approach to creating a multi-hop network outside of a physical laboratory. The cloud platform will provide a secure environment allowing for collaboration among government and academic entities.
Index Terms—Cognitive Networking, CubeSat Swarms, Routing, Network Modeling, Cloud Computing”


Merry Christmas to all visionary Brainchip shareholders. 🎄🎄🎄

My opinion only DYOR
FF

AKIDA BALLISTA
 
Reactions: 40 users
Katana has been mentioned a few times by posters previously.

The video below sounds like they are directly describing Akida IP?

“Runs on battery - potentially for years”
“Analyses (the video) locally on the chip… it’s not streaming it off into the cloud for analysis”
“It’s power efficient to only stream that content when there’s something meaningful to be seen”


 
Reactions: 33 users
I found this article interesting and talks about the market depth for our partnership with prophesee and MagikEye (not directly).

There is also a big list at the bottom of a multitude of fields Brainchip could be included in.

Enjoy 😀

Smart Machine Vision Comes to the Edge, With Close to 200 Million Cameras to be Deployed by 2027​




Smart machine vision is on the job in factories, warehouses, and shipping centers, and ripe for development in smart cities, smart healthcare, and smart transportation

22 Dec 2022
Machine Vision (MV) uses technology that enables industrial machines to “see” and analyze tasks and make rapid decisions based on what the system sees. MV is fast becoming one of the most central technologies in automation. Given that now this technology is merging with Machine Learning (ML) to lead the transition to Industry 4.0, the possibilities are enormous, especially at the edge. According to global technology intelligence firm ABI Research, forecasts that by 2027, total shipments of camera systems will reach 197 million, with revenue of US$35 billion.

“The shift from machines that can automate simple tasks to autonomous machines that can “see” to optimize elements for extended periods will drive new levels of industrial innovation. This is the innovation that ML offers to MV (also often known as computer vision). ML can augment classic machine vision algorithms by employing the range and reach of neural network models, thus expanding machine vision far beyond visual inspection and quality control, the locus classicus of good, old-fashioned computer vision,” explains David Lobina, Artificial Intelligence and Machine Learning Analyst at ABI Research.
Of all the trends in the ML market, at the edge of computing has the most exciting applications and benefits – namely, in those devices that are part of embedded systems and the Internet of Things.

Smart manufacturing is perhaps the most straightforward case, where smart cameras, embedded sensors, and powerful computers can bring ML analyses to every process step. Smart machine vision is on the job in factories, warehouses, and shipping centers, aiding and assisting human workers by handling the more mundane tasks, freeing workers to use their expertise to focus on the essential parts.

The market is also ripe for development in smart cities, smart healthcare, and smart transportation, with ATOS (in cities), Arcturus (in healthcare), and Netradyne (in transportation) as some of the key vendors in these sectors.

As in other cases of edge ML applications, the best way for the technology to advance is through a combination of hardware and software solutions employing information-rich data. Fruitful results will be achieved through a holistic approach to how all these factors merge and combine. Vendors are aware that they need to provide a competitive product. In cases involving sensitive or private data, such as healthcare, a whole package should provide hardware (cameras, chips, etc.), software, and an excellent way to analyze the data.

The “whole package” approach is perhaps not the most common example in the market. Still, vendors must be increasingly aware of how their offerings can mesh with other solutions, often requiring hardware-agnostic software and software-agnostic data analysis. “This is a crucial point in the case of smart cities, healthcare, and transportation, especially regarding what machine vision can achieve in all these settings. For edge MV, software and hardware vendors, as well as service providers, will start taking an expansive view of the sector,” Lobina concludes.
These findings are from ABI Research’s Edge ML-Based Machine Vision Software and Services application analysis report. This report is part of the company’s Artificial Intelligence and Machine Learning research service, which includes research, data, and ABI Insights. Based on extensive primary interviews, Application Analysis reports present an in-depth analysis of key market trends and factors for a specific technology.
About ABI Research
ABI Research is a global technology intelligence firm delivering actionable research and strategic guidance to technology leaders, innovators, and decision makers around the world. Our research focuses on the transformative technologies that are dramatically reshaping industries, economies, and workforces today.
ABI Research provides pioneering research and strategic guidance to help clients understand rapidly changing technologies. Since 1990, we have partnered with hundreds of leading technology brands, cutting-edge companies, visionary government agencies, and innovative trade groups around the world. We help our clients create real business outcomes.
For more information about ABI Research’s services, contact us at +1.516.624.2500 in the Americas, +44.203.326.0140 in Europe, +65.6592.0290 in Asia-Pacific, or visit www.abiresearch.com.

Related Research​

Edge ML-Based Machine Vision Software and Services

Research Report | 4Q 2022 | AN-4957

200 million cameras at $10.00 each is a 2 billion dollar market, and the real figure is likely much larger since some cameras will be far more expensive. Even one percent of a 2 billion dollar market is 20 million dollars.
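The back-of-envelope sizing above can be sketched in a few lines. The $10.00 average selling price, the one percent share, and the 97% margin are the post's illustrative assumptions, not ABI Research figures:

```python
# Back-of-envelope market sizing using the post's illustrative assumptions.
cameras = 200_000_000      # camera shipments by 2027 (rounded from ABI's 197M forecast)
asp = 10.00                # assumed average selling price per camera, USD
share = 0.01               # hypothetical 1% market capture
margin = 0.97              # profit margin claimed in the post

market = cameras * asp     # total addressable market
revenue = market * share   # revenue at 1% share
profit = revenue * margin  # profit at a 97% margin

print(f"Market: ${market/1e9:.1f}B, 1% share: ${revenue/1e6:.0f}M, profit: ${profit/1e6:.1f}M")
```

A higher average selling price, or the much larger $35 billion revenue figure ABI actually forecasts, scales these numbers up proportionally.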

Because some here are new to why I keep bringing up one percent, here is a little bit of history.

Back at the HC cesspit, a couple of the more intelligent worms attacked the idea that Brainchip would capture one hundred percent of the market, faced with competition from the tech giants Nvidia, Intel, Samsung, Qualcomm, Google and Apple.

I approached this argument from the perspective that the Edge market, both existing and still to be thought of, was so huge that capturing even a tiny one percent of it would be hugely rewarding for Brainchip at a profit margin of 97%.

I then took this argument to another level by issuing a challenge: put forward a compelling argument as to why Brainchip, with the first commercial neuromorphic chip and IP in the market and a three year lead over its nearest competitor, would not capture one percent of the addressable market.

Strangely no one ever took up this challenge.

Now, one further point to consider, which I add to my one percent argument.

Development Time Lines:

We have seen from Sony and Prophesee, Brainchip and Renesas, and Brainchip and Socionext that it takes approximately three years to develop a product from the point of technology adoption.

So consider this: Brainchip stands with a three year technology lead, soon to become five years with AKIDA 2.0.

If a company wants to implement a neuromorphic solution today, it will be three years before it can buy an alternative to AKIDA, and then a further three years to develop the desired product.

What this means is that a company's failure to take up an AKIDA solution today will mean a potential delay of six years or more before it can bring its product idea to market.

A company in the technology space that waits six years to capitalise on a product idea will most likely have badly missed the boat.

So while I use the one percent argument, I personally, like many here, have much higher expectations than one tiny percent of the addressable market.

Indeed, realistically, if you consider Renesas, Socionext, MegaChips and Prophesee as standalone engagements, I believe it is reasonable to argue they have already secured for Brainchip a share of the market over the next year or so that is far greater than one percent.

I will leave it to others to ponder the percentage points Edge Impulse, ARM and Intel will add to this secured percentage over the same time frame before Mercedes Benz, Valeo and Ford start to contribute in 2024.

It all comes together to make having a plan for ‘when not if’ compulsory.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
  • Fire
Reactions: 68 users