BRN Discussion Ongoing

Adam

Regular
Hey GUYS

I'M BACK. Long story, and I have so much catching up to do. But all is well now and I'm back :) Can someone summarise in a nutshell what I missed? :cool:
We're all here. Everyone is doing great dot-joining to 5 million companies. Some downrampers, but as an LTH I'm gonna wait till the NDAs are signed off and we see a formal statement from Brainchip; then we buy the champagne. Here endeth the sermon.
 
  • Like
  • Fire
Reactions: 11 users

Getupthere

Regular


Consumer Products Could Soon Feature Neuromorphic Sensing

Article by: Sally Ward-Foxton


In this interview with Prophesee CEO Luca Verre, we discuss Prophesee's approach to commercializing its retina-inspired sensors and where the technology will go from here.
What does “neuromorphic” mean today?
“You will get 10 different answers from 10 different people,” laughed Luca Verre, CEO of Prophesee. “As companies take the step from ‘this is what we believe’ to ‘how can we make this a reality,’ what neuromorphic means will change.”
Most companies doing neuromorphic sensing and computing have a similar vision in mind, he said, but implementations and strategies will be different based on varying product, market, and investment constraints.
“The reason why… all these companies are working [on neuromorphic technologies] is because there is a fundamental belief that the biological model has superior characteristics compared to the conventional,” he said. “People make different assumptions on product, on system integration, on business opportunities, and they make different implementations… But fundamentally, the belief is the same.”
Luca Verre, CEO of Prophesee (Source: Prophesee)
Verre’s vision is that neuromorphic technologies can bring technology closer to human beings, which ultimately makes for a more immersive experience and allows technologies such as autonomous driving and augmented reality to be adopted faster.
“When people understand the technology behind it is closer to the way we work, and fundamentally natural, this is an incredible source of reassurance,” he said.
Which markets first?
Prophesee is already several years into its mission to commercialize the event-based camera using its proprietary dynamic vision sensor technology. The company has collaborated with camera leader Sony to make a compact, high-resolution event-based camera module, the IMX 636. In this module, the photodiode layer is stacked directly on top of the CMOS layer using Sony’s 3D die stacking process.
According to Verre, the sector closest to commercial adoption of this technology is industrial machine vision.
“Industrial is a leading segment today because historically we pushed our third-generation camera into this segment, which was a bigger sensor and more tuned for this type of application,” he said. “Industrial has historically been a very active machine vision segment, in fact, it is probably one of the segments that adopted the CCD and CMOS technologies at the very beginning… definitely a key market.”
The Prophesee and Sony IMX 636 is a fourth-generation product. Prophesee said future generations will reduce pixel pitch and ease integration with conventional computing platforms (Source: Prophesee)
The second key market for the IMX 636 is consumer technologies, driven by the shrink in size enabled by Sony’s die-stacking process. Consumer applications include IoT cameras, surveillance cameras, action cameras, drones, consumer robots, and even smartphones. In many cases, the event-based camera is used alongside a full-frame camera, detecting motion so that image processing can be applied to capture better quality images, even when the subject is moving.
“The reason is very simple: event-based cameras are great to understand motion,” he said. “This is what they are meant for. Frame-based cameras are more suited to understanding static information. The combination of the dynamic information from an event-based camera and static information from a frame-based camera is complementary if you want to capture a picture or video in a scene where there’s something moving.”
Event data can be combined with full-frame images to correct any blur on the frame, especially for action cameras and surveillance cameras.
“We clearly see some traction in this area, which of course is very promising, because the volume typically associated with this application can be quite substantial compared to industrial vision,” he said.
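To make the fusion idea concrete, here is a minimal sketch of the event-accumulation step (illustrative only; the (x, y, t, polarity) event format and the function name are assumptions, not Prophesee's API). Events that fire during a frame's exposure are binned into a per-pixel activity map, and a deblur pass can then be limited to the pixels that were actually moving:

```python
import numpy as np

def motion_map(events, height, width, t_start, t_end):
    """Count the events each pixel fires during a frame's exposure
    window [t_start, t_end); busy pixels were moving during capture."""
    acc = np.zeros((height, width), dtype=np.int32)
    for x, y, t, polarity in events:  # polarity ignored: only activity matters
        if t_start <= t < t_end:
            acc[y, x] += 1
    return acc

# A deblur pass can then be restricted to pixels where the count exceeds
# a threshold, leaving static regions of the full frame untouched.
```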
Prophesee is also working with a customer on automotive driver monitoring solutions, where Verre said event-based cameras bring advantages in terms of low-light performance, sensitivity, and fast detection. Applications here include eye-blink detection, eye tracking or face tracking, and micro-expression detection.
Approach to commercialization
Prophesee’s EVK4 evaluation kit (Source: Prophesee)
Prophesee has been working hard on driving commercialization of event-based cameras. The company recently released a new evaluation kit (EVK4) for the IMX 636. This kit is designed for industrial vision with a rugged housing but will work for all applications (Verre said several hundred of these kits have been sold). The company’s Metavision SDK for event-based vision has also recently been open-sourced in order to reduce friction in the adoption of event-based technology. The Metavision community has around 5,000 registered members today.
“The EVK is a great tool to further push and evangelize the technology, and it comes in a very typical form factor,” Verre said. “The SDK hides the perception of complexity that every engineer or researcher may have when testing or exploring a new technology… Think about engineers that have been working for a couple of decades on processing images that now see events… they don’t want to be stretched too much out of their comfort zone.”
New to the Metavision SDK is a simulator to convert full frames into events, to help designers transition between the way they work today and the event domain. Noting a reluctance of some designers to move away from full frames, Verre said the simulator is intended to show them there’s nothing magic about events.
“[Events are] just a way of capturing information from the scene that contains much more temporal precision compared to images, and is actually much more relevant, because typically you get only what is changing,” he said.
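For readers wondering what such a simulator does, here is a minimal sketch of the standard DVS emission model it approximates (an illustration, not Prophesee's Metavision implementation): a pixel emits an event whenever its log-intensity drifts from the level at which it last fired by more than a contrast threshold, so only changing pixels produce output.

```python
import numpy as np

def frames_to_events(frames, timestamps, threshold=0.2):
    """Convert grayscale frames into DVS-style (x, y, t, polarity) events.
    A pixel fires when its log-intensity moves more than `threshold`
    away from the level at which it last fired."""
    eps = 1e-6  # avoid log(0)
    ref = np.log(frames[0].astype(np.float64) + eps)
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame.astype(np.float64) + eps)
        diff = log_i - ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for y, x in zip(ys, xs):
            events.append((x, y, t, 1 if diff[y, x] > 0 else -1))
            ref[y, x] = log_i[y, x]  # reset the reference at fired pixels
    return events

# Static pixels never cross the threshold, so they emit nothing:
# "you get only what is changing".
```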
How Prophesee’s event-based cameras work (click to enlarge) (Source: Prophesee)
The simulator can also reconstruct full image frames from event data, which he says people find reassuring.
“The majority of customers don’t pose this challenge any longer because they understand that they need to see from a different perspective, similar to when they use technology like time of flight or ultrasound,” he said. “The challenge is when their perception is that this is another image sensor… for this category of customer, we made this tool that can show them the way to transition stepwise to this new sensing modality… it is a mindset shift that may take some time, but it will come.”
Applications realized in the Prophesee developer community include restoring some sight for the blind, detecting and classifying contaminants in medical samples, particle tracking in research, robotic touch sensors, and tracking space debris.
Hardware roadmap
In terms of roadmap, Prophesee plans to continue development of both hardware and software, alongside new evaluation kits, development kits, and reference designs. This may include system reference designs which combine Prophesee sensors with specially developed processors. For example, Prophesee partner iCatch has developed an AI vision processor SoC that interfaces natively with the IMX 636 and features an on-chip event decoder. Japanese AI core provider DMP is also working with Prophesee on an FPGA-based system, and there are more partnerships in the works, said Verre.
“We see that there is growing interest from ecosystem partners at the SoC level, but also the software level, that are interested in building new solutions based on Prophesee technology,” he said. “This type of asset is important for the community, because it is another step towards the full solution — they can get the sensor, camera, computing platform, and software to develop an entire solution.”
The evolution of Prophesee’s event-based vision sensors (click to enlarge) (Source: Prophesee)
Where does event-based sensor hardware go from here? Verre cited two key directions the technology will move in. The first is further reduction of pixel size (pixel pitch) and overall reduction of the sensor to make it suitable for compact consumer applications such as wearables. The second is facilitating the integration of event-based sensing with conventional SoC platforms.
Working with computing companies will be critically important to ensure next-generation sensors natively embed the capability to interface with the computing platform, which simplifies the task at the system level. The result will be smarter sensors, with added intelligence at the sensor level.
“We think events make sense, so let’s do more pre-processing inside the sensor itself, because it’s where you can make the least compromise,” Verre said. “The closer you get to the acquisition of the information, the better off you are in terms of efficiency and low latency. You also avoid the need to encode and transmit the data. So this is something that we are pursuing.”
As foundries continue to make progress in the 3D stacking process, stacking in two or even three layers using the most advanced CMOS processes can help bring more intelligence down to the pixel level.
How much intelligence in the pixel is the right amount?
Verre said it’s a compromise between increasing the cost of silicon and having sufficient intelligence to make sure the interface with conventional computing platforms is good enough.
“Sensors don’t typically use advanced process nodes, 28nm or 22nm at most,” he said. “Mainstream SoCs use 12nm, 7nm, 5nm, and below, so they’re on technology nodes that can compress the digital component extremely well. The size versus cost equation means at a certain point it’s more efficient, more economical [to put the intelligence] in the SoC.”
A selection of applications for Prophesee event-based cameras (click to enlarge) (Source: Prophesee)
There is also a certain synergy to combining event-based sensors with neuromorphic computing architectures.
“The ultimate goal of neuromorphic technology is to have both the sensing and processing neuromorphic or event-based, but we are not yet there in terms of maturity of this type of solution,” he said. “We are very active in this area to prepare for the future — we are working with Intel, SynSense, and other partners in this area — but in the short term, the mainstream market is occupied by conventional SoC platforms.”
Prophesee’s approach here is pragmatic. Verre said the company’s aim is to try to minimize any compromises to deliver benefits that are superior to conventional solutions.
“Ultimately we believe that events should naturally stream asynchronously to a compute architecture that is also asynchronous in order to benefit fully in terms of latency and power,” he said. “But we need to be pragmatic and stage this evolution, and really capitalize on the existing platforms out there and work with key partners in this space that are willing to invest in software-hardware developments and to optimize certain solutions for certain markets.”
This article was originally published on EE Times.
Sally Ward-Foxton covers AI technology and related issues for EETimes.com and all aspects of the European industry for EE Times Europe magazine. Sally has spent more than 15 years writing about the electronics industry from London, UK. She has written for Electronic Design, ECN, Electronic Specifier: Design, Components in Electronics, and many more. She holds a Master's degree in Electrical and Electronic Engineering from the University of Cambridge.
 
  • Like
  • Fire
  • Love
Reactions: 18 users

Earlyrelease

Regular
Well, my takeaway from today must be the definition of the employment term "hybrid".
I look forward to applying this to my retirement plans. Thus, at a $10 SP, I will swap between sightseeing and indulging in fine food and drinks at a variety of workplaces, aka locations which may or may not be restricted to WA, hopefully consisting of palm trees, beaches and stunning scenery.
🌴🛩️🛳️🏕️⛩️🛤️🕌
 
  • Like
  • Love
  • Fire
Reactions: 19 users

chapman89

Founding Member
Put your happy pants back on - April 2022 was before June 2022 if I remember correctly:-

https://www.prophesee.ai/2022/06/20/brainchip-partners-with-prophesee/

BrainChip Partners with Prophesee Optimizing Computer Vision AI Performance and Efficiency


Laguna Hills, Calif. – June 14, 2022 – BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of neuromorphic AI IP, and Prophesee, the inventor of the world’s most advanced neuromorphic vision systems, today announced a technology partnership that delivers next-generation platforms for OEMs looking to integrate event-based vision systems with high levels of AI performance coupled with ultra-low power technologies.
Hi Dio, Luca told me back in October 2022 that the relationship between Prophesee & Brainchip started back in 2021… despite the official announcement in June 2022.
 
  • Like
  • Fire
  • Love
Reactions: 64 users

buena suerte :-)

BOB Bank of Brainchip
Sorry....off piste, but I just witnessed a new Australian tennis star win just then. Alexei Popyrin. Amazing game, amazingly humble.
Absolutely Dhm.... What a game ... Great Lad .. looking forward to his next match:cool:
 
  • Like
  • Fire
Reactions: 8 users

Boab

I wish I could paint like Vincent
“A ‘Level 3’ system (semi-autonomous system), whether at 60, 80 or 120 kilometres per hour, which constantly switches off in the tunnel, switches off when it rains, switches off in the dark, switches off in fog – what’s that supposed to mean? No customer buys it.

I know a Co that has the solution.

 
  • Like
  • Fire
  • Haha
Reactions: 16 users

Jumpchooks

Regular
 
  • Like
  • Haha
  • Love
Reactions: 17 users

skutza

Regular

They still need AKIDA desperately though. If he says "get my tools", he may get the bag, or he may get the tool from Motley Fool. One-shot learning to tell the difference between tools!
 
  • Like
  • Haha
Reactions: 12 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Put your happy pants back on - April 2022 was before June 2022 if I remember correctly:-

https://www.prophesee.ai/2022/06/20/brainchip-partners-with-prophesee/

BrainChip Partners with Prophesee Optimizing Computer Vision AI Performance and Efficiency


Laguna Hills, Calif. – June 14, 2022 – BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of neuromorphic AI IP, and Prophesee, the inventor of the world’s most advanced neuromorphic vision systems, today announced a technology partnership that delivers next-generation platforms for OEMs looking to integrate event-based vision systems with high levels of AI performance coupled with ultra-low power technologies.

Dear Dodgy-knees,

I just took my PJ's off and put my happy pants back on and I also took the cork back out of the champers just in case! Here are two extracts from an article dated Thursday 6 October 2022 which are leading me to think, hope and pray that iCatch may have incorporated Akida. If @Stable Genius is right when he said iCatch are "SOC designers so my understanding is they still needed to get their IP from somewhere", then where else would they be getting their IP that can perform all of the bits and bobs outlined in the two extracts below? See the link at the bottom of the post for the full article.

I mean, does Sony Semiconductor Solutions have its own AI Deep Learning Accelerator? Can it protect the driver's privacy, monitor their attentiveness and their health status through multi-sensor fusion and AI edge computing and even take control over the car when needed?

(two screenshot extracts from the article attached)


I don't want to get too excited. Heck, who am I kidding? I want to get REALLY, REALLY EXCITED!!!!! 🥳


 
  • Like
  • Love
  • Fire
Reactions: 39 users

Sirod69

bavarian girl ;-)
Don't know if we are involved? Thought it's perhaps interesting for some of you 🙋‍♀️


DIYA SOUBRA
Market Entry Wizard at Arm - IoT, Edge Compute, Machine learning, Smart Home, Matter, Ambient AI, Digital Transformation


Apple released #Matter smart home hub

PRESS RELEASE, January 18, 2023

Apple introduces the new HomePod with breakthrough sound and intelligence

Delivering incredible audio quality, enhanced Siri capabilities, and a safe and secure smart home experience

 
  • Like
  • Thinking
  • Fire
Reactions: 14 users

Diogenese

Top 20
Hi Dio, Luca told me back in October 2022 that the relationship between Prophesee & Brainchip started back in 2021… despite the official announcement in June 2022.
Thanks Jesse,

I remember when you posted this now - the dendrites had not connected before.

Of course they would have been in contact before the announcement, so at least 6 months earlier.

Anil's comment in June 2022 was:
“We’ve successfully ported the data from Prophesee’s neuromorphic-based camera sensor to process inference on Akida with impressive performance,” said Anil Mankar, Co-Founder and CDO of BrainChip. “This combination of intelligent vision sensors with Akida’s ability to process data with unparalleled efficiency, precision and economy of energy at the point of acquisition truly advances state-of-the-art AI enablement and offers manufacturers a ready-to-implement solution.”

This reads like they had not actually soldered the Prophesee DVS to the Akida SoC, but had run a Prophesee DVS data file on Akida SoC, and the performance was "impressive".

Similarly, they would have been in contact with iCatch for some time before.

https://www.prophesee.ai/2022/04/19/icatch-prophesee-collaboration-ai-vision-processor/

"Now iCatch has collaborated with Prophesee on Event-based Metavision sensing projects that integrated iCatch’s V57 AI vision processor with the new Sony Semiconductor Solutions (“SSS”) stacked Event-based Vision Sensor IMX636, realized in collaboration between SSS and Prophesee."

They had run the iCatch V57 AI chip with the Sony/Prophesee Event-based Vision Sensor IMX636. This would have entailed some physical circuit design to connect the processor to the DVs chip, so the start of cooperation between Prophesee and iChat would have pre-dated the Prophesee/BrainCHip hook-up.

So I checked on the iCatch V57 AI processor which was used with Prophesee:
https://www.icatchtek.com/Upload/202207251104172758601.pdf

A group of CV (computer vision) engines, which comprise an optical flow processing (OPF) accelerator, a matrix arithmetic engine (MAE) with DSP functions and a high-performance NPU engine, can manipulate the image and video data in real time to support edge computing and intelligent processing, such as pedestrian detection, vehicle detection, distance extraction, behavior recognition, gesture recognition, or even action recognition, and much more. The embedded USB 3.2 Gen1 device interface supports ultra-high-speed, high-resolution video data transfer.

2.5. Neural network accelerator

  • High performance 1.2 TOPS NPU engine
  • Supports weight/bias quantization using UINT8, INT8, INT16, Float16, BFloat16, and post-training quantization
  • MAE engine – pre/post DSP accelerator
iCatch are using 8-bit and 16-bit MACs, so several times slower and more power-hungry than Akida.
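For context, "post-training quantization" in that datasheet refers to schemes like the following minimal sketch (a generic illustration, not iCatch's implementation): weights trained in floating point are mapped onto an 8-bit integer grid that the MAC units then consume.

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric per-tensor post-training quantization to INT8."""
    scale = np.max(np.abs(weights)) / 127.0  # largest weight maps to +/-127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.random.randn(64, 64).astype(np.float32)
q, scale = quantize_int8(w)
print("max error:", np.max(np.abs(w - dequantize(q, scale))))  # roughly scale/2
```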

As Luca Verre said:

"By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs looking to leverage edge-based visual technologies as part of their product offerings, said Luca Verre, CEO and co-founder of Prophesee."
 
  • Like
  • Love
  • Fire
Reactions: 55 users

Diogenese

Top 20
Dear Dodgy-knees,

I just took my PJ's off and put my happy pants back on and I also took the cork back out of the champers just in case! Here are two extracts from an article dated Thursday 6 October 2022 which are leading me to think, hope and pray that iCatch may have incorporated Akida. If @Stable Genius is right when he said iCatch are "SOC designers so my understanding is they still needed to get their IP from somewhere", then where else would they be getting their IP that can perform all of the bits and bobs outlined in the two extracts below? See the link at the bottom of the post for the full article.

I mean, does Sony Semiconductor Solutions have its own AI Deep Learning Accelerator? Can it protect the driver's privacy, monitor their attentiveness and their health status through multi-sensor fusion and AI edge computing and even take control over the car when needed?


I don't want to get too excited. Heck, who am I kidding? I want to get REALLY, REALLY EXCITED!!!!! 🥳


Hi Bravo,

By all means keep the champers flowing and keep your happy pants on, but I'm afraid @Stable Genius is leading you up the garden path.

iCatch have a very complex chip, the V57 AI processor (see above), which includes their proprietary NPU.

Unfortunately for them, they have used a design from the last millennium for their NPU.

Bottoms up!


PS: If iCatch have a contract with Sony, we may have to wait for the contract to expire, so drink slowly.
 
Last edited:
  • Like
  • Haha
  • Love
Reactions: 39 users
Hi Bravo,

By all means keep the champers flowing and keep your happy pants on, but I'm afraid @Stable Genius is leading you up the garden path.

iCatch have a very complex chip, the V57 AI processor (see above), which includes their proprietary NPU.

Unfortunately for them, they have used a design from the last millennium for their NPU.

Bottoms up!

Thanks @Diogenese.

I give up. Going back to shutting my mouth, reading and learning for a while. I’m out of my depth!

Edit: deleted my last post as it was wrong info.
:cry:
 
Last edited:
  • Like
  • Love
Reactions: 14 users

Diogenese

Top 20
  • Haha
  • Like
  • Love
Reactions: 28 users

Sirod69

bavarian girl ;-)
Tim Llewellynn
CEO/Co-Founder of NVISO Human Behaviour AI | President Bonseyes Community Association | Coordinator Bonseyes AI Marketplace | IBM Beacon Award Winner #metaverse #edgeai #decentralizedai
6 minutes ago


#neuromorphic hardware is going to be a key enabler for AI software, delivering exceptional customer experiences for the future of robots and cars that are safe, secure, and eco-friendly. It's great to see successful pioneers like Mercedes-Benz AG and their leaders Markus Schäfer take the "bull by the horns" to see how these technologies could transform their industry. At NVISO, as a pure AI software company, we are a strong believer in these next-generation ultra-low power hardware technologies and are excited that the #extremeedge continues to gather momentum within the automotive industry.

#leaders #ai #future #software #automotiveindustry

 
  • Like
  • Fire
  • Love
Reactions: 41 users

cassip

Regular
SP today weaker, like in AUS, ~ -3%, volume lower again.

Yesterday I came back to Rambus once again concerning the security topic, which seems to be one of the most important factors. Lots of research to do; for those who are interested, some hints:

„Many Rambus security IP solutions have been certified to meet the FIPS 140, ISO 26262, and Common Criteria standards.“


For example, automotive:

„Vehicle systems and the semiconductors used within them represent some of today’s most complex electronics. In the drive to autonomous vehicles, increasingly sophisticated electronic systems are being developed for powertrain and vehicle dynamics, advanced driver assistance systems, vehicle-to-everything connectivity, infotainment, and in-vehicle experience. In addition to higher levels of performance, these systems must meet automotive functional safety requirements as specified by ISO 26262.“

 
  • Like
  • Fire
Reactions: 14 users

cassip

Regular
Anil Mankar was asked at the 2021 AI Field Day if Brainchip was obtaining ASIL approval for AKIDA, and he said no, that is too expensive, we are leaving it for our customers. Well, once again the truth of Brainchip's statements is highlighted by Socionext's reveal of ISO 26262 Automotive Safety Approval.

This milestone approval is in itself huge as it confirms that Mercedes Benz and others will have the ability to expand AKIDA’s use to the full extent technically possible.

My opinion only DYOR
FF

AKIDA BALLISTA
P.S.: see FF's post of December
 
  • Fire
  • Like
Reactions: 10 users

Diogenese

Top 20
The Qualcomm video today is doing my head in. I lack the technical knowledge or experience, but I get obsessed with finding answers, so I keep researching to get a result, which on this subject is to me like trying to read a book in an unknown foreign language.

On Qualcomm's website there are numerous new videos showing what the new Snapdragon 8 can do, and the features are remarkably similar to Akida's. If it's not us then it's unfortunately a good competitor, as they already have a large market share and have sold 2 billion AI products to date.

This is their webpage which has the videos and describes their latest Snapdragon 8 Gen 2.




The article I have pasted below is from Qualcomm, dated 17/11/22, and discusses the Hexagon AI processor, which from my research goes back many years, so it's not new technology, but likely improved year on year.

Traditionally, laptop performance has been measured by CPU and GPU, but on-device AI processing is now a critical third measure. The integrated Qualcomm Hexagon AI processor in Snapdragon compute platforms offloads compute intensive AI tasks from the CPU and GPU to dramatically increase performance on the device and deliver new user experiences. In fact, we're excited to share that with the pre-release Procyon AI Inference Benchmark, results show the Hexagon AI processor scores up to 5x faster than the competitor's CPU and up to 2.5x faster than the competitor's GPU.





This article below is so far the best I have found; it discusses both neural networks and transformers. However, our Akida 2.0, which was possibly going to contain the transformers, has not been made yet, so that rules it out from being on the new Snapdragon 8 Gen 2.

Qualcomm Snapdragon 8 Gen 2 Delivers More AI For Mobile

Jim McGregor, Contributor, Tirias Research
Nov 16, 2022, 04:41am EST

The Qualcomm Snapdragon 8 Gen 2 SoC for flagship smartphones (Source: Qualcomm)
The Snapdragon Tech Summit is a multi-day event that showcases the latest mobile technology Qualcomm has to offer. This is the second year that Qualcomm has held simultaneous events in China and Hawaii, as well as streaming the keynote addresses. Day 1 of the Snapdragon Tech Summit kicked off with the introduction of the latest smartphone system-on-chip (SoC) for smartphones – the Snapdragon 8 Gen 2. As expected, it delivers improvements in performance and efficiency for camera, connectivity, gaming, sound, and security. But the biggest punch comes from the use of artificial intelligence (AI) in just about every area. The company went so far as to call it “purpose built for AI.”


Qualcomm uses all of the Snapdragon SoC’s processing elements for AI processing and calls the combination of these processing elements the “AI engine.” Among the enhancements incorporated into the AI engine was a dedicated power plane and a doubling of the tensor processing cores within the Hexagon processor. The result is a 4.35x improvement in performance and an equally impressive 60% improvement in performance per watt efficiency. Qualcomm also added support for transformer neural network models which are critical for applications like natural language processing (NLP) for speech-to-text and text-to-speech translation. The Hexagon can splice the neural NLP model into smaller elements to run on micro tiles allowing for more efficient use of the processing cores. Additionally, Qualcomm added support for Int4 data structures. In many cases, this lower precision data structure can be used by neural network models like computational photography image enhancement without a noticeable loss of accuracy while improving speed and power efficiency. The end result is faster and more efficient processing of neural network models.

With the Snapdragon 8 Gen 2, Qualcomm is also introducing the Hexagon Direct Link – a physical link between the Hexagon processor and the other AI processing cores for fast, real-time processing. According to the company, this will enable the first cognitive processing by the image signal processor (ISP) in the Snapdragon 8 Gen 2.


@Diogenese
I saw Qualcomm put out a white paper trying to standardise on 8 bits, whereas PVDM did a blog recently saying 4 bits is enough. I THINK the part I bolded above, where Qualcomm added support for Int4 data structures, means they added support for 4 bits, which is contrary to their own white paper and I THINK supports what PVDM's neural network plan is?
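For what Int4 support buys in practice, here is a generic sketch (not Qualcomm's Hexagon implementation) of 4-bit post-training quantization with nibble packing: two INT4 weights fit in each byte, halving memory and bandwidth relative to INT8 at the cost of a much coarser grid (15 levels instead of 255):

```python
import numpy as np

def quantize_int4(weights):
    """Symmetric post-training quantization to INT4 (levels -7..7)."""
    scale = np.max(np.abs(weights)) / 7.0
    q = np.clip(np.round(weights / scale), -7, 7).astype(np.int8)
    return q, scale

def pack_nibbles(q):
    """Pack pairs of INT4 values into single bytes: two weights per byte."""
    flat = q.reshape(-1)
    assert flat.size % 2 == 0, "pad to an even count before packing"
    lo = (flat[0::2] & 0x0F).astype(np.uint8)
    hi = (flat[1::2] & 0x0F).astype(np.uint8)
    return lo | (hi << 4)

q, scale = quantize_int4(np.random.randn(128).astype(np.float32))
packed = pack_nibbles(q)  # 64 bytes of storage instead of 128
```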

That's all I've got for dots to support Akida being in Snapdragon 8 Gen 2, but there's contradictory info re the transformer NN, as we haven't made them yet.

Hope it makes sense.

As @Adam said earlier tonight: rather than trying to do all this dot joining, I might be better off just waiting for the announcement. Wise words!
Hi SG,

Qualcomm talk a good processor.

The Snapdragon 8 Gen 2 has AI acceleration based on Qualcomm Hexagon processor hardware.


Artificial Intelligence:

Qualcomm® Adreno™ GPU

Qualcomm® Kryo™ CPU

Qualcomm® Hexagon™ Processor

• Fused AI Accelerator Architecture

• Hexagon Tensor Accelerator

• Hexagon Vector eXtensions

• Hexagon Scalar Accelerator

• Hexagon Direct Link

• Support for mixed precision (INT8+INT16)

• Support for all precisions (INT4, INT8, INT16, FP16)

• Micro Tile Inferencing

Qualcomm® Sensing Hub

• Dual AI Processors for audio and sensors

• Always-Sensing camera


Groundbreaking AI:

Our AI Engine includes the Qualcomm® Hexagon™ Processor, with revolutionary micro tile inferencing and faster Tensor accelerators for up to 4.35x faster AI performance than its predecessor. Plus,
support for INT4 precision boosts performance-per-watt by 60% for sustained AI inferencing.


The use of 8-bit and upwards indicates the use of ALU/MAC (multiply-accumulate) circuits, which is probably CNN, and certainly not SNN.
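To illustrate that distinction with a toy sketch (illustrative only, not any vendor's implementation): a MAC-based layer multiplies every input by its weight on every inference, while an event-driven neuron with binary spike inputs just sums the weights of the inputs that fired, with no multiplies and no work for silent inputs.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal(256).astype(np.float32)  # one neuron's weights

# MAC style (CNN): 256 multiplies plus 256 adds on every inference.
x = rng.standard_normal(256).astype(np.float32)
mac_out = np.dot(x, w)

# Event-driven style (SNN, binary spikes): additions only, and only at
# the ~10% of inputs that spiked; silent inputs cost nothing.
spikes = rng.random(256) < 0.1
snn_out = w[spikes].sum()
```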

When ARM and SiFive incorporate Akida, they will probably have better comparative performance.

Intel's IFS is already offering Akida IP.

Socionext already have a CPU/GPU/SNN processor with Akida IP on the drawing board.

If Qualcomm lose their legal stoush with ARM, having missed the Akida bus, they could be well behind the 8 ball.
 
  • Like
  • Fire
  • Love
Reactions: 43 users

White Horse

Regular
Hi All.
Almost too much dot-joining gymnastics to absorb today.
As far as Prophesee's business with Sony and Qualcomm is concerned, if Prophesee are as keen on our tech as they proclaim, I can't see their customers settling for second best, especially if it is perceived that doing so would put them at a competitive disadvantage.
If they are of the same opinion as Prophesee regarding the perfect union with the Akida technology, why would they let Prophesee talk them into an inferior product?
I'm sure that Sony would find a way around any contracts that are not in their best interests.
 
  • Like
  • Love
  • Fire
Reactions: 27 users

Sirod69

bavarian girl ;-)
Markus Schäfer
Member of the Board of Management of Mercedes-Benz Group AG, Chief Technology Officer, Development & Procurement
1 hour ago


I took part in a fascinating panel discussion today on the important topic of traffic safety in India, which has the worst record in the world for road crashes and accounts for 10% of global accident fatalities.

Global Sustainability Dialogue India is part of a series of events we are holding around the world to address hot topics relating to sustainability. At each of these, my board colleagues Ola Källenius, Renata Jungo Brüngger and I are joined by invited speakers and panellists from the private and public sector as well as academia.

Participating digitally in the live event in Delhi, I teamed up with digital data entrepreneur Pramad Jandhyala, Professor Geetam Tiwari from IIT Delhi and my colleague Jochen Feese from our Mercedes accident research department to explore the opportunities for leveraging data and technology to deliver sustainable change.

I talked a lot here about the role of digitalisation, connectivity and automation in improving traffic safety. However, in a country where the fleet of intelligent vehicles is still tiny and smart infrastructure in its infancy, we take a different approach. And data is a crucial part of that. Data we have gathered over many years from our work in other regions can combine with localised data to help build a bigger picture, target action and inform decisions.

India’s dynamic start-up ecosystem can also make an important contribution. My team at Mercedes-Benz Research and Development India headed by Manu Saale works extensively with local start-ups on a wide range of initiatives, with safety being one of the top priorities. I believe that innovation based on a combination of global expertise and granular ideas (not to mention a bit of lateral thinking) can produce truly meaningful outcomes.

And when it comes to changing habits, data can help authorities make a compelling case for the effectiveness of straightforward measures such as seatbelts, helmets and airbags. In fact, #MercedesBenz has been running its SAFE ROADS Initiative in India since 2015 to help raise awareness of road safety among young people.

How do you think data and technology can bring change to traffic safety in India? Let me know your thoughts in the comments.

#TrafficSafety #Technology #Innovation
 
  • Like
  • Love
Reactions: 18 users