BRN Discussion Ongoing

Straw

Guest
Hasn't helped my painting before Christmas!
Ya and all my new seedlings which should be powering away are just sitting there with an exclamation bubble above them stating 'What the.......?'
 
I know Fact Finder always says 1% of this or that, but surely BRN can achieve 5% of all that 🤞
All the talk on here sounds like we've got a far greater share than that.
 

Diogenese

Top 20
Brainchip agrees!
Yes. I think the BrainChip/SiFive partnership started a FOMO snowball picking up ARM and Intel with a bit of help from MegaChips.

The Intel Foundry Services IP Alliance:
https://www.intel.com/content/www/us/en/foundry/ifs-accelerator/ip-alliance.html

We could soon see products including Brainchip/ARM SNNs made by Intel Foundry Services (IFS) competing with products including Brainchip/SiFive SNNs made by IFS, and both competing with products including Brainchip/Intel SNNs made by IFS.

Just as Brainchip covered the RISC-IV and RISC-V bases with ARM and SiFive, Intel has now spread its bets to cover the red and the black.
 

Learning

Learning to the Top 🕵‍♂️
So from your perspective, Dio,

Should BrainChip also position itself within Intel's Pathfinder Ecosystem, to spread BrainChip's wings wider?


Learning
 

RobjHunt

Regular
I totally agree DB.
How the hell making a profit from a falling share price is legal beggars belief, because of the obvious - manipulation.
The problem with shorters is that over the years they have perfected the art of manipulation, and unless there is an exceptional announcement, they will continue to bend us over.................
But what can we poor old retailers do? Don't let the pricks get any of your shares.
All good things come to those who wait.🥳
Baron..
Correct!

Pantene.
 

Diogenese

Top 20
Supersleuthing Learning,

SiFive and Intel have been collaborating for "a multi-year history", so this would be a solid launch pad to incorporate Akida.

SiFive

“Intel and SiFive, the founder and leader in RISC-V computing, share a multi-year history of collaboration, and we are pleased that Intel selected the SiFive® Performance™ P550 core both as the heart of the Horse Creek development platform and for use with Intel® Pathfinder FPGA-based development tools,” said Phil Dworsky, SiFive Global Head of Strategic Alliances. “Intel® Pathfinder for RISC-V* gives software developers a head start in preparation for the highly anticipated Horse Creek boards, which are on track for delivery this year. SiFive is excited to work with Intel on this project, to engage with mutual customers, and together to fuel innovation in the fast-growing RISC-V ecosystem.”

https://pathfinder.intel.com/news/i...new-capabilities-for-pre-silicon-development/

In fact, they could make an immediate start with the Akida simulator (nee ADE) in MetaTF.

SANTA CLARA, C.A. — August 30, 2022 — Intel® Pathfinder for RISC-V* is launching today to transform the way SOC architects and system software developers define new products. It allows for a variety of RISC-V cores and other IP to be instantiated on FPGA and simulator platforms, with the ability to run industry leading operating systems and tool chains within a unified IDE.

SiFive is still very much the new kid on the block, so having Intel's endorsement will supercharge its market penetration, and AI/one-shot on-chip ML is all the rage.

I think we are looking at event horizon spaghettifying acceleration.
 

goodvibes

Regular
Pat Gelsinger, Intel: Our real-time deepfake detection platform uses FakeCatcher technology to analyze biometric signs like 'blood flow' in video pixels—a world-first and a prime example of @Intel's work in responsible AI.

 

Deadpool

Did someone say KFC
A 3 way Dodgy, I like it, I like it a lot.

Whatever way you look at it, it's just a win, win, win for BRN
 

Learning

Learning to the Top 🕵‍♂️
Thanks Dio,

I really think BrainChip is a suitable candidate to join Intel's Pathfinder Ecosystem, as both of BrainChip's partners, Renesas and SiFive, are within the Pathfinder Ecosystem.

As you say, "AI/one-shot on chip ML is all the rage" ❤️


Learning,
Learning everyday.
 
Hey @Diogenese

Have you ever done a bit of a dive into iCatch Tech?

They're the other partner/collaborator of Prophesee; they released their info around April this year and we did ours around June.

I was looking into them and one of their news items about the IMX636 which is Sony's sensor, then coupled with Metavision and iCatch V57 SoC.

I found some info sheets, as the AI details were a bit light elsewhere. The Vi37 references a CNN if I read right, but the V57 and others just reference an NPU IP.

Appears they're not using low-bit weights - more like 8-16 bit(?)

But Akida can run higher bit widths though if wanted, yeah?

Just musing about any possible way they could use our IP as CNN2SNN on their chip which stacks with Sony IMX636 sensor and uses Metavision SDK or whether we would have to come through the Metavision side?
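For anyone unfamiliar, CNN2SNN is the converter in BrainChip's MetaTF flow that maps quantized CNNs onto Akida's event-based hardware. As a toy intuition only (my sketch, not BrainChip's actual method), one classic way to express CNN activations as events is rate coding:

```python
def rate_code(activations, timesteps=8):
    """Toy rate coding: map each CNN activation (assumed already
    normalised to [0, 1], e.g. post-ReLU and scaled) to a spike count
    over a fixed time window, so stronger activations fire more often.
    Deterministic for clarity; real SNN encodings vary."""
    spikes = []
    for a in activations:
        clamped = max(0.0, min(1.0, a))   # ReLU-style clamp to [0, 1]
        spikes.append(round(clamped * timesteps))
    return spikes

# A saturated activation spikes every timestep; zero never spikes.
print(rate_code([0.0, 0.25, 0.5, 1.0]))  # → [0, 2, 4, 8]
```

The point is that once activations become sparse event counts, downstream layers only do work when spikes arrive, which is where the power savings come from.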

News link below and site with chip products.

News


Products


Couple snips from news.

Now iCatch has collaborated with Prophesee on Event-based Metavision sensing projects that integrated iCatch’s V57 AI vision processor with the new Sony Semiconductor Solutions ("SSS") stacked Event-based Vision Sensor IMX636, realized in collaboration between SSS and Prophesee.

iCatch built the OpenMVCam development platform for all algorithm partners and ODM customers to design a variety of AI products, systems and applications on many market segments like surveillance, smart healthcare, in-cabin monitor system, smart home, industrial automation, smart city and so on.

SSS’s Event-based Vision Sensor can be the input source for iCatch AI vision processor, based on iCatch built-in NPU acceleration engine and proprietary CV processing hardware engine. It can integrate Prophesee Metavision Intelligence SDK and other 3rd party’s machine vision algorithms to support end customers’ AI vision applications such as DMS and OMS in automotive in-cabin, patient/elder fall detection in home healthcare and hospitals, intruder detection in home surveillance, anomaly detection in industrial automation, gesture control in smart home appliance, eye tracking in AR/VR and so on.
 
Was just skimming through some recent Edge Impulse (EI) vids on YT and this one was uploaded 18 hrs ago.

They are doing some modelling on a Texas Instruments board and it wasn't about using BRN, however I spotted the MetaTF Model now in Beta mode in Impulse Studio...woo hoo


Spotted around the 32min mark.

 

Diogenese

Top 20
Hi Fmf,

I didn't find any NN-related patents for iCatch.

The V57 is an image signal processor (ISP) which includes an NN core:

A group of CV engines, comprising an optical flow processing (OPF) accelerator, a matrix arithmetic engine (MAE) with DSP functions and a high-performance NPU engine, can manipulate image and video data in real time to support edge computing and intelligent processing, such as pedestrian detection, vehicle detection, distance extraction, behavior recognition, gesture recognition, or even action recognition, and much more. The embedded USB 3.2 Gen1 device interface supports ultra-high-speed, high-resolution video data transfer.



2.3. Image process acceleration engine

• Matrix operation engines
• Scaling up/down engine
• De-warping engine for lens distortion correction (LDC) and multi-view affine transformation
• Motion detection engine
• HW accelerated optical flow engine for DVS sensor applications
• Pre/post processing for NPU acceleration

2.5. Neural network accelerator

• High performance 1.2 TOPS NPU engine
• Supports weight/bias quantization using UINT8, INT8, INT16, Float16, BFloat16 and post-training quantization
• MAE engine – pre/post DSP accelerator
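As a side note on what "post-training quantization" of weights means in practice, here's a minimal generic sketch (symmetric per-tensor INT8; my illustration, not iCatch's or VeriSilicon's actual scheme):

```python
def quantize_int8(weights):
    """Symmetric post-training quantization: map float weights onto
    INT8 [-127, 127] using a single per-tensor scale factor."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127.0 if max_abs else 1.0
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the INT8 values."""
    return [v * scale for v in q]

weights = [0.5, -1.27, 0.003, 0.9]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
# Round-trip error is bounded by half a quantization step.
assert all(abs(a - w) <= scale / 2 for a, w in zip(approx, weights))
```

The INT16/Float16 variants just widen the grid; the trade-off is the same - fewer bits means a smaller model and cheaper MACs, at the cost of a little rounding error.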



It does not include SNN.

It would be a case of either/or as far as the iCatch NN is concerned.

They would have been working with Prophesee for some time.


https://www.prophesee.ai/2022/04/19/icatch-prophesee-collaboration-ai-vision-processor/

iCatch and Prophesee collaborated on development of AI vision processor natively compatible with Prophesee Event-based Metavision® sensing technologies.​


iCatch Technology (iCatch) has focused on image signal processing (ISP) technology and Camera SOC for over two decades and aggressively invested in research and development in more machine learning (ML) related technology and application.

Now iCatch has collaborated with Prophesee on Event-based Metavision sensing projects that integrated iCatch’s V57 AI vision processor with the new Sony Semiconductor Solutions (“SSS”) stacked Event-based Vision Sensor IMX636, realized in collaboration between SSS and Prophesee.
iCatch built the OpenMVCam development platform for all algorithm partners and ODM customers to design a variety of AI products, systems and applications on many market segments like surveillance, smart healthcare, in-cabin monitor system, smart home, industrial automation, smart city and so on.

iCatch also provides a high customized service and an excellent image quality system to become the eyes of all of machine vision equipment and smart devices in the future.

SSS’s Event-based Vision Sensor can be the input source for iCatch AI vision processor, based on iCatch built-in NPU acceleration engine and proprietary CV processing hardware engine. It can integrate Prophesee Metavision Intelligence SDK and other 3rd party’s machine vision algorithms to support end customers’ AI vision applications such as DMS and OMS in automotive in-cabin, patient/elder fall detection in home healthcare and hospitals, intruder detection in home surveillance, anomaly detection in industrial automation, gesture control in smart home appliance, eye tracking in AR/VR and so on.

So it looks like you have found the missing link in the Sony/Prophesee/? triangle.

But it was after this that Prophesee confessed its undying love for Akida. So, as I speculated when the Sony/Prophesee collaboration was announced, unfortunately they are probably contractually bound to iCatch for the first born.

So, as someone speculated above, was the Apple CEO inspecting iCatch's offspring, or was there a newer and better love child in the cradle?

You've got to ask yourself one question - would Sony or Prophesee be happy showing off an inferior solution?
 
Thanks D

Like you, I had a quick patent skim and couldn't find anything to do with AI per se either.

I did notice the CNN reference on the Vi37 but then just an NPU on the others, as I said, hence the question on whether CNN2SNN is viable.

When you say missing link I presume you mean the AI processing component (not us in that IMX) to create that package?

I'll try to dig around for the earliest Prophesee/iCatch connection mentions, maybe tomoz.

Begs the question then of how long they'd been working with iCatch, and how long working with us, before either public news release, and was there any parallel testing/dev?
 
Just found these earlier references and distribution partner. No mention of iCatch?

From


Linked to here



IMX636 and IMX637 event-based vision
29 November 2021

Macnica ATD Europe today announced that it will offer the Event-based Vision Sensor (“EVS”) from its long-term distribution partner, Sony Semiconductor Solutions Corporation (“Sony”). The two sensors, IMX636 and IMX637, were made possible through collaboration between Sony and Prophesee, another distribution partner of Macnica ATD Europe.

EVS realizes high-speed data output with low latency by limiting the output data to luminance changes from each pixel, combined with information on pixel position coordinates and time. Only the pixels that have detected a change in luminance for the object can output data, allowing the sensor to immediately detect the luminance changes with high-speed, low-latency, high-temporal-resolution while operating with low power consumption. It represents a whole new approach compared to the commonly used frame-based method, where the entire image is output at certain intervals determined by the frame rate.

Application fields where the two new models of EVS specifically exploit their advantages over frame-based sensors are for example sensing changes in sparks produced during welding and metal cutting; and sensing slight changes in vibration and detecting abnormalities for use in predictive maintenance.

Macnica ATD Europe as a prime distribution partner of both Sony and Prophesee covers the full cooperation on the EVS technology and offers technical support as well as the free loan of evaluation kits in different versions from first “hands on” evaluation to full performance evaluation (available in Q4). The cooperation also covers the Metavision® Intelligence Suite from Prophesee, an event signal processing software optimized for the sensors performance, which is available through Macnica ATD Europe. Combining Sony’s event-based vision sensors with this software will enable efficient application development and provide solutions for various use cases.
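The per-pixel event readout described in the release can be sketched in a few lines. This is a toy illustration of the principle only, not Sony's actual pipeline; the 0.15 threshold and the (x, y, polarity, timestamp) tuple format are my assumptions:

```python
def frame_to_events(prev, curr, t, threshold=0.15):
    """Compare two luminance frames (2-D lists of floats) and emit an
    event only for pixels whose luminance changed beyond the threshold.
    Unchanged pixels output nothing, which is what keeps the data rate
    low and the latency small compared to frame-based readout."""
    events = []
    for y, row in enumerate(curr):
        for x, lum in enumerate(row):
            delta = lum - prev[y][x]
            if abs(delta) > threshold:
                polarity = 1 if delta > 0 else -1
                events.append((x, y, polarity, t))
    return events

# A frame-based sensor would output all 9 pixels; the event-based
# readout reports only the two that changed.
prev = [[0.2, 0.2, 0.2],
        [0.2, 0.2, 0.2],
        [0.2, 0.2, 0.2]]
curr = [[0.2, 0.9, 0.2],
        [0.2, 0.2, 0.01],
        [0.2, 0.2, 0.2]]
print(frame_to_events(prev, curr, t=1))  # → [(1, 0, 1, 1), (2, 1, -1, 1)]
```

That sparsity is exactly why it suits the welding-spark and vibration-monitoring use cases above: almost nothing changes frame to frame, so almost nothing is transmitted.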

To here


Partners

 
@Diogenese

Maybe the answer.

In 2020 they started using a VeriSilicon NPU for their latest V37 at the time.

Would presume it has just been iterations of the same since.

Haven't looked into the VeriSilicon capabilities yet. Time for 💤



VeriSilicon VIP9000 and ZSP are Adopted by iCatch Next Generation AI-powered Automotive Image Processing SoC

Shanghai, China, May 12, 2020 – VeriSilicon today announced that iCatch Technology, Inc. (TPEX: 6695), a global leader in low-power and intelligent image processing SoC solutions, has selected VeriSilicon VIP9000 NPU and ZSPNano DSP IP. Both will be utilized in the iCatch’s next generation AI-powered image processing SoC with embedded neural network accelerators powered by VeriSilicon’s NPU for applications such as automotive electronics, industrial, appliance, consumer electronics, AIoT, smart home, commercial and more.

VIP9000 is a highly scalable and programmable processor IP for computer vision and artificial intelligence applications and with support for all popular deep learning frameworks (TensorFlow, Pytorch, TensorFlow Lite, Caffe, Caffe2, DarkNet, ONNX, NNEF, Keras…), as well as OpenCLTM and OpenVXTM APIs. Neural network optimization techniques such as quantization, pruning, and model compression are also supported natively by VIP9000 architecture. AI applications can be easily ported to VIP9000 platforms through offline conversion by the Vivante AcuityTM SDK, or through run-time interpretation with Android NN, NN API, or ARM NN. The ZSPNano DSP provides complete programmability, industry leading power efficiency with a robust and mature SDK, enabling iCatch to extend audio and voice software applications smoothly including Acoustic Echo Cancelation (AEC), Noise Suppression and Beam Forming.

“By incorporating VeriSilicon key IP capabilities, our V37, the latest generation AI-powered image processing SoC for 4K video, can empower the edge computing capability right on the camera. Together with the embedded iCatch’s high quality image signal processing (ISP) engine and proprietary CV accelerator engine, we can drive the VeriSilicon NPU engine to the best TOPS/W performance. As a result, our AI-powered image processing SoCs not only can record high-quality video, but also perform advanced image analytics to bring intelligence to the camera devices. We successfully applied our AI-powered image processing SoCs into various applications.” said Weber Hsu, President of iCatch Technology, Inc.

“Edge AI computing has been applied in a wide range of market segments including consumer, industrial and automotive devices, especially in computer vision related applications. Image processors with built-in AI NPU have great growth momentum,” said Weijin Dai, Executive Vice President and GM of Intellectual Property Division at VeriSilicon.

“Smart eye and smart ear are essential for all the intelligent edge devices. The challenge is how to deliver the intelligence efficiently within the target product power envelope and silicon budget,” added Weijin Dai. “As a leading company in image processing, iCatch will greatly leverage our NPU and DSP in advanced solutions to perfectly address the enormous market needs. With VIP9000 and ZSPNano, iCatch’s image processing SoCs deliver advanced AI-Vision and AI-Voice capabilities at the lowest power consumptions.”
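On the "pruning" mentioned in that release: the usual meaning is magnitude pruning, i.e. zeroing out the smallest weights so a sparse accelerator can skip them. A generic sketch (my illustration, not VeriSilicon's implementation):

```python
def magnitude_prune(weights, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of weights.
    Zeroed weights can be skipped by sparse accelerators,
    trading a little accuracy for fewer MACs."""
    k = int(len(weights) * sparsity)          # number of weights to drop
    ranked = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    dropped = set(ranked[:k])                 # indices of smallest weights
    return [0.0 if i in dropped else w for i, w in enumerate(weights)]

w = [0.9, -0.05, 0.4, 0.01, -0.7, 0.2]
print(magnitude_prune(w, sparsity=0.5))  # → [0.9, 0.0, 0.4, 0.0, -0.7, 0.0]
```

In practice the model is usually fine-tuned after pruning to recover the lost accuracy, but the skip-the-zeros idea is the whole trick.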
 


Deadpool

Did someone say KFC

The Fool is at it again; Mickleboro just can't be humble in his defeat.

"The jury is still out on whether Brainchip will ever successfully commercialise its technology". What a 🐓head


Onwards and upwards Brainchip
 