BRN Discussion Ongoing

Evermont

Stealth Mode

wilzy123

Founding Member
ANNNNNNN!!!!!!!!!!


Last time a "Notification regarding unquoted securities" ANN was released (8th Dec)... the market took it well. :ROFLMAO::ROFLMAO::cool:


Damo4

Regular
FF ,
Codes and associated employees known to date :
BRNAM - Anil , Ken , Sean , Antonio
BRNAJ - Peter , Pia
BRNAD - admin

just forwarding this again in relation to the latest ann
thanks @Doz


Mother fuckers probably got a partnership with AMD and Nvidia on the same day, hence the RSUs
 

Dozzaman1977

Regular
So this new announcement states that none of these shares have been issued to key management personnel (KMP), but it is coded BRNAM

I'd assume Anil, Ken, Sean and Antonio are key management, so who did the shares go to....... Maybe a xmas bonus for all the employees for the results they have been achieving???

 

Learning

Learning to the Top 🕵‍♂️
I too read this announcement as a positive.

BRNAM looks like AM = America staff, as Anil, Ken, Sean and Antonio are based in America.

So maybe the sales team is getting the bonus (JMO)

Learning
 

equanimous

Norse clairvoyant shapeshifter goddess
#shareman
 

Diogenese

Top 20
@Diogenese

Maybe the answer.

In 2020 iCatch started using the VeriSilicon NPU for their latest SoC at the time, the V37.

Would presume it has just been iterations of the same since.

Haven't looked into the VeriSilicon capabilities yet. Time for 💤



VeriSilicon VIP9000 and ZSP are Adopted by iCatch Next Generation AI-powered Automotive Image Processing SoC

Shanghai, China, May 12, 2020 – VeriSilicon today announced that iCatch Technology, Inc. (TPEX: 6695), a global leader in low-power and intelligent image processing SoC solutions, has selected VeriSilicon VIP9000 NPU and ZSPNano DSP IP. Both will be utilized in the iCatch’s next generation AI-powered image processing SoC with embedded neural network accelerators powered by VeriSilicon’s NPU for applications such as automotive electronics, industrial, appliance, consumer electronics, AIoT, smart home, commercial and more.

VIP9000 is a highly scalable and programmable processor IP for computer vision and artificial intelligence applications and with support for all popular deep learning frameworks (TensorFlow, Pytorch, TensorFlow Lite, Caffe, Caffe2, DarkNet, ONNX, NNEF, Keras…), as well as OpenCLTM and OpenVXTM APIs. Neural network optimization techniques such as quantization, pruning, and model compression are also supported natively by VIP9000 architecture. AI applications can be easily ported to VIP9000 platforms through offline conversion by the Vivante AcuityTM SDK, or through run-time interpretation with Android NN, NN API, or ARM NN. The ZSPNano DSP provides complete programmability, industry leading power efficiency with a robust and mature SDK, enabling iCatch to extend audio and voice software applications smoothly including Acoustic Echo Cancelation (AEC), Noise Suppression and Beam Forming.

“By incorporating VeriSilicon key IP capabilities, our V37, the latest generation AI-powered image processing SoC for 4K video, can empower the edge computing capability right on the camera. Together with the embedded iCatch’s high quality image signal processing (ISP) engine and proprietary CV accelerator engine, we can drive the VeriSilicon NPU engine to the best TOPS/W performance. As a result, our AI-powered image processing SoCs not only can record high-quality video, but also perform advanced image analytics to bring intelligence to the camera devices. We successfully applied our AI-powered image processing SoCs into various applications.” said Weber Hsu, President of iCatch Technology, Inc.

“Edge AI computing has been applied in a wide range of market segments including consumer, industrial and automotive devices, especially in computer vision related applications. Image processors with built-in AI NPU have great growth momentum,” said Weijin Dai, Executive Vice President and GM of Intellectual Property Division at VeriSilicon.

“Smart eye and smart ear are essential for all the intelligent edge devices. The challenge is how to deliver the intelligence efficiently within the target product power envelope and silicon budget,” added Weijin Dai. “As a leading company in image processing, iCatch will greatly leverage our NPU and DSP in advanced solutions to perfectly address the enormous market needs. With VIP9000 and ZSPNano, iCatch’s image processing SoCs deliver advanced AI-Vision and AI-Voice capabilities at the lowest power consumptions.”
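For anyone wondering what the quantization the VIP9000 blurb mentions actually does, here's a minimal sketch in pure Python. Illustrative only: this is not VeriSilicon's toolchain, and the weight values are made up.

```python
# Symmetric post-training quantization of a weight tensor to int8,
# the kind of model compression the press release refers to.
# Pure Python sketch, illustrative only.

def quantize_int8(weights):
    """Map float weights onto the int8 range [-127, 127] with one scale."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

weights = [0.51, -1.27, 0.003, 0.89, -0.42]
q, scale = quantize_int8(weights)
recovered = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, recovered))
print(q)                   # each weight now fits in one signed byte
print(round(max_err, 4))   # reconstruction error bounded by scale/2
```

The point is that each weight now fits in one byte instead of four, while staying within scale/2 of its original value, which is where the memory and bandwidth savings on an NPU come from.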
Hi Fmf,

Just brilliant research. You've ferreted out the NN, and it's CNN.

Verisilicon:

US2021382690A1 Enhanced Multiply Accumulate Device For Neural Networks


US11301214B2 Device for performing multiply/accumulate operations


CN107862378B Convolutional neural network acceleration method and system based on multiple kernels, storage medium and terminal

This explains why Prophesee was so impressed with Akida after having already committed to iCatch.
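Those patents are all variations on the multiply-accumulate (MAC) operation at the heart of conventional CNN hardware. In software terms a MAC engine just does the following (a toy sketch of my own, not anything from the patents themselves):

```python
# A MAC unit computes acc = acc + a * b; chained together this is a
# dot product, and a convolution is dot products over image patches.

def mac(acc, a, b):
    return acc + a * b

def dot(xs, ws):
    acc = 0
    for x, w in zip(xs, ws):
        acc = mac(acc, x, w)   # one MAC per weight; hardware parallelises these
    return acc

# A 3x3 convolution at one output pixel is just 9 MACs:
patch  = [1, 2, 0, 1, 3, 1, 0, 2, 2]   # flattened 3x3 image patch
kernel = [0, 1, 0, 1, -4, 1, 0, 1, 0]  # flattened 3x3 Laplacian-style kernel
print(dot(patch, kernel))  # -> -6
```

CNN accelerators live or die on how many of these they can do per watt, which is why the patent literature is so dense around this one operation.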
 
@Fullmoonfever
Mate, your research is always second to none but it seems you’ve been in overdrive the past couple of weeks! Just want to say much appreciated - always look forward to your posts.
 
Cheers mate.

It would definitely seem to make more sense to use neuromorphic with neuromorphic instead of the CNN.

Maybe that's why the guy from Prophesee (not Luca) in the video not long ago seemed a bit non-committal about their best processing option... playing it safe so as not to offend iCatch whilst not confirming us yet until, say, the iCatch contract is superseded or dissolved?
 
Welcome DAS.

Appreciated.
 

Diogenese

Top 20
A bit more information:

Summary
For over a decade, CPU and GPU design companies have been using Synopsys VC Formal Datapath Validation (DPV) app with its HECTOR™ technology to verify their data processing elements because traditional verification methods cannot exhaustively verify the correctness of mathematical computations in these designs. Like CPUs and GPUs, AI processors are also datapath heavy with mathematical functions like addition, subtraction, matrix multiplication, and square root in its compute engines, making these designs a good fit for formal datapath validation.
This webinar will introduce the Synopsys ARC® NPX Neural Processing Unit (NPU) IP family of embedded AI processors and the use of Synopsys VC Formal DPV to verify its datapath functions. The Synopsys ARC NPX6 processor supports the latest and most complex neural networks, such as CNNs, RNNs, and transformers, targeted at AI SoCs that are widely used in automotive, data center, high-end gaming, next-generation augmented reality, and surveillance. At the heart of the NPX6 neural network processor are convolution and tensor accelerators that are optimized to perform light-speed computations. Correctness of these functional units is key to correct facial, audio, and image processing and recognition, which could have safety implications for automotive applications. We will explore Synopsys VC Formal DPV, the gold standard for datapath validation and signoff for the last 20 years, and how it is used to ensure correctness of the core algorithms of the ARC NPX6 processor's compute engines.
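For a feel of what "exhaustively verify the correctness of mathematical computations" means, here's a toy brute-force version of the idea. Real formal tools like VC Formal DPV prove this symbolically for full-width datapaths; the tiny 8-bit adder and the function names below are my own invention, purely for illustration.

```python
# Toy "exhaustive datapath validation": check an 8-bit saturating add
# implementation against a plain reference model over every input pair.
# At 8 bits we can brute-force all 65,536 cases; formal tools do the
# equivalent proof for 32/64-bit datapaths where brute force is hopeless.

def sat_add_u8(a, b):
    """Device under test: unsigned 8-bit add that clamps at 255."""
    s = a + b
    return 255 if s > 255 else s

def reference(a, b):
    """Golden reference model."""
    return min(a + b, 255)

mismatches = [(a, b) for a in range(256) for b in range(256)
              if sat_add_u8(a, b) != reference(a, b)]
print(len(mismatches))  # 0 -> the two models agree on all 65,536 cases
```

Simulation with random vectors can only sample this space; the selling point of formal datapath validation is that it covers all of it.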

Speakers

Neelabja Dutta, Sr. Manager, Applications Engineering, Synopsys

Shuaiyu Jiang, Sr. ASIC Digital Design Engineer, Synopsys

Long-term holders of Brainchip shares will know that Vorago, in the Phase 1 NASA application to harden AKD1000, described it as a CNN/RNN processor.

My interest only so DYOR
FF

AKIDA BALLISTA

Synopsys have gone through the CNN wormhole in the space-time fabric and are trapped in MAC land.

US10846591B2 Configurable and programmable multi-core architecture with a specialized instruction set for embedded application based on neural networks

A programmable architecture specialized for convolutional neural networks (CNNs) processing such that different applications of CNNs may be supported by the presently disclosed method and apparatus by reprogramming the processing elements therein.


Their party trick is swapping between 16-bit and 8-bit MACs

US2021377122A1 MIXED-PRECISION NEURAL NETWORKS


[006] … The operations include receiving a target bandwidth increase for a neural network including a plurality of binary large objects (BLOBs) and weights of a first data type represented by a first number of bits. The target bandwidth increase relates to changing at least some of the plurality of BLOBs and weights to a second data type represented by a second number of bits different from the first number of bits.

[0026] Further, any suitable supervised ML techniques can be used to train the floating-point ML model. For example, the initial ML model 114 can be a neural network (e.g., a convolutional neural network (CNN), an adversarial neural network, or any other suitable neural network) and can be trained using any suitable technique. FIG. 1 is merely one example.
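A toy sketch of the bandwidth/precision trade described in [006]: the same weights quantised at 16 bits and at 8 bits. Pure Python, my own illustration, not Synopsys code; the weight values are made up.

```python
# Mixed-precision trade-off: halving the bit width halves the memory
# traffic for weights/BLOBs, but the worst-case quantisation error
# (scale/2) grows by roughly 2**8 when going from int16 to int8.

def quantize(weights, bits):
    """Quantise to a signed integer grid of the given width, then dequantise."""
    qmax = 2 ** (bits - 1) - 1                       # e.g. 127 for int8
    scale = max(abs(w) for w in weights) / qmax
    return [round(w / scale) * scale for w in weights], scale

weights = [0.8131, -0.2505, 0.0317, -0.9902]
for bits in (16, 8):
    deq, scale = quantize(weights, bits)
    err = max(abs(a - b) for a, b in zip(weights, deq))
    print(bits, "bits: max error <=", scale / 2, "observed", round(err, 6))
```

This is the design space the patent's "target bandwidth increase" operates in: push more of the network to 8 bits when the accuracy budget allows, keep 16 bits where it doesn't.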
 

Diogenese

Top 20
MF

Difficult to argue with this paragraph

"But given this alliance, other recent agreements, management’s bullish rhetoric, and its lofty market capitalisation, the market will no doubt be expecting Brainchip to start delivering some very big sales in 2023."

And it provides BRN.AX some exposure to generate investor interest.
OK admittedly trying to put some lipstick on the pig.
... and a very handsome pig it is too ... but why did you put the lipstick on that end?
 

jtardif999

Regular
Thanks D

Like you, I had a quick patent skim and couldn't find anything to do with AI per se either.

I did notice the CNN reference on the V37 but then just NPU on the others as I said, hence the question on whether CNN2SNN is viable.

When you say missing link I presume you mean the AI processing component (not us in that IMX) to create that package?

I'll try to dig around for the earliest Prophesee/iCatch connection mentions maybe tomoz.

Begs the question then of how long they were working with iCatch, and how long with us, before either public news release, and whether there was any parallel testing/dev?
Yeah, as our management have previously described, it's all about intersecting at the right point in their design phase to create the quickest turnaround. I would assume we will finish up as IP embedded directly in a later (next?) vision sensor, and we know from the Renesas experience how long it might take before we see the related product. But a licence fee should be upcoming if we have intersected at the right time.
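On the CNN2SNN question above: one textbook way a CNN-to-SNN conversion can work is rate coding, where a ReLU activation is approximated by an integrate-and-fire neuron's firing rate. The sketch below is my own toy illustration of that general idea, not a description of BrainChip's actual CNN2SNN tool.

```python
# Rate-coded CNN-to-SNN conversion idea: an integrate-and-fire neuron
# driven by a constant input x for T timesteps fires at a rate that
# approximates relu(x). Toy illustration only.

def relu(x):
    return max(0.0, x)

def spike_rate(x, T=1000, threshold=1.0):
    """Spike count per timestep of an integrate-and-fire neuron."""
    v, spikes = 0.0, 0
    for _ in range(T):
        v += x                  # integrate the input current
        if v >= threshold:      # fire and reset by subtraction
            v -= threshold
            spikes += 1
    return spikes / T           # firing rate ~ relu(x)

for x in (-0.3, 0.0, 0.25, 0.7):
    print(x, relu(x), spike_rate(x))
```

Negative inputs never reach threshold (rate 0, like ReLU), and positive inputs fire proportionally, which is why trained CNN weights can be reused in a spiking network at all.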
 

SERA2g

Founding Member
Interesting article regarding Sony.

The Sony/Prophesee/Brainchip triangle may end up being more important to Brainchip's success than any of us have ever imagined.

This aged well with the Apple/Sony news that has come out this week.

A quick Google indicates Apple sold 240M iPhones in 2021 and is on track to do the same in 2022.

That excludes iPads and other devices.

Let’s hope we end up in next gen iPhones.
240M units per year generating $1-$1.50 royalty per unit. Yes please.
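For what it's worth, the back-of-envelope maths on that speculation. Both the 240M units/year and the $1.00-$1.50 per-unit royalty are the poster's assumptions, not figures BrainChip has disclosed.

```python
# Hypothetical royalty arithmetic from the post above; all inputs are
# the poster's speculation, not disclosed figures.

units_per_year = 240_000_000
for royalty in (1.00, 1.50):
    annual = units_per_year * royalty
    print(f"${royalty:.2f}/unit -> ${annual / 1e6:,.0f}M per year")
```

So the speculated range works out to $240M-$360M per year, before any iPads or other devices.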
 

BaconLover

Founding Member


The Apple thread @SERA2g ... enjoy!
 

TECH

Regular
A Santa Claus rally?

I think it's already been well and truly factored in.

Looks like Santa is trying to shake our Christmas Tree one final time this year.

The decorations and lights may be moving, but long-term shareholders, like myself, are bolted on and really enjoy a good ride.

Bring it on......2023 will be the start of the bonfire, I'll meet you at the top of Everest in January 2025 :ROFLMAO::ROFLMAO:

Love Brainchip and our future direction, northwards 😍
 

Damo4

Regular



Diamond Hands GIFs | Tenor
 

ndefries

Regular
Found this article very interesting about Sony

Sony is implementing edge AI sensors into vehicles that use way less power. Hmmm, I have some pretty speculative opinions about the IP being used.


Sony is working on new sensors for self-driving that it claims use 70% less electricity.

The sensors would help significantly extend the range of electric vehicles with autonomous capabilities.

According to a report in Nikkei Asia, they will be made by Sony Semiconductor Solutions and be paired with software developed by Japanese start-up Tier IV.

The companies aim to deliver Level 4 tech, as defined by the Society of Automotive Engineers, by 2030. This means that the car drives itself, with no requirement for human intervention.

To achieve Level 4, autonomous vehicles (AVs) need a wide array of hardware, including sensors and cameras, that transmit massive amounts of data, requiring vast amounts of power.

Sony is hoping to reduce electricity usage via edge computing, with as much data as possible processed through artificial intelligence-equipped sensors and software on the vehicles themselves, rather than being transmitted to external networks.


This approach would potentially make AVs safer, too, by cutting communication lags.

It’s also claimed that Sony will incorporate image recognition and radar technologies into the new sensor, which would assist self-driving in rain and other adverse weather conditions.

The company currently controls around 50% of the global market for image sensors, and also has strong experience in edge computing, having commercialized technology in chips for retailers and industrial equipment.

Tier IV, meanwhile, provides open-source self-driving software. Among its partners are Taiwan consumer electronics company Foxconn, which is planning to challenge car makers with an EV platform of its own, and Japanese company Yamaha, with whom it is developing autonomous transport solutions for factories.

In recent years, Sony has become a much more visible presence in the automotive arena. In 2020, the company displayed an electric sedan concept called the VISION-S at CES in Las Vegas and at the 2022 event it revealed an SUV version, the VISION-S 02.

Earlier this year, it announced it was teaming up with automaker Honda to form a new company to build electric vehicles and “provide services for mobility,” Sony Honda Mobility Inc.

The VISION-S featured a total of 40 sensors – 18 cameras, 18 radar/ultrasonic and four lidar – suggesting automation will have a key role to play in the new company.
 

Bravo

If ARM was an arm, BRN would be its biceps💪!
Better put this event in the calendar! In January 2023, at the CES, BMW will present its concept car called the "Digital Vision Vehicle". I think there's a very good chance BMW "will do a Merc" at this event. I say this because I know that BMW's “Neue Klasse” is set to "feature the next generation of Valeo’s ultrasonic sensors, the full set of surround view cameras, as well as a new multifunctional interior camera that will contribute to improved safety and create a new level of user experience.”

I can't wait!



Following on from the above, I saw this article today discussing BMW’s Facebook pages and the brand’s main Instagram account which both had their profile pictures changed on Tuesday. Here's some of what the article had to say.




 