BRN Discussion Ongoing

Tony Coles

Regular
This certainly puts Brainchip's neuromorphic tech in the box seat. "Beneficial AI" is better for the planet.

View attachment 29371


Learning 🏖
That’s impressive @Learning, we are witnessing BRN getting mentioned in articles a little more often now, hopefully it’s just a good start for 2023. Have a great day all. 👍
 
  • Like
  • Love
  • Fire
Reactions: 18 users

Another Rob Telson like. Lol, do people really get bothered by the likes he gives on LinkedIn?
 
  • Like
  • Fire
  • Love
Reactions: 31 users

Goldphish

Emerged
  • Like
Reactions: 11 users

Deadpool

Did someone say KFC
  • Haha
  • Like
Reactions: 6 users
  • Fire
  • Like
Reactions: 2 users

Sirod69

bavarian girl ;-)
There is certainly nothing about our Akida here, but Mercedes is always happy to remind everyone of this car. I'm curious to see when we'll appear again in the reports on the VISION EQXX. 🥰 The more who love it, the better.

Mercedes-Benz AG
5 hrs


Fusion of tech and aesthetic: The interplay of design and aerodynamics in the VISION EQXX.

Aerodynamic drag can have a big impact on range. On a regular long-distance drive, a typical electric vehicle dedicates almost two thirds of its battery capacity to cutting its way through the air ahead, which is why the VISION EQXX has an ultra-sleek and slippery drag coefficient of 0.17.
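To get a feel for why a 0.17 drag coefficient matters, drag power grows with the cube of speed: P = ½·ρ·Cd·A·v³. A rough back-of-the-envelope sketch, where only Cd = 0.17 comes from the post; the frontal area and speed are assumed, illustrative values:

```python
# Rough aerodynamic drag-power estimate: P = 0.5 * rho * Cd * A * v^3
rho = 1.225     # air density at sea level, kg/m^3
cd = 0.17       # drag coefficient of the VISION EQXX (from the post)
area = 2.12     # assumed frontal area in m^2 (illustrative, not from the post)
v = 120 / 3.6   # 120 km/h converted to m/s

drag_force = 0.5 * rho * cd * area * v ** 2  # drag force in newtons
drag_power = drag_force * v                  # power needed to overcome drag, in watts

print(f"Drag force: {drag_force:.0f} N, drag power: {drag_power / 1000:.1f} kW")
```

At a typical Cd around 0.30 the same calculation nearly doubles, which is where the "two thirds of battery capacity" figure comes from.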

A huge amount of work went into painstakingly integrating the passive and active aerodynamic features into the external form of the VISION EQXX. The remarkable result was achieved on an impressively short timescale. The inter-disciplinary team used advanced digital modelling techniques to reach a compromise that reduces drag while retaining the sensual purity of the Mercedes-Benz design language and the practicalities of a road car.

Despite the practical challenges and the compressed timescale, the success of the collaboration is clearly evident in the sophistication and poise of the exterior design. The surfaces of the VISION EQXX run smoothly from the front, developing powerful yet sensual shoulders above the rear wheel arches. This natural flow concludes with a cleanly defined, aerodynamically effective tear-off edge accentuated by a gloss-black end trim, punctuated by the rear light clusters.

Learn more about the VISION EQXX: https://lnkd.in/dtddQ8rK
 
  • Like
  • Love
  • Thinking
Reactions: 11 users

jtardif999

Regular
Renesas full year report came out on Feb 9. They had some explaining to do!

Notice Concerning the Difference between Financial Results for the Year Ended December 31, 2022 and Results in the Previous Period

February 09, 2023 01:00 AM Eastern Standard Time
TOKYO--(BUSINESS WIRE)--Renesas Electronics Corporation (TSE: 6723, “Renesas”), a premier supplier of advanced semiconductor solutions, today announced the difference between its consolidated financial results for the year ended December 31, 2022 (January 1, 2022 to December 31, 2022), which it disclosed on February 9, 2023, and the financial results in the previous period (January 1, 2021 to December 31, 2021).
The forecasts for the above period are not based on IFRS, therefore the differences are shown as the actual figures.

1. Difference between consolidated financial results for the year ended December 31, 2022 and the year ended December 31, 2021
In millions of yen

                               Revenue    Operating  Profit      Profit   Profit attributable
                                          Profit     before tax           to owners of parent
Year ended December 31, 2021   993,908    173,827    142,718     119,687  119,536
Year ended December 31, 2022   1,500,853  424,170    362,299     256,787  256,632
Difference                     506,945    250,343    219,581     137,100  137,096
Difference (%)                 51.0%      144.0%     153.9%      114.5%   114.7%
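The percentage rows can be sanity-checked from the yen figures (difference divided by the prior-year value):

```python
# Verify Renesas' year-on-year difference percentages (figures in millions of yen)
fy2021 = {"revenue": 993_908, "operating_profit": 173_827, "profit_before_tax": 142_718}
fy2022 = {"revenue": 1_500_853, "operating_profit": 424_170, "profit_before_tax": 362_299}

for key in fy2021:
    diff = fy2022[key] - fy2021[key]
    pct = 100 * diff / fy2021[key]
    print(f"{key}: +{diff:,} ({pct:.1f}%)")  # revenue: +506,945 (51.0%)
```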

2. Background to the difference
Consolidated revenue for the year ended December 31, 2022 was 1,500.9 billion yen, a 51.0% increase year on year. This was mainly due to a sales increase effect from the consolidation of Dialog Semiconductor Plc acquired on August 31, 2021 and yen depreciation, in addition to an increase in revenue in the Automotive Business supported by continued growth in semiconductor contents per vehicle as well as an increase in revenue in the Industrial/Infrastructure/IoT Business from demand expansion in the infrastructure market such as datacenters.
The gross profit increased from improvements in product mix in addition to growth in revenue. In addition, the operating profit, profit before tax, profit, and profit attributable to owners of parent for the year ended December 31, 2022 significantly exceeded the results from the previous period, supported by the company’s efforts to streamline business operations.
 
  • Like
  • Fire
  • Wow
Reactions: 28 users

stuart888

Regular
This guy is a 20-year programmer, a hard-core deep learning engineer. Sharp guy, good videos.

He says he now uses ChatGPT to do 80% of his coding. That is disruptive. He supervises the code and learns from it.

Thought it was interesting. Plus he summarizes ChatGPT and Transformer neural networks, and "Attention Is All You Need", the famous Google paper that introduced Transformers in 2017.

I feel Brainchip is going to win in TinyML, as you don't need big data to train edge AI on Industrial patterns, ultra low power.

https://hub.packtpub.com/paper-in-two-minutes-attention-is-all-you-need/

 
  • Like
  • Fire
  • Wow
Reactions: 24 users

stuart888

Regular
Australian Space TV, he is a neural spiker! The first published (non-classified) SNN application in space started here, I believe.

Good video, yeah Australia! :coffee::coffee::coffee:

Same thing, you don't need big data to train SNN edge AI on Industrial patterns, ultra-low power.
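For anyone curious what "spiking" means concretely, here's a minimal leaky integrate-and-fire neuron, the textbook building block of an SNN. This is a generic sketch, not Akida's actual implementation:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the membrane potential leaks
# toward zero each step, integrates input current, and emits a spike (1) when
# it crosses the threshold, then resets. Events, not frames.
def lif_neuron(inputs, leak=0.9, threshold=1.0):
    potential, spikes = 0.0, []
    for current in inputs:
        potential = potential * leak + current
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0  # reset after firing
        else:
            spikes.append(0)
    return spikes

print(lif_neuron([0.4, 0.4, 0.4, 0.0, 0.9, 0.3]))  # → [0, 0, 1, 0, 0, 1]
```

Because the neuron only produces output when inputs accumulate past a threshold, quiet inputs cost almost nothing, which is where the ultra-low-power claim comes from.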



 
  • Like
  • Fire
Reactions: 10 users

stuart888

Regular
Really glad we have Transformers high on the Brainchip radar. This stuff is too deep for me; I really don't need to know all the fancy math equations.

The transformer math takes time to unpack. Forget about that part.

When dealing with text or speech, human language has lots of double-meaning words: in one part of a sentence a word means one thing, yet the same word, spelled the same, can mean something different later in the same paragraph.

Attention solves that, so ChatGPT is like a teenager that can be further trained in specific industries, say to automatically print out the legal paperwork required for signing when buying a house.

I would skip it, just know "Attention is all you need".
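The disambiguation idea can be sketched with scaled dot-product attention, the core operation of "Attention Is All You Need": each word's output is a weighted mix of every other word's vector, so context decides the meaning. A toy numpy example with random vectors, illustrative only:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each token's output is a weighted mix of all value vectors,
    weighted by query-key similarity (softmax over the keys)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # similarity of every token pair
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: rows sum to 1
    return weights @ V, weights

# Toy 3-token "sentence" with 4-dim embeddings (random, not real embeddings)
rng = np.random.default_rng(0)
Q = K = V = rng.normal(size=(3, 4))
out, attn = scaled_dot_product_attention(Q, K, V)
print(attn.round(2))  # each row shows how much that token attends to the others
```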

 
  • Like
Reactions: 8 users

stuart888

Regular
Explaining all the various AI Accelerators

This particular video is good for the non-spiking engineer, like me. He gives an exceptional overview of AI progress at the chip level.

This High Yield guy knows hardware, SoC boards, etc.



 
  • Like
Reactions: 5 users

TheFunkMachine

seeds have the potential to become trees.
Hi D,

Thanks, I must have missed that SBIR. That explains why there's been talk about not needing a processor.
The timing of the SBIR and the announcement of our 22nm AKD1500 GF reference chip aligns very nicely.


For anyone else that missed it:

Release Date:
January 10, 2023
Open Date:
January 10, 2023
Application Due Date:
March 13, 2023
Close Date:
March 13, 2023 (closing in 29 days)

The preference is for a prototype processor fabricated in a technology node suitable for the space environment, such as 22-nm FDSOI, which has become increasingly affordable.


Neuromorphic and deep neural net software for point applications has become widespread and is state of the art. Integrated solutions that achieve space-relevant mission capabilities with high throughput and energy efficiency are a critical gap. For example, terrestrial neuromorphic processors such as Intel Corporation's Loihi™, BrainChip's Akida™, and Google Inc.'s Tensor Processing Unit (TPU™) require full host processors for integration with their software development kit (SDK) that are power hungry or limit throughput. This by itself is inhibiting the use of neuromorphic processors for low-SWaP space missions.
“For example, terrestrial neuromorphic processors such as Intel Corporation's Loihi™, BrainChip's Akida™, and Google Inc.'s Tensor Processing Unit (TPU™) require full host processors for integration with their software development kit (SDK) that are power hungry or limit throughput. This by itself is inhibiting the use of neuromorphic processors for low-SWaP space missions.”

Am I reading this wrong? Are they saying that Akida needs a full host processor for integration for their SDK? I thought one of the main selling points of Akida is that it doesn’t need a host processor and can do all the computation alone?
 
  • Like
  • Thinking
Reactions: 2 users

Dhm

Regular
Australian Space TV, he is a neural spiker! The first published (non-classified) SNN application in space started here, I believe.

Good video, yeah Australia! :coffee::coffee::coffee:

Same thing, you don't need big data to train SNN edge AI on Industrial patterns, ultra-low power.

View attachment 29376


I wrote to Greg following publicity surrounding Jarli in August last year. At the time they weren't using Akida. The question now is whether they've had a chance to explore and implement Akida in their most recent projects.

 
  • Like
  • Love
  • Fire
Reactions: 37 users

stuart888

Regular
Thanks for the detail @Dhm! He really just highlighted everything Brainchip is going after. Neuromorphic event-based cameras and spiking smarts blow away the old fps "send me all the pixels" approach. They are hiring for all things neuromorphic.

It just cements the overall Brainchip thesis. They are in the perfect position to win. Mr Chapman said it well earlier: too many dots.

Love hearing it from a person like this, preaching the Brainchip philosophy/focus.

 
  • Like
  • Fire
Reactions: 10 users

Calsco

Regular
Again we are just sitting at resistance. Here's hoping we have some positive news this week, as the share price is ready to bounce up. We really are just sitting on the launch pad; we just need a spark to start the engines!
 
  • Like
  • Fire
  • Haha
Reactions: 18 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
“For example, terrestrial neuromorphic processors such as Intel Corporation's Loihi™, BrainChip's Akida™, and Google Inc.'s Tensor Processing Unit (TPU™) require full host processors for integration with their software development kit (SDK) that are power hungry or limit throughput. This by itself is inhibiting the use of neuromorphic processors for low-SWaP space missions.”

Am I reading this wrong? Are they saying that Akida needs a full host processor for integration for their SDK? I thought one of the main selling points of Akida is that it doesn’t need a host processor and can do all the computation alone?

Hi Funky, what about SiFive + BrainChip sitting in a tree, K I S S I N G?😚


Extract 1




Extract 2
 
  • Like
  • Love
  • Fire
Reactions: 31 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Wow
  • Like
  • Love
Reactions: 21 users

stuart888

Regular
Hi Funky, what about SiFive + BrainChip sitting in a tree, K I S S I N G?😚


Extract 1
View attachment 29384



Extract 2
View attachment 29385
Sitting in a tree, watching the Super Bowl. Babyface, the old music group, just did a bit live. Fantastic.

This is the old hit.

Personally: Brainchip is a super bet. No stock is a sure thing this early, when you don't yet know. I am fine with all things Brainchip.

 
  • Like
Reactions: 5 users

jk6199

Regular
How about some of the huge transactions on the market so far.

Are those distant drums I hear?
 
  • Like
  • Fire
  • Love
Reactions: 6 users

stuart888

Regular
While I enjoy the neuromorphic love, these videos with computer-generated text and voice that look ChatGPT-made are a bit poor.

Perhaps over time a person could come to trust the source of whoever produces the video content.

I am kind of turned off by stuff that seems artificial. This will be a challenge. Proving accuracy will be important. A new benchmark: truth, or 97% truth.

AI in cars to save lives is a no-brainer. It can outperform us, and that gets proven long before Level 4. On accidents per mile driven, AI is already way better than humans.

 
  • Like
  • Wow
Reactions: 7 users