BRN Discussion Ongoing

I would like to see them add MIT or Caltech to the university list, but I guess it is just a matter of time ;)
 

Attachments

  • D043CD1C-041A-4489-A11B-64460C536426.jpeg · 55.7 KB · Views: 162
  • Like
  • Love
  • Fire
Reactions: 11 users

Boab

I wish I could paint like Vincent
All of a sudden there are more buyers than sellers
 
  • Like
  • Fire
Reactions: 7 users

Esq.111

Fascinatingly Intuitive.
Afternoon Smoothsailing ,

Nice find ... CVEDIA just happens to be a confirmed engagement of BrainChip (23/5/2023), & No. 29 on THE BRAINCHIP SCROLL.

Regards ,
Esq.
 
  • Like
  • Fire
  • Love
Reactions: 22 users
Amazing ... and just like that, a little whale meanders through and hoovers up the small bait ball created.

Who would have thought.
What happened to the parrots, Esq? 🤔
 
  • Like
Reactions: 2 users
As I said before, when the price went DOWN 35% the other day, I BROUGHT.
I don't think you can have more faith than that, @Smoothsailing.
Brilliant
 
  • Like
Reactions: 3 users

Esq.111

Fascinatingly Intuitive.
Afternoon DingoBorat ,

They are still about, though on strike. Apparently one of BlackRock's head trading parrots went rogue, drained their currency trading account of Swiss Francs, then went all in on the Chicago futures .. bought some 1.2 million bushels of AAA Grade birdseed, which was delivered via the company's private jet to southern Panama somewhere, and has since vanished...... Needless to say, Larry's not happy.

:whistle:.

Last known photo of Escobar (ex head trader, BlackRock), before he went proper AWOL.

1710216831349.png

Regards ,
Esq.
 
  • Haha
  • Like
  • Fire
Reactions: 27 users

hotty4040

Regular
Oh Buddy, have you got it ALL WRONG!!!
"that isn't doing well today?"
I don't need anybody to ask for my decision to buy or sell.
Did I sell when it went down 35% the other day? NO I DIDN'T, I brought more, DID YOU?
What a PATHETIC reply to my question.
Thought I would get a more reasonable reply from you, Mr Rob .unt


Now, now, let's not get toooo overly exuberant. It was only an opinion, when all is said and done. Now that wasn't a very nice or conciliatory reply at all, now, was it. There's absolutely no reason for you to respond that way. Do you not think that the label "pathetic" might also apply to your posting as well? ... Very childish IMHO.

And also, you didn't "brought more", you in actual fact (BOUGHT MORE) ok........

I mean, where has the respect gone from some of us these days, I wonder at times.

Your attitude needs some examination, don't you think...


Akida Ballista comrades, >>>>> Not long now <<<<<

hotty...
 
  • Like
  • Love
  • Thinking
Reactions: 11 users

Diogenese

Top 20
All of a sudden there are more buyers than sellers
21 million sold down - 2 million bought up - that's an awful lot of sprats for a very modestly sized mackerel.
 
  • Like
  • Haha
  • Fire
Reactions: 17 users
An interesting article about how difficult it is for AI startups to recruit AI talent:


"I tried to hire a very senior researcher from Meta, and you know what they said? 'Come back to me when you have 10,000 H100 GPUs'," Srinivas said on a recent episode of the advice podcast "Invest Like The Best."

"That would cost billions and take 5 to 10 years to get from Nvidia," Srinivas said."

"The CEO added that even if smaller firms like Perplexity are finally able to get Nvidia's chips, they'll continue to fall behind because of AI's rapid speed of development.

That could make it even harder to secure AI talent in the future.

"By the time you waited and got the money and booked the cluster and got it, the guys working here will have already made the next-generation model," Srinivas said, referring to AI talent at major tech companies."
 
  • Like
  • Wow
Reactions: 10 users

skutza

Regular
Hey Skutza, great advice, except I'm not the one who needs an audience.

I did take it private; he didn't respond to me there. Instead, he screenshotted my private messages and posted them on the forum.

In fact, why didn't you post this piece of advice to me privately 🤔..
LOL, I kinda thought that myself when I read it back. Kind of hypocrisy, right?
 
  • Like
Reactions: 3 users

mrgds

Regular
An interesting article about how difficult it is for AI startups to recruit AI talent:
Sounds just like the Australian Defence system. :rolleyes:

Anyway, nice volume, beatdown, then recovery to finish GREEN TODAY (y)

Time for a run @Bravo
Screenshot (81).png




AKIDA BALLISTA
 
  • Like
  • Haha
  • Fire
Reactions: 9 users

MDhere

Top 20
Your first prediction was right, @Esq.111. Sunny day, no clouds, as we are -
Screenshot_20240312-154337_Google.jpg
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 12 users

IloveLamp

Top 20
🤔

1000014084.jpg
 
  • Like
  • Thinking
  • Fire
Reactions: 5 users

Adam

Regular
Very special people used that toilet. Look at the width of the toilet paper.
Duh! And it's not even ironed! Methinks the SQL DB crapped itself... Select * from Downrampers > 2
 
  • Haha
Reactions: 2 users

1710231723875.png



I received many great questions from the community in response to my recent post on neuromorphic computing, so I’ll jump right in and answer a few.

How does a more powerful processor increase energy efficiency?

#AI is already used in advanced driver assistance systems (ADAS) and infotainment, and the complex calculations are currently performed on traditional CPUs, GPUs and NPUs, which are not energy efficient. #Neuromorphiccomputing requires less energy for the same tasks. As the number of AI functions continues to increase, the greater computing efficiency of neuromorphic hardware will require less energy in comparison to legacy hardware. Reduced energy usage will also increase vehicle range and improve sustainability.

When can I experience neuromorphic computing?

Widespread use of neuromorphic computing will depend on many factors. The technology requires new programming and algorithms, so it will not immediately replace traditional processors. One key factor for us is that automotive-grade chips must meet extremely strict reliability requirements. However, we are already actively working to drive development and we are committed to being the first to use this technology in the automotive industry.

If you haven’t read the article yet, check it out here https://lnkd.in/epnUc5Sy. Be sure to ask more questions so we can keep the conversation going.



Neuromorphic computing? We’ve got that. 😎

Because it’s still nascent technology, I am frequently asked to describe #neuromorphic computing. It is a paradigm shift for how we perform computations in machine learning (#ML) and artificial intelligence (AI), which process massive amounts of data requiring tons of fast memory.

Currently available processor architecture separates data calculations from system memory, which is inefficient. The biological inspiration for neural networks is the human brain, where computing and memory are combined, and data processing uses neurons to communicate through electrical signals and chemical processes known as neurotransmitters.

In neuromorphic computing, those human neurons and synapses are modelled in circuits and communication is event-driven, with information coded in spikes, mimicking the processing fundamentals of the brain. Those spikes propagate through a Spiking Neural Network of artificial neurons and synapses to predict results. Information processing is measured by spike rate or spike time instead of the number of calculations. Thus, neuromorphic chips are more energy efficient and have lower latency than conventional CPUs and GPUs. That means much faster computation using considerably less power.

However, this change in data processing also requires new software algorithms specifically designed to work with neuromorphic hardware. Existing algorithms can only partially leverage the many benefits of neural technology. Thanks to Valerij, Alexander, Christina in the Innovations & Future Technology area and the rest of our team for tackling this huge project!

𝗔𝗽𝗽𝗹𝗶𝗰𝗮𝘁𝗶𝗼𝗻𝘀

Neuromorphic computing reduces the power required for advanced AI computation, which is useful in applications where energy is limited, like electric vehicles. However, we still need automotive-grade chips with neuromorphic technology before this technology becomes common in cars.

We at Mercedes-Benz AG are currently working on novel algorithms that take advantage of neuromorphic computing to improve the energy efficiency and performance of our cars. Our primary goals are to extend vehicle range, make safety systems react faster, and increase the number of #AI functions possible. We joined the #Intel Neuromorphic Research Community back in 2020, and since then we have been continuously expanding our collaborations with other research partners and universities to ensure our software and hardware solutions continue to lead the industry.

It's an exciting time to be in the world of automotive technology. Please share any questions and comments below.
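For anyone wondering what "information coded in spikes" actually looks like, below is a minimal leaky integrate-and-fire (LIF) neuron sketched in Python. It's a toy illustration only; the weight, leak and threshold values are made up for the example, and it is not anyone's actual implementation.

# Toy leaky integrate-and-fire (LIF) neuron: the membrane potential integrates
# weighted input events, leaks between events, and fires a spike when it
# crosses a threshold. All constants here are illustrative.
def lif_neuron(input_events, weight=0.6, leak=0.9, threshold=1.0):
    potential = 0.0
    output_spikes = []
    for event in input_events:
        potential = potential * leak + weight * event  # integrate + leak
        if potential >= threshold:                     # threshold crossing -> spike
            output_spikes.append(1)
            potential = 0.0                            # reset after the spike
        else:
            output_spikes.append(0)
    return output_spikes

# Event-driven input: between events the neuron does essentially nothing,
# which is where the energy saving described above comes from.
print(lif_neuron([0, 1, 0, 1, 1, 0, 0, 1, 1, 1]))

Rate coding is then just counting the output spikes over a window, while time coding looks at when they occur, which matches the "spike rate or spike time" point in the post.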
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 90 users

I received many great questions from the community in response to my recent post on neuromorphic computing, so I’ll jump right in and answer a few.
Brilliant
 
  • Like
Reactions: 12 users

I received many great questions from the community in response to my recent post on neuromorphic computing, so I’ll jump right in and answer a few.
Nice SG.

Toooooo scared to mention Akida in case the SP blows up again :ROFLMAO::cry:

Wasn't sure whether to laugh or cry.

Being part of the INRC, here's a thought.

Given we are not a brand name in the wider scheme of things and Intel are for consumers (today's world & consumer is sadly aligned to brands), I wonder if our association with IFS is a designed pathway that could allow MB to use us via an Intel name eventually, and to complete the requisite testing and certification of the chips in due course.
 
Last edited:
  • Like
  • Fire
  • Thinking
Reactions: 44 users

Iseki

Regular
I know we are all waiting with bated breath for Mercedes-Benz to announce their third chip partner. And good for M-B when they do.
What beats me is: shouldn't we also be beating a path to Boeing? Haven't they just experienced a couple of years of catastrophic events that have destroyed much of their credibility? Events that would have been avoided by the correct interpretation of sensor data.

Who here would willingly fly on a Boeing 737-8 MAX? Not many, I would think.

Surely Boeing would be open to talking about a Boeing.OS built around sensor fusion, where sensor data can be uniformly combined so that each sensor's data is contextualized amongst the network of other sensor data to give the all clear or a red flag - e.g. yes, there can be vibration at take-off; no, the window should not be open at 33,000' etc. This fusion is meant to be something that SNN systems are good at - you can add more and more sensors to the jet's fusion network in a stable fashion.
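To make the contextual fusion idea concrete, here's a toy sketch in Python. The sensor names, thresholds and rules are invented for the example, drawn only from the scenarios above; it is not anything Boeing, M-B or BrainChip actually uses, and a real SNN would learn these relationships rather than hard-code them.

# Toy contextual sensor fusion: each reading is judged against the state of the
# other sensors, not in isolation. Sensor names and limits are invented.
def fuse(readings):
    flags = []
    # Vibration alone is normal at take-off, but suspicious once cruising.
    if readings["vibration_g"] > 0.5 and readings["phase"] != "takeoff":
        flags.append("vibration out of context for flight phase")
    # A door/plug sensor only becomes a red flag above a safe altitude.
    if readings["door_open"] and readings["altitude_ft"] > 10_000:
        flags.append("door open at altitude")
    # Cabin pressure must agree with altitude.
    if readings["altitude_ft"] > 30_000 and readings["cabin_alt_ft"] > 12_000:
        flags.append("cabin pressure inconsistent with altitude")
    return ("RED FLAG", flags) if flags else ("ALL CLEAR", [])

print(fuse({"phase": "cruise", "altitude_ft": 33_000, "cabin_alt_ft": 8_000,
            "vibration_g": 0.2, "door_open": True}))
# -> ('RED FLAG', ['door open at altitude'])

Adding another sensor is just another input to the same network, which is the "stable fashion" point above.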

Surely, with the way things are going at Boeing, this would attract some interest. Surely the work we are doing with M-B, if it is proceeding, would be relevant, and Boeing needs to spend up now to save their reputation and future sales.
 
  • Like
  • Fire
Reactions: 8 users

suss

Regular
Nice SG.
Sean mentioned the Mercedes NDA in that interview last week.
Exciting development ahead, it's all good!
 
  • Like
  • Love
  • Fire
Reactions: 29 users

Pepsin

Regular
  • Like
  • Thinking
  • Wow
Reactions: 19 users