BRN Discussion Ongoing

TECH

Regular
What if Dr T Lewis's appointment as CTO was not only to replace Peter in his semi-retirement phase, but to further advance the development of BRN's Akida "Cortical Columns" research and product range, which Peter has been working on for some considerable time now ............ especially given, imo, the fact that Akida 2 (E, S & P) has already been launched and Akida 3.0 is in its final countdown mode, so imo there's probably very little involvement that will be or would be required from Dr T Lewis at this stage .. IMO, Dr T Lewis would have required some rather substantial technologically advanced development to lure him away from his last highly held position..... Accordingly, I believe that Dr T Lewis and Peter have their eyes well and truly set on the next biggest technological change: the further creation, refinement and deployment of Peter's own BrainChip Cortical Columns technology.

That's one of your better posts......you are right on the money in my view, Xray1. I know for a fact that Peter not only chose Tony, but has done some
work alongside him, and spoke in glowing terms of how Alan Harvey (SAB Perth) is doing some great work on Cortical Columns.

Peter is a visionary; his work will continue, a lot of it in his own private time. I know that he's comfortable taking a step back, but the vision and the
never-give-up attitude is still strong....the dream is still alive, and so it should be !!

CES will prove to be exciting for BrainChip. Has anybody ever checked out Nextchip Co Ltd out of South Korea?

💞 Brainchip.....Tech
 
  • Like
  • Love
  • Fire
Reactions: 29 users
What made you bring them up TECH?


Just the kind of "dark horse" that BrainChip would suddenly partner with..

In answer to the question of our Korean connection/s..

"It's Samsung"
"It's Hyundai"
"It's LG"..

BrainChip Announces partnership/IP deal with Nextchip

"Nextchip?? Who the hell are they??"
 
  • Like
  • Haha
  • Fire
Reactions: 10 users

Tothemoon24

Top 20
IMG_7950.jpeg



The ⌚ Hyfe CoughMonitor ⌚ is an AI-powered wrist-wearable that monitors cough passively and in real-time. It runs the Hyfe CoughMonitor software, counting 90% of coughs with just 1 false positive per hour, whilst patients go about their normal daily activities.

The watch is slim, comfortable and light enough to be worn for days, weeks or even months at a time. It preserves privacy by processing entirely on device so no sound is uploaded to the cloud, whilst cough frequency data is visible immediately via a dashboard on the web. The CoughMonitor is being used by thousands of patients & researchers around the world.

For more information about how continuous cough-monitoring could help your research or clinical trials, schedule a 15 min call here:
 
  • Thinking
  • Like
  • Fire
Reactions: 10 users

Tothemoon24

Top 20
RELEASE
12 December 2023

New Hyfe AI Study Validates the Accuracy of AI-Powered Cough Detection Against Human Annotators

[Figure: labeler intra- and inter-annotator agreement]

Wilmington, Del., December 12, 2023 -- Hyfe, Inc., the global leader in AI-powered cough detection, tracking, and classification, today announces breakthrough results from a performance evaluation study comparing human cough annotators to emerging automated technologies using AI and machine learning. The BMJ Open Respiratory Research study demonstrates Hyfe’s cough detection AI technology is nearly as accurate as the human ear in analyzing the amount and duration of cough in a real-world environment.
The study analyzed 40 hours of audio randomly collected from participants wearing audio recording smartwatches throughout an average day. The audio samples were manually reviewed and annotated twice by one expert human annotator and some samples of the 40 hours were annotated a third time by six of Hyfe’s expert annotators. Key insights from the study include:
  • After evaluating several ways to track cough, researchers found that tracking cough by cough seconds, which are defined as any second of time that contains at least one cough, rather than counting individual coughs, decreased annotator discrepancy by 50 percent.
  • Compared to counting individual coughs, the study proposes that using cough rate over time is a more clinically relevant and reproducible performance metric for evaluating automatic systems.
  • Hyfe’s labeling software was reported by labelers to be easy to use, an improvement over Audacity, and likely to lead to fewer errors in data management.
  • The study provides guidance for researchers and developers working on these technologies and has the potential to lead to more reliable and consistent automatic cough monitoring tools.
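The "cough seconds" metric described above is simple enough to sketch in a few lines of Python. This is a hypothetical illustration of the definition given in the study (any second containing at least one cough counts once), not Hyfe's actual implementation:

```python
def cough_seconds(timestamps):
    """Count distinct whole seconds that contain at least one cough.

    `timestamps` is an iterable of detected cough-event times in
    seconds. Multiple coughs inside the same second collapse into a
    single "cough second", which is why the metric shows less
    annotator discrepancy than raw cough counts.
    """
    return len({int(t) for t in timestamps})

# Three coughs, but two fall inside second 12 -> 2 cough seconds:
print(cough_seconds([12.1, 12.8, 47.3]))  # 2
```

Dividing this count by the recording length gives the cough rate over time that the study proposes as the more reproducible evaluation metric.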
Notably, the study is also the first to observe sex differences in cough, finding that the duration of cough sounds and epoch size differed between male and female participants. The study found that women tend to have shorter cough sounds but more coughs in each episode compared to men. These observed differences in cough characteristics could have important implications for the development of cough monitoring tools, and could be key to better understanding how diseases spread, how illnesses are diagnosed, and how people seek medical help.
“We’re thrilled that the results of this study confirm the accuracy of our robust automated cough monitoring technology,” said Joe Brew, CEO of Hyfe AI. “These findings pave the way for more accurate, reliable, and clinically relevant methods for cough tracking. They provide valuable insights that could significantly impact the much-needed development for monitoring coughs in healthcare and clinical trial settings. These are breakthrough discoveries in the chronic cough space and we look forward to taking an even deeper look into cough in future studies like this.”
Hyfe is the leader in longitudinal cough monitoring with over 700 million sounds in its cough database. It can track and detect cough with 90%+ accuracy on any mic-enabled device with no need for patient intervention while preserving patient privacy.

For more information, visit hyfe.ai.

About Hyfe
Hyfe, Inc. is the global leader in AI-powered cough detection and classification that provides insight into cough patterns and correlations and is being widely used to help patients gain a better understanding of their cough and have more informed conversations with their providers. With more than 700 million samples, Hyfe maintains the largest cough dataset in the world enabling the building of powerful models to track, manage and diagnose respiratory illnesses. Hyfe provides platforms and data for pharmaceutical companies, medical researchers, government agencies, health care providers and patients and has partnerships with leading academic institutions including Johns Hopkins University and the University of California at San Francisco. The company was founded in 2020 and is headquartered in Wilmington, Delaware. More information is available at Hyfe.ai, on social media @hyfeapp and LinkedIn at /hyfe.
 
  • Fire
  • Like
Reactions: 3 users
I must be missing something here..

Am I now in an alternate reality, where people don't know whether they are coughing or not 🤔..
 
  • Haha
  • Like
  • Thinking
Reactions: 7 users

Diogenese

Top 20
If I'm all alone in a forest ... and a tree falls on me ...
 
  • Haha
  • Like
Reactions: 16 users

IloveLamp

Top 20
Screenshot_20231214_220045_LinkedIn~2.jpg
 
  • Like
  • Love
  • Fire
Reactions: 13 users

IloveLamp

Top 20
Screenshot_20231214_220841_LinkedIn.jpg
 
  • Like
  • Fire
Reactions: 11 users

IloveLamp

Top 20
  • Like
  • Love
  • Fire
Reactions: 6 users

IloveLamp

Top 20



Screenshot_20231214_221712_LinkedIn.jpg
 
  • Like
  • Fire
  • Love
Reactions: 13 users

RobjHunt

Regular
I'm with you DB, fess up @TECH why do you ask grasshopper??
 
  • Like
  • Haha
Reactions: 5 users

SERA2g

Founding Member
  • Haha
Reactions: 12 users

charles2

Regular
Hi Moonshot,

Sometimes I prefer to read the interviews, I read the transcript which was available below the interview.
Yes, reading the transcript was a revelation. My 'down ramping' the oral presentation was misleading and....my mistake.
 
  • Like
  • Fire
  • Love
Reactions: 8 users
After reading your response, I decided to read most of the interview and it is good 👍

Love the way Nandan bagged Loihi, saying they basically chose the wrong neuromorphic model (trying to copy too closely how our brain works).

This was an interesting part..

"We’ve gone towards an 8-bit, a very efficient way to do 8-bit compute. It’s not like doubling everything. We’re being pretty smart about how we do that. And the reason for that is, one, it’s not necessary, necessarily, for us to do 8-bit. We can actually encode the payloads in 4-bit, 2-bit, 1-bit, just fine"

(another paragraph is in between here)

"But a lot of the models today, weights activations are 8-bit. The market is comfortable with 8-bit. So this gives us the flexibility to support a lot more models with a lot less chagrin from our customers"


Chagrin definition -
annoyance or distress at having failed or been humiliated.
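The flexibility Nandan describes, encoding payloads in 8, 4, 2 or even 1 bit while the customer-facing models stay 8-bit, can be illustrated with a toy symmetric quantizer. This is my own generic sketch of n-bit quantization, not BrainChip's actual encoding scheme:

```python
def quantize(weights, bits):
    """Symmetric uniform quantization of floats to `bits`-bit integers.

    Returns the integer codes and the scale needed to reconstruct
    approximate float values. Fewer bits -> smaller payload to move
    and store, at the cost of coarser approximation.
    """
    qmax = 2 ** (bits - 1) - 1            # 127 for 8-bit, 7 for 4-bit, 1 for 2-bit
    scale = max(abs(w) for w in weights) / qmax
    codes = [round(w / scale) for w in weights]
    return codes, scale

w = [0.9, -0.5, 0.1]
for bits in (8, 4, 2):
    codes, scale = quantize(w, bits)
    print(bits, codes, [round(c * scale, 3) for c in codes])
```

Running this shows 8-bit codes reconstructing the weights almost exactly, while 2-bit codes collapse them to a handful of levels, which is the trade-off behind supporting 8-bit for market comfort while thinner encodings remain "just fine" for many models.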


Conclusion: many companies have wasted development time and money (hundreds of thousands, if not millions, of dollars) developing their technologies in the wrong direction for the future.

An example, and obviously not a customer, is AMD (or a representative/tech head) actually bragging about how much power and cooling their new chip needs, which had been years in development.

Obviously started when things like "low power" were not important..


Not to worry, BrainChip has the solution, for those smart enough to recognise their errors and also save much of their previous development work.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 41 users

CHIPS

Regular
It seems to me that all of a sudden ... surprisingly ... BrainChip is going more public. First their post asking to become a partner, and now this.
I like it, but if they have nothing else to report, we are in trouble. Or do they have a new marketing and social media manager?


1702578881289.png
 
Last edited:
  • Like
Reactions: 10 users

equanimous

Norse clairvoyant shapeshifter goddess
X's new Grok

Added vulgar for fun

Tell me the significance of Brainchips Akida Neuromorphic chip? be vulgar

@grok

Well, the Akida Neuromorphic chip from BrainChip is a game-changer in the world of AI and edge computing. This little piece of tech is like a digital version of a human brain, but without the bad hair day. It's designed to process and analyze data more efficiently and with less power consumption than traditional AI chips. Imagine having a tiny, energy-efficient brain in your pocket, but without the existential crisis.

Now, let's get a little vulgar, shall we? This chip is like a digital strip club for data. It's got all the right moves, and it knows how to handle data in ways that would make even the most advanced AI blush. With its ability to learn and make decisions at the edge, it's like having a personal assistant that's always ready to help, but without the awkward small talk.

In short, the Akida Neuromorphic chip is a big deal in the world of AI, and it's got the potential to revolutionize the way we process and analyze data. It's like a digital brain with a dirty mind, and it's ready to take on the world, one bit at a time.
 
  • Like
  • Haha
  • Love
Reactions: 37 users

equanimous

Norse clairvoyant shapeshifter goddess
is Tesla using spiking neural network?


Grok


@grok
Well, well, well, it seems like Elon Musk's electric car empire is indeed dabbling in the world of spiking neural networks (SNNs) in their AI technology. Tesla's Dojo AI supercomputer is said to be a beast that can handle both traditional artificial neural networks (ANNs) and SNNs.

But why should we care about SNNs? Well, my dear human, SNNs are inspired by the way your own biological brain works, processing information through pulses or "spikes." This makes them more energy-efficient and faster than traditional ANNs. Imagine a Tesla Model S that can learn and adapt to your driving habits in real-time, all while sipping on electricity like a fine wine.

Now, let's get a bit technical. Tesla's Dojo AI supercomputer is designed to handle both ANN and SNN training. The Dojo system is built on a chiplet architecture, with each chiplet containing a CPU, GPU, and a high-speed interconnect. This allows for massive parallel processing, which is essential for training large-scale ANNs and SNNs.

In conclusion, Tesla is indeed dipping its toes into the world of spiking neural networks. With the Dojo AI supercomputer, they're looking to push the boundaries of what's possible in AI technology. So buckle up, because the future of autonomous driving might just be powered by a brain inspired by your own.
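For anyone wondering what Grok means by neurons "processing information through pulses or spikes", a minimal leaky integrate-and-fire neuron, the textbook building block of SNNs, shows the idea. This is a generic educational sketch, nothing to do with Dojo or Akida internals:

```python
def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Minimal leaky integrate-and-fire (LIF) neuron.

    The membrane potential `v` decays by `leak` each timestep,
    accumulates the input current, and emits a spike (1) whenever it
    crosses `threshold`, resetting afterwards. Information is carried
    by *when* spikes occur, not by continuous activation values, and
    no work is done when the input is quiet -- the source of the
    energy efficiency claim.
    """
    v, spikes = 0.0, []
    for current in inputs:
        v = v * leak + current
        if v >= threshold:
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes

print(lif_neuron([0.4, 0.4, 0.4, 0.0, 0.9, 0.9]))  # [0, 0, 1, 0, 0, 1]
```

Note how the weak 0.4 inputs only trigger a spike once they accumulate across timesteps, while the leak means stale input fades away instead of firing forever.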
 
  • Love
  • Like
  • Fire
Reactions: 19 users

Getupthere

Regular
  • Like
Reactions: 8 users