To understand a bit more about what they do, here's a video from mid-2024. I think we must be losing our dot-joining skills... this release from the beginning of December shows DeGirum already supporting our Akida.
I also found it interesting that DeGirum uses the M.2 form factor with Orca, and now we've said we're doing M.2.
DeGirum Delivers Industry's First Hardware-Agnostic PySDK - Simplifies Edge AI Development
News provided by
DeGirum Corp.
Dec 03, 2024, 07:00 ET
Focused on Vision Processing, Industrial, Automotive, and Smart Retail Applications Development
SANTA CLARA, Calif., Dec. 3, 2024 /PRNewswire/ -- DeGirum®, a leader in edge AI, announced the release of its hardware-agnostic PySDK, designed to streamline AI development at the edge. DeGirum's PySDK provides a unified software stack that allows developers to add AI to any application with just two lines of code. PySDK supports a wide range of hardware:
Accelerators:
- NVIDIA® GPUs
- Hailo®
- MemryX®
- Google® Edge TPU
- BrainChip®

Application Processors:
- Intel®
- AMD®
- Texas Instruments®
- NXP®
- Rockchip®
- Raspberry Pi®
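For flavor, here's a toy sketch of the hardware-agnostic idea behind the "two lines of code" claim. This is not DeGirum's actual PySDK API; the function, class, and backend names here are invented for illustration. The point is a single loader entry point that dispatches to whichever backend is registered, so application code stays identical across accelerators:

```python
# Toy sketch of a hardware-agnostic model loader.
# All names are hypothetical, not DeGirum's real PySDK API.
BACKENDS = {}

def register_backend(name):
    """Decorator that records a runner class under a backend name."""
    def wrap(cls):
        BACKENDS[name] = cls
        return cls
    return wrap

@register_backend("cpu")
class CpuRunner:
    def infer(self, data):
        # Placeholder "inference": just sum the input on the host CPU.
        return sum(data)

@register_backend("akida")
class AkidaRunner:
    def infer(self, data):
        # Placeholder standing in for dispatch to a neuromorphic accelerator.
        return sum(data)

def load_model(backend="cpu"):
    """The application-facing call stays the same whatever the hardware."""
    return BACKENDS[backend]()

# The "two lines of code" pattern: load a model, run inference.
model = load_model("akida")
result = model.infer([1, 2, 3])
print(result)  # 6
```

The design choice being illustrated is that only the registry changes when new silicon is added; the application's two lines never do.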
How many podcasts have there been so far at this year's CES?
Maybe 4?
This is worth watching. It’s Steve Brightfield from BrainChip talking to the Gadget Man at CES today. Starts at around the 8 min mark. Enjoy!
I promise that you will be excited by the end of it.
Yeah, and we are still potentially part of the Nanose gig (part of claim 65, anyway). Nearly 4 years and still pending. I guess the medical trials take a very long time to conduct?
US20230152319A1 - Device and method for rapid detection of viruses - Google Patents
The invention proposes an approach utilizing novel and artificially intelligent hybrid sensor arrays with multiplexed detection capabilities for disease-specific biomarkers from the exhaled breath of a subject. The technology provides a rapid and highly accurate diagnosis in various COVID-19...
I liked the continuation of that, where they mentioned something about a company having their own university program to retrain their staff for it. Cheers Slade.
Those are some gold nuggets.
Not the exact quote, but I like it:
"We've signed more partners, which we can't mention."
Learning!
TENNs-PLEIADES: Building Temporal Kernels with Orthogonal Polynomials
Introducing TENNs-PLEIADES (PoLynomial Expansion in Adaptive Distributed Event-based Systems), an innovative method within the TENNs (Temporal Event-based Neural Networks) family.
TENNs-PLEIADES excels at directly performing spatio-temporal classification on data from event-based sensors. These sensors offer ultra-low latency, and until now, processing their signals in a principled manner has been a significant challenge. By utilizing a unique kernel parameterization based on orthogonal polynomials, TENNs-PLEIADES achieves state-of-the-art performance across three distinct datasets: Hand Gesture Recognition, Eye Tracking, and PROPHESEE's 1 Megapixel Automotive Detection.
TENNs-PLEIADES is fully supported on Akida 2.0 IP and represents the latest advancement in event-based and neuromorphic processing.
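To give a rough sense of what "kernel parameterization based on orthogonal polynomials" means (a minimal sketch, not BrainChip's implementation; the choice of Chebyshev polynomials, kernel length, and degree are all assumptions): instead of learning every tap of a temporal convolution kernel independently, the kernel is expressed as a linear combination of a few orthogonal polynomial basis functions evaluated along the kernel's time axis, so only the combination coefficients are learned:

```python
import numpy as np

# Time axis of the temporal kernel, mapped to [-1, 1], the interval on
# which Chebyshev polynomials are defined (polynomial family is an assumption).
kernel_len = 32
t = np.linspace(-1.0, 1.0, kernel_len)

# Basis matrix: column k is the Chebyshev polynomial T_k evaluated on t.
degree = 5
basis = np.polynomial.chebyshev.chebvander(t, degree)  # shape (32, 6)

# Instead of 32 free taps, the network would learn these 6 coefficients.
rng = np.random.default_rng(0)
coeffs = rng.normal(size=degree + 1)

# The temporal kernel is the polynomial expansion of the coefficients.
kernel = basis @ coeffs  # shape (32,)
print(kernel.shape)
```

The appeal of such a parameterization is that long temporal kernels are described by a handful of smoothly varying coefficients rather than one weight per time step.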
Absolutely no idea, @Bravo, well above my $2.19 pay rate. Must say some of the big words look impressive.
Oh, thank goodness! I was wondering when we were going to get around to leveraging our orthogonal polynomials to parameterize temporal convolutional kernels.
Do you know what this is about @Tothemoon24?
I struggle with opening my Kellogg's Coco Pops.
What?! Do you mean to say you didn't utilize your unique kernel parameterization for breakfast this morning? That's very disappointing.
From memory, it has been on the website for quite a while as a white paper.
But, like you, I have no idea what it all means.
I read an article at CES 2025 about a spoon that can measure the salt content in food to support health.
"
Edge computing has naturally been a hot topic at CES with companies highlighting a myriad of use cases where the pre-trained edge device runs inference locally to produce the desired output, never once interacting with the cloud. The complexity of these nodes has grown to not only include multimodal support with the fusion and collaboration between sensors for context-aware devices but also multiple cores to ratchet up the compute power.
Naturally, any hardware acceleration has become desirable with embedded engineers craving solutions that ease the design and development burden. The solutions vary where many veer towards developing applications with servers in the cloud that are then virtualized or containerized to run at the edge. Ultimately, there is no one-size-fits-all solution for any edge compute application.
It is clear that support for some kind of hardware acceleration has become paramount for success in breaking into the intelligent embedded edge. Company approaches to the problem run the full gamut from hardware accelerated MCUs with abundant software support and reference code, to an embedded NPU.
...."
Can someone more knowledgeable show the data inside this link I posted? So many NPUs... Where is our Akida "ghost" IP?