BRN Discussion Ongoing

cosors

👀
Last edited:
  • Haha
  • Like
  • Wow
Reactions: 10 users

cassip

Regular
Update SP from Germany (afternoon):

Concerning SP one can say: "the squirrel feeds laboriously" - "mühsam ernährt sich das Eichhörnchen" (German proverb)

nevertheless: SP at 16:11 up 1.63%, i.e. €0.4498 (AU$0.6985); volume went up to 43k (at Tradegate; a few small parcels at Stuttgart, Frankfurt, Munich etc.).

Maybe German Angst was triggered by one sentence in Markus Schäfer's LinkedIn article after mentioning BRN and Intel: "So, you see, despite impressive advances, there is still a very long way to go."
 
  • Like
  • Fire
  • Love
Reactions: 16 users

Sirod69

bavarian girl ;-)
Edge Impulse
30,315 followers
1 hour ago
Apple has unveiled its M2 Pro and M2 Max SoCs, featuring a more powerful CPU and GPU, new Neural Engine, next-generation image signal processor, and what the company claims is “industry-leading” power efficiency.

Only interesting because of Edge Impulse and Apple, I think.


 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 27 users

Bravo

If ARM was an arm, BRN would be its biceps 💪!
Update SP from Germany (afternoon):

Concerning SP one can say: "the squirrel feeds laboriously" - "mühsam ernährt sich das Eichhörnchen" (German proverb)

nevertheless: SP at 16:11 up 1.63%, i.e. €0.4498 (AU$0.6985); volume went up to 43k (at Tradegate; a few small parcels at Stuttgart, Frankfurt, Munich etc.).

Maybe German Angst was triggered by one sentence in Markus Schäfer's LinkedIn article after mentioning BRN and Intel: "So, you see, despite impressive advances, there is still a very long way to go."
Hi @cassip, it’s a pity if that’s the case. I thought the point that Markus was trying to make was there’s still a long way to go to making an actual “brain on a chip”, meaning a chip that has exactly the same functioning power as the human brain, which has 80-100 billion neurons. That’s a very high bar to set and who knows if it will ever be achieved.

But he does go on to state that advancements such as those being made by BrainChip and Intel (1 million neurons) are “impressive” and he elaborates by saying “The thing is, even a tiny fraction of the thinking capacity of the human brain can go a long way in several fields that are extremely relevant to automotive applications. Examples include advanced driving assistance systems #ADAS as well as the on-board analysis of speech and video data, which can unlock major advances in how we communicate with our cars.”

I’m also very excited that he says “As AI and machine learning take on an increasingly important role in the software-defined vehicle, the energy this consumes is likely to become a critical.”

Can’t wait to read the latest “findings” in the upcoming “In the Loop” to hear his thoughts on where this is taking us. This suggests to me that they have numerous use cases to discuss.
 
  • Like
  • Fire
  • Love
Reactions: 41 users

Sirod69

bavarian girl ;-)
I love it! 🥰😘

brn.jpg
 
  • Like
  • Fire
  • Love
Reactions: 40 users

cassip

Regular
Hi @cassip, it’s a pity if that’s the case. I thought the point that Markus was trying to make was there’s still a long way to go to making an actual “brain on a chip”, meaning a chip that has exactly the same functioning power as the human brain, which has 80-100 billion neurons. That’s a very high bar to set and who knows if it will ever be achieved.

But he does go on to state that advancements such as those being made by BrainChip and Intel (1 million neurons) are “impressive” and he elaborates by saying “The thing is, even a tiny fraction of the thinking capacity of the human brain can go a long way in several fields that are extremely relevant to automotive applications. Examples include advanced driving assistance systems #ADAS as well as the on-board analysis of speech and video data, which can unlock major advances in how we communicate with our cars.”

I’m also very excited that he says “As AI and machine learning take on an increasingly important role in the software-defined vehicle, the energy this consumes is likely to become a critical.”

Can’t wait to read the latest “findings” in the upcoming “In the Loop” to hear his thoughts on where this is taking us. This suggests to me that they have numerous use cases to discuss.
Hi @Bravo,

thank you for your reply. I agree. Had just expected that there was more attention and excitement today after his article and tried to find an explanation.

It is a great statement and how to interpret it is just like @Diogenese commented imo.

Cheers
Cassip
 
  • Like
  • Fire
Reactions: 11 users

Getupthere

Regular

Dell launches latest PowerEdge servers with newest Intel processors


Dell has launched its latest Dell PowerEdge Servers using the latest family of 4th Gen Intel Xeon Scalable processors.


The Round Rock, Texas-based company said the new servers offer up to 2.9 times greater AI inferencing performance.


Rajesh Pohani, a Dell vice president, said in a press briefing that it’s all about energy efficiency, security, reliability and digital transformation. He said that the majority of IT managers are planning to deploy more technology at the edge as well as in the cloud.


Dell is showing off 13 models of its next-generation Dell PowerEdge servers, designed to accelerate performance and reliability for powerful computing across core datacenters, large-scale public clouds and edge locations.


It is unveiling rack, tower and multi-node PowerEdge servers, with Dell software and engineering advancements, such as a new Smart Flow design, to improve energy and cost efficiency. Expanded Dell APEX capabilities will help organizations take an as-a-service approach, allowing for more effective IT operations that make the most of compute resources while minimizing risk, the company said.


“We’re refreshing the core portfolio with the latest core technology like” the Intel and Advanced Micro Devices chips, Pohani said.


New Dell PowerEdge servers are designed to meet the needs of a range of demanding workloads, from AI and analytics to large-scale databases. The expanded portfolio announced in November 2022, including the PowerEdge XE family of servers with Nvidia H100 Tensor Core GPUs and the Nvidia AI Enterprise software suite for a full-stack, production AI platform, builds on advancements in artificial intelligence and machine learning.


New servers for cloud service providers


Dell introduced its PowerEdge HS5610 and HS5620 servers, which deliver optimized solutions tailored for cloud service providers managing large-scale, multi-vendor data centers. Available in both 1U (one rack unit) and 2U form factors, these new two-socket servers include cold-aisle-serviceable configurations and are available with Dell Open Server Manager, an OpenBMC-based systems management solution to simplify multi-vendor fleet management.


The servers provide improved performance, including the Dell PowerEdge R760, which delivers up to 2.9 times greater AI inferencing on 4th Gen Intel Xeon Scalable processors with Intel Deep Learning Boost and Intel Advanced Matrix Extensions.


The PowerEdge R760 also offers up to a 20% increase in VDI users and over 50% more SAP Sales & Distribution users on one server, compared to the previous generation. PowerEdge systems may be ordered with Nvidia BlueField-2 data processing units to provide additional offload, acceleration and workload isolation capabilities, ideal for power efficiency in private, hybrid and multicloud deployments.


Dell has added monitoring software and new services to make server management easier. It has Dell CloudIQ, ProDeploy services, iDRAC9, and more to make it easier to deploy systems.


“With improvements in genomic sequencing technology and new methods in the lab driving data growth, data flows will continue to expand in the future. To ensure our continued innovation, we need to process data quickly and efficiently,” said Pete Clapham, informatics support group leader at Wellcome Sanger Institute, in a statement. “Dell PowerEdge servers are well-designed, have built-in security, and deliver the performance that allows us to accelerate scientific discovery and bring innovation to the world faster.”


Designed for sustainability


Dell PowerEdge servers are designed with sustainability in mind, offering customers a three times performance improvement, compared to 14th Generation PowerEdge servers with Intel Xeon Scalable processors launched in 2017, resulting in less floor space required and more powerful and efficient technology across all next-generation systems.


The features include the Dell Smart Flow design and Dell OpenManage Enterprise Power Manager 3.0 software.


“Today’s modern data center requires continuous performance improvements for complex workloads such as AI, ML and VDI,” said Kuba Stolarski, research vice president at IDC Enterprise Infrastructure Practice, in a statement. “As data center operators endeavor to keep up with the demand from these resource-hungry workloads, they must also prioritize environmental and security goals. With its new Smart Flow design, coupled with enhancements to its power and cooling management tools, Dell offers organizations significant improvements in efficient server operation alongside the raw performance gains in its newest generation of servers.”


Reliability and security at the core


PowerEdge servers help accelerate Zero Trust adoption within organizations’ IT environments. The devices constantly verify access, assuming every user and device is a potential threat. At the hardware level, a silicon-based hardware root of trust, with elements including Dell Secured Component Verification (SCV), helps verify supply chain security from design to delivery. Additionally, multifactor authentication and integrated iDRAC verify users before granting access.


A secure supply chain also enables customers to advance their Zero Trust approach. Dell SCV offers cryptographic verification of components, which extends supply chain security to the customer’s site.


The servers are debuting in February and April, with compute services arriving in the second half of 2023.
 
  • Like
  • Fire
  • Love
Reactions: 15 users

wilzy123

Founding Member
A computing hardware approach aspires to emulate the brain.

An article in this month's Physics Today magazine featuring Loihi and SpiNNaker.
 

Attachments

  • pt.3.5155.pdf
    872.4 KB · Views: 207
Last edited:
  • Like
  • Love
  • Fire
Reactions: 8 users

Learning

Learning to the Top 🕵️‍♂️
Just (ICYMI): BrainChip sharing Markus Schäfer’s post. Great advertisement! Thanks Markus.

It had been mentioned already, but it’s fantastic for a CTO of Mercedes to reference only ‘BrainChip and Intel’ on neuromorphic computing 😎🎉🥳

Screenshot_20230118_033537_LinkedIn.jpg


Learning šŸ–
 
  • Like
  • Love
  • Fire
Reactions: 47 users

IloveLamp

Top 20
  • Like
  • Fire
Reactions: 14 users

BaconLover

Founding Member
Good morning everyone.

Can we stop sharing Motley Fool articles here?

BRN holders share it everywhere and then we complain that picklebro writes non stop about Brainchip.

They get maximum exposure with Brainchip when holders write and discuss their articles constantly. They have every reason to write it because we create the buzz, make a platform and share it far and wide.

Stop posting their bs articles here and then the buzz goes away and they'll find another company.

This is my "soft" opinion only, you can continue to share it if you wish, thought I'd make the suggestion 😉.
 
  • Like
  • Love
  • Fire
Reactions: 120 users

Sirod69

bavarian girl ;-)
Good morning everyone.

Can we stop sharing Motley Fool articles here?

BRN holders share it everywhere and then we complain that picklebro writes non stop about Brainchip.

They get maximum exposure with Brainchip when holders write and discuss their articles constantly. They have every reason to write it because we create the buzz, make a platform and share it far and wide.

Stop posting their bs articles here and then the buzz goes away and they'll find another company.

This is my "soft" opinion only, you can continue to share it if you wish, thought I'd make the suggestion 😉.
I was thinking the same thing today. I haven't even read it.

Markus Schäfer’s post today was a great thing, although the market in Germany didn’t react too much. At Lang & Schwarz we are at +1.47% (€0.448); Tradegate was at +4.27% (€0.454). I wish you a good morning and a nice day. 🙋‍♀️
 
  • Like
  • Love
  • Fire
Reactions: 30 users
Traders can’t predict the market, but maybe their faces can…

In the AGE just now this morning. Note the University and the Technology involved. Also the timelines.

"The saying goes that our eyes are the window to the soul. Perhaps over time they’ll serve a less romantic purpose, as windows to making money.

Researchers at Carnegie Mellon University in Pittsburgh, one of the leading institutions for artificial-intelligence research, have embarked on a study using facial-recognition algorithms to track the expressions of traders. Their goal: finding correlations between mood swings and market swings. If the traders look enthusiastic, it might be time to buy. Are there more furrowed brows than usual? Could be time to sell. The provisional US patent application was filed on September 13, 2022"

See link for full article and more details.
Traders can't predict the market ..

Maybe Akida will come to our rescue as shareholders, creating value in ways we don’t imagine.
 
Last edited:
  • Like
  • Thinking
  • Fire
Reactions: 25 users

Lex555

Regular
Musk Ox is planning on using software NNs.

Musk Shares Details on FSD Beta v11: Neural Nets to Be Used for Vehicle Control

January 15, 2023
By Nuno Cristovao


https://www.notateslaapp.com/softwa...11-neural-nets-to-be-used-for-vehicle-control

...

Neural Nets for Vehicle Behavior

A week ago Musk said this upgrade will include 'many major improvements.' Last night Musk revealed some additional details. He said there will be "many small things," one of which will be that Tesla will begin to use neural nets for vehicle navigation and control, instead of just vision.

Today Tesla uses neural networks to determine the vehicle's surroundings, where objects are, what they are, and their distances from the vehicle to create a 3D environment known as 'vector space.' With this information, the vehicle can then plan a path and navigate around these objects toward its destination.

However, based on Musk's comment, it sounds like Tesla is currently only using neural nets to determine its environment and not for controlling the vehicle. This means that how the vehicle behaves, how it finds a path, and how it moves is still a process that is coded traditionally.

In the same way that Tesla uses millions of images to determine what a stop sign or traffic cone is, it sounds like Tesla will now use a large number of examples to determine how to best control the vehicle in various situations.

Surely he can't be doing mission critical functions on the internet.

Sounds like Akida could improve Tesla mileage by 100 km or more.
Interesting indeed, dio. If Akida improved efficiency by 100 km, a manufacturer such as Tesla could reduce pack size by ~20% for the same range. As of 2022, the cost of a battery pack was approximately US$138 per kWh.

Meaning for a Model S, the pack could be reduced by 20 kWh, which would save $2,760. That’s big money when manufacturing millions of cars a year.
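That back-of-the-envelope saving checks out; here is the arithmetic as a quick sketch (the $138/kWh and 20 kWh figures come from the post above; the annual production volume is purely a hypothetical for scale):

```python
# Rough battery-pack savings estimate using the figures quoted in the post.
COST_PER_KWH_USD = 138      # approximate 2022 pack cost per kWh (from the post)
PACK_REDUCTION_KWH = 20     # ~20% of a ~100 kWh Model S pack (from the post)

savings_per_car = PACK_REDUCTION_KWH * COST_PER_KWH_USD
print(savings_per_car)      # 2760

# Scaled to a hypothetical production volume of 1.5 million cars per year:
cars_per_year = 1_500_000
print(f"${savings_per_car * cars_per_year / 1e9:.2f}B per year")  # $4.14B per year
```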
 
  • Like
  • Love
  • Fire
Reactions: 24 users

Bravo

If ARM was an arm, BRN would be its biceps 💪!
Greetings Groovy People,

Check out this article published a few hours ago. It mentions the partnership between Prophesee and Datalogic and also discusses other companies using event-based vision systems, such as Sony and Nikon. Because such systems can significantly improve efficiency and increase the amount of data collected, they are going to be indispensable in automating a range of manufacturing processes, including counting, quality inspection, and predictive maintenance. It says here: “As investment in this space increases, the market is expected to drive growth in other industries at an exponential rate through at least 2030”.

Sweet!šŸÆ





Event-Based Vision: Where Tech Meets Biology

January 17, 2023
By Brad Marley, Contributing Editor
Vision-System.jpg

Machine vision systems are helpful in automating a range of manufacturing processes, including counting, quality inspection, and predictive maintenance. However, most vision systems in use today rely on frame-based image capture technology that has been around for more than a hundred years.
The next iteration in vision systems relies on what changes in a particular scene, or a specific “event” that happens. The technology takes cues from human biology, namely how efficiently eyes work to process massive amounts of visual data. Event-based vision is based on neuromorphic computing, in which machines process information much like how the brain processes information. This can significantly improve efficiency and increase the amount of usable data collected.
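For anyone curious what "change-driven" means in practice, here is a loose toy sketch (not Prophesee's or Sony's actual pipeline) that approximates an event stream by comparing two frames and emitting a signed event only where brightness changed beyond a threshold; the frames and threshold are entirely made up:

```python
import numpy as np

def frame_diff_events(prev, curr, threshold=15):
    """Emit (row, col, polarity) events where brightness changed enough.

    A crude frame-based approximation of an event camera: static pixels
    produce no output, so only the changing parts of a scene cost bandwidth.
    """
    diff = curr.astype(np.int16) - prev.astype(np.int16)
    rows, cols = np.nonzero(np.abs(diff) > threshold)
    return [(r, c, 1 if diff[r, c] > 0 else -1) for r, c in zip(rows, cols)]

# Toy 4x4 frames: one pixel brightens, one darkens, the rest stay static.
prev = np.full((4, 4), 100, dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 200   # brightening -> ON (+1) event
curr[3, 0] = 20    # darkening  -> OFF (-1) event
print(frame_diff_events(prev, curr))  # [(1, 2, 1), (3, 0, -1)]
```

Only two events come out of sixteen pixels, which is the whole efficiency argument: an unchanged scene generates no data at all.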

A Smart Catalyst for Change

“The introduction of 3D- and AI- (artificial intelligence) based vision systems have really changed the game when it comes to event-based vision systems in the manufacturing space,” said Dan Simmons, senior sales engineer, Datalogic, which has its U.S. headquarters in Eugene, Ore. “Before you had to know how to program a system to do what you wanted it to do. The introduction of AI helps create a vision system that helps you learn what is ‘good’ and what is ‘bad.’”
Simmons explained that AI learns the deviations from good and bad, and from there it makes better determinations without having to wait for a human operator to step in and make changes as it goes. But an AI system is only as good as the images you give it, he added.
On the factory floor, there are three application categories for event-based vision systems. The first lies in optical character recognition. In this scenario, the camera is being used for traceability. The AI can view characters that normally cannot be viewed with a standard vision system that is capable of optical character recognition. This could mean anything from reading characters on a non-flat surface to characters that have a very low print quality.
The second category focuses on error proofing. When it comes to quality, mistakes aren’t tolerated if a company wants to ensure success and produce the products its customers have come to expect—even if we’re just talking about junk food.
“I remember hearing about a use case where a food manufacturer wanted to ensure the right cookie was being placed in the right bag during its trip down the factory line,” Simmons said. “The company taught the vision system to simply recognize that there was writing on the package.”
He explained that it came down to the contrast between black and white, with the system able to decipher black writing on the white bag to let it proceed. If no writing was detected, the process was stopped.
The third application falls within cameras that require calibration, such as vision-guided robotics or applications that require measurement, such as an outer diameter measurement.
For example, machines can be taught to view the gap between spark plugs to ensure width accuracy. Then it becomes a simple pass or fail report if the gap isn’t accurate. The manufacturer can then archive that data to use when teaching a next-generation system what it needs to know for a similar job.
Understanding that manufacturers don’t always have the time or capability to facilitate these teachable moments, Datalogic rolled out its IMPACT Robot Guidance system that helps customers take advantage of smart robots by quickly and easily interfacing between any smart camera or vision processor.
“Our IMPACT software is proven to let users solve not only guidance, but many other machine vision applications with an intuitive drag and drop interface,” said Simmons. “With more than 100 vision tools, our customers won’t have to fret about not finding a guidance system that fits their needs.”
automotive-quality-control.jpg
The APDIS Laser is a fast, fully automated, non-contact inspection replacement to traditional CMMs for automotive quality control on the shop floor. (Provided by Nikon Metrology)

Making an Impact on Metrology

Proper measurement is vital when it comes to quality assurance and part calibration to help mitigate risk and ensure parts are built to proper specifications.
One company thriving in the metrology space is Nikon, a name most would recognize as a producer of high-end cameras, camera lenses, and microscopes. Whereas once you saw cameras hanging around the necks of tourists and amateur photographers, the phones in our pockets have taken over, leaving Nikon with a gap to fill in its offerings.
Nikon brings more than one hundred years of experience in lenses and scopes. The company revolutionized quality control and metrology across a wide range of clients, using innovative techniques such as laser-radar systems to help automotive companies, for instance, measure gaps between door frames and window holes in automobile frames.
“The benefits of an event-based vision system are very similar to what we offer our manufacturing customers,” said Pete Morken, senior application engineer, Nikon Metrology, which has a U.S. office in Brighton, Mich. “Our systems help to measure whether or not a part is good or bad simply by scanning a car body on the assembly line.”
With Nikon’s laser-radar stations, manufacturers can measure the geometry of parts—car doors, whole car chassis, etc.—as alternatives to the slow, lumbering horizontal-arm coordinate measurement machine systems (CMMS).
In a typical CMMS, information is gathered slowly offline by the software and stored in a database where it can be accessed later when decisions about non-conformances need to be made. But where it lacks the ability of a laser-radar system is the speed that a company like Nikon can offer to make sure that information is used more efficiently.
“With our laser-radar system, the measurement that our customers obtain can be collected, analyzed, and reported more quickly, using more data, to see improved process quality,” Morken said. “The use of pre-defined positions eliminates the requirement for further programming after installation, so the measurement program can happen immediately and continuously.”
The camera company is well-positioned to improve measurement possibilities for customers.
“Nikon has always lived at the cutting edge of technology, even as far back as its photography advances that re-shaped how we take pictures,” added Morken. “Bringing in an event-based vision system could do for metrology what the company once did for budding photographers.”
As technology advances, companies are starting to see how combining artificial intelligence with vision systems represents that next iteration of this process, and how it can re-shape how manufacturers are able to view products and parts.
At its core, a vision system enables machines to “see” necessary objects, whether it’s a part in a bin or a package of cookies. In the past, companies would have to teach the machine the parts or products it needed to scan, and the machine was then limited by what it had learned. If there was a flaw in the product, the machine might not know it was an imperfection because it wasn’t taught to recognize it.
As previously noted, artificial intelligence and machine learning can be used to teach manufacturing systems the difference between good and bad parts. As a result, algorithms become less important while the AI does most of the work.
Clean_MV430.jpg
Nikon Metrology’s APDIS Laser Radar.

Seeing is Collecting

Audio-video giant Sony is working to take vision systems to the next level. Similar to Nikon’s transformation, Sony aims to carve out its spot in the event-based vision system industry by creating sensors that act like retinas in the human eye.
The tiny sensors are becoming ever smaller, which allows more of them to be fitted on a device to boost data collection volumes. The use of these sensors goes far beyond the manufacturing floor. As the technology improves, Sony sees deployment within collision avoidance systems, drones, and event-based 3D cameras.
Sony recently introduced what it touts as the world’s first intelligent vision sensors equipped with AI processing functionality. One highlight: the new chip will be able to identify people and objects.
This would allow cameras with the chip to identify stock levels on a store shelf or use heat maps to track and analyze customer behavior. It could even count and forecast the number of customers in a given location, providing valuable data to calculate when foot traffic is highest.
Where the technology stands to shine the most in manufacturing is around data management. Advanced sensors can identify objects and send a description of what they see without having to include an accompanying image that takes up space in the database. This could reduce storage requirements by up to 10,000 times, leaving companies with more space to gather critical data that they previously haven’t been able to access, while giving AI a looser leash to capture relevant information.

Working Together

As technology evolves, partnerships between companies in the event-based vision system space and those that want to deploy across other industries will become commonplace.
Datalogic is joining forces with Paris-based Prophesee, a company that invented advanced neuromorphic vision systems and is working to build the next generation of industrial products.
“We are conducting a very fruitful partnership with Prophesee,” said Michele Benedetti, chief technology officer at Datalogic. “Neuromorphic vision is a fascinating technology inspired by the behavior of the human biological system, exactly like neural networks. We believe that the combination of these technologies will provide innovative solutions to our customers.”
As investment in this space increases, the market is expected to drive growth in other industries at an exponential rate through at least 2030, according to a Grand View Research report on the U.S. machine vision market. The increasing demand for quality inspection, as well as the need for vision-guided robotic systems, is expected to fuel that growth.
While long-term forecasts for emerging technologies are far from an exact science, the future for event-based vision systems looks promising—giving manufacturers cause to be fitted with a pair of 20/20 rose-colored specs.

 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 54 users

Learning

Learning to the Top 🕵️‍♂️
More Free Advertising!


There is a short podcast at the above link also

BrainChip Holdings Ltd. (ASX:BRN) has published a paper titled “Benchmarking AI Inference at the Edge: Measuring Performance and Efficiency for Real-World Deployments,” which “evaluates the current state of edge AI benchmarks and the need to continually improve metrics that measure performance and efficiency of real-world, power-conscious edge AI deployments.” Anil Mankar, the company’s Chief Development Officer, explained:

"While there's been a good start, current methods of benchmarking for edge AI don't accurately account for the factors that affect devices in industries such as automotive, smart homes and Industry 4.0. We believe that as a community, we should evolve benchmarks to continuously incorporate factors such as on-chip, in-memory computation and model sizes to complement the latency and power metrics that are measured today."
--------
A report published by Brand Essence Research finds that the global market for Conversational AI is projected to grow from $8.24 billion USD in 2022 to $32.51 billion by 2028, registering a compound annual growth rate (CAGR) of 21.6 percent in the forecast period. The following excerpt from the report's summary outlines the role of COVID-19 in influencing the market's growth:
--------
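As a quick sanity check on those market figures: the standard CAGR formula reproduces the reported 21.6 percent only over roughly a seven-year compounding window (e.g. a 2021 base year), which is my inference rather than something stated in the excerpt:

```python
# CAGR = (end / start) ** (1 / years) - 1
start, end = 8.24, 32.51            # USD billions, from the report summary above
cagr_7y = (end / start) ** (1 / 7) - 1
print(f"{cagr_7y:.1%}")             # 21.7% -- matches the reported 21.6% within rounding
```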
Learning šŸ–
 
  • Like
  • Fire
  • Love
Reactions: 48 users

31D28A18-6EE1-4A4C-9936-F2CC50952793.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 39 users

Learning

Learning to the Top 🕵️‍♂️
Not directly about BrainChip, but we know BrainChip’s partner Prophesee is working with Sony:


Counterpoint: Sony's smartphone camera sensor business is on the rise thanks to iPhone upgrades
PETER, 17 JANUARY 2023

The smartphone market shrank in 2022, which impacted the suppliers of image sensors. Some fared better than others though – Sony was the only supplier to see its revenue grow on a yearly basis.

And it was mostly thanks to Apple upgrading the cameras on the iPhone 14 series. The two Pro models brought new 48MP sensors in the main cameras and larger 12MP sensors in the ultra wide cameras. The selfie cam on all four models was upgraded with autofocus too. Apple exclusively uses Sony sensors; you can see the breakdown by camera type below:
Screenshot_20230118_061510_Chrome.jpg

The Sony sensors inside the last two generations of iPhones (source: Counterpoint BoM analysis service)
Adding it all together, Sony made an extra $6 per unit for a total of around $300 million in the second half of 2022. The end result is that Sony took in 54% of the total revenue for the year, up 5 percentage points compared to 2021.

Samsung LSI did well for itself, even though its revenue share contracted by 1 percentage point to 29%. The company reaped the benefits of high-resolution, small-pixel-size sensors (sub-0.7µm pixels).

The affordable 50MP sensors proved quite popular and Samsung shipped an estimated 200 million of them in 2022. These are used in the main cameras of lower end phones and in the selfie cameras of more premium devices. The company still dominates the 100+ megapixel sensor market and shipped an estimated 150 million units since it launched the first one.
Screenshot_20230118_061602_Chrome.jpg

Last year the smartphone image sensor market contracted by 6% compared to 2021, but the total revenue remained above $13 billion. Sony and Samsung took in the lion’s share of that, 83% in total.

Source

Learning šŸ–
 
  • Like
  • Fire
  • Love
Reactions: 30 users

Moonshot

Regular
2023 year of the Akida Spikformer?

Conclusion

In this work we explored the feasibility of implementing the self-attention mechanism and Transformer in Spiking Neuron Networks and propose Spikformer based on a new Spiking Self-Attention (SSA). Unlike the vanilla self-attention mechanism in ANNs, SSA is specifically designed for SNNs and spike data. We drop the complex operation of softmax in SSA, and instead perform matrix dot-product directly on spike-form Query, Key, and Value, which is efficient and avoids multiplications. In addition, this simple self-attention mechanism makes Spikformer work surprisingly well on both static and neuromorphic datasets. With directly training from scratch, Spiking Transformer outperforms the state-of-the-art SNNs models. We hope our investigations pave the way for further research on transformer-based SNNs models.

https://arxiv.org/pdf/2209.15425.pdf
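For the curious, a minimal toy sketch of the Spiking Self-Attention idea quoted above: binary (spike-form) Q, K and V and a scaled dot-product with no softmax. This deliberately leaves out the paper's time dimension, learned spiking projections and LIF neuron dynamics, so treat it as an illustration of the core trick, not the actual Spikformer implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def spiking_self_attention(Q, K, V, scale=0.125, threshold=1.0):
    """Toy SSA: Q, K, V are {0,1} spike matrices of shape (tokens, dim).

    Because all inputs are binary, Q @ K.T and the product with V reduce to
    additions in hardware (no multiplications), and since the attention map
    is non-negative by construction, no softmax is needed.
    """
    attn = (Q @ K.T) * scale
    out = attn @ V
    return (out >= threshold).astype(np.int8)  # re-spike via a threshold

# Random binary spikes for 4 tokens with an 8-dim embedding.
Q = (rng.random((4, 8)) < 0.5).astype(np.int8)
K = (rng.random((4, 8)) < 0.5).astype(np.int8)
V = (rng.random((4, 8)) < 0.5).astype(np.int8)

out = spiking_self_attention(Q, K, V)
print(out.shape)                              # (4, 8)
print(set(out.flatten().tolist()) <= {0, 1})  # True: output stays spike-form
```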
 
  • Like
  • Fire
  • Love
Reactions: 20 users

BaconLover

Founding Member
Researchers at Carnegie Mellon University in Pittsburgh, one of the leading institutions for artificial-intelligence research, have embarked on a study using facial-recognition algorithms to track the expressions of traders. Their goal: finding correlations between mood swings and market swings. If the traders look enthusiastic, it might be time to buy. Are there more furrowed brows than usual? Could be time to sell. The provisional US patent application was filed on September 13, 2022"

I like Mellon.


“We have incorporated experimentation with BrainChip’s Akida development boards in our new graduate-level course, ‘Neuromorphic Computer Architecture and Processor Design’ at Carnegie Mellon University during the Spring 2022 semester,” said John Paul Shen, Professor, Electrical and Computer Engineering Department at Carnegie Mellon. “Our students had a great experience in using the Akida development environment and analyzing results from the Akida hardware. We look forward to running and expanding this program in 2023.”

 
  • Like
  • Love
  • Fire
Reactions: 60 users