BRN Discussion Ongoing

Draed

Regular
Everything.... BRN announcement excited..... then "notification regarding unquoted security".... 😫😵‍💫😵
 
  • Haha
Reactions: 2 users

7für7

Top 20
Everything.... BRN announcement excited..... then "notification regarding unquoted security".... 😫😵‍💫😵
Maybe Labsy should be more specific… like “a massive price sensitive announcement regarding huge gains”

@Labsy, try it again 🤷🏻‍♂️
 
  • Haha
  • Sad
  • Like
Reactions: 3 users

IloveLamp

Top 20
Totes agree.......🙃

1000016437.jpg
1000016440.jpg
1000016441.gif
 
Last edited:
  • Like
  • Love
  • Haha
Reactions: 27 users
Bit of a big banana, if he was the main guy in developing TENNs..

From the recent TENNs presentation, the presenter sounded like he was saying there was a "main" person involved who hated the name TENNs.

It sounds a little like Rudy is fishing there, in the last 2 paragraphs?..
 
  • Like
  • Thinking
Reactions: 3 users

IloveLamp

Top 20

1000016445.jpg
 
  • Like
  • Love
  • Fire
Reactions: 18 users

Labsy

Regular
Maaaaassssssive price sensitive announcement pushing sp to $3 +++++ pleeeeeeease..... pleeeeeeease oh infinity and beyond powers of the universe ....... Pleeease.....
Perhaps Zuckerberg, Elon or Bezos advocating the use of neuromorphic in their new tech ...............
Let's goooo!!!!!!! 🚀🚀🚀🚀🚀🚀👌🙏🙏🙏🙏

Edit: This week.......pleeeeeeease .....cmoooooon!!!! Yeaaaaaaah!! Woooo!!!
 
  • Like
  • Haha
  • Fire
Reactions: 34 users

toasty

Regular

View attachment 64961
Not a lot of point if they can't communicate with it..............
 
  • Like
Reactions: 1 users

IloveLamp

Top 20
Not a lot of point if they can't communicate with it..............

Yes because you would know better than them.......
1000012982.gif
 
Last edited:
  • Haha
Reactions: 12 users
Yes because you would know better than them View attachment 64962
Did the Optimus satellite have Beacon on board? That's the big question..

ANT61 are encouraging organisations to use it, so it must be ready for implementation?

20240617_150050.jpg


You would think they would've taken the opportunity to use the Optimus satellite as a first use case, and it's not a good look if they couldn't use it to establish a connection..

Or maybe it's all part of their plan..

"Hey, we just established contact with Optimus, using Beacon. Good thing we had that particular piece of hardware onboard!"..
 
  • Like
  • Fire
  • Thinking
Reactions: 9 users

Kachoo

Regular
Did the Optimus satellite have Beacon on board? That's the big question..

ANT61 are encouraging organisations to use it, so it must be ready for implementation?

View attachment 64963

You would think they would've taken the opportunity to use the Optimus satellite as a first use case, and it's not a good look if they couldn't use it to establish a connection..

Or maybe it's all part of their plan..

"Hey, we just established contact with Optimus, using Beacon. Good thing we had that particular piece of hardware onboard!"..
No, Optimus did not have the Beacon. It would be silly to have the Beacon but not use it. I suspect the design was finalised well before the Beacon came to be, hence no Beacon.
 
  • Like
Reactions: 5 users
No, Optimus did not have the Beacon. It would be silly to have the Beacon but not use it. I suspect the design was finalised well before the Beacon came to be, hence no Beacon.
You know that as fact Kachoo?
 
  • Fire
  • Like
Reactions: 2 users

Tothemoon24

Top 20

IMG_9119.jpeg

Navigating the Challenges and Complexities in Automotive Cameras​

Published on June 11, 2024


Introduction​

The automotive industry is undergoing a technological revolution, with advanced driver assistance systems (ADAS) and self-driving vehicles becoming increasingly prevalent. Central to these innovations are automotive cameras, which serve as the “eyes” of modern vehicles. They play a critical role in enhancing safety, providing real-time data for navigation, and enabling autonomous functionalities.
Many envision self-driving cars as vehicles that effortlessly chauffeur them to their destinations, allowing them to use travel time for work, relaxation, or entertainment, much like riding a bus without any exhaustion. However, despite their advanced capabilities, automotive cameras face numerous challenges that impact their performance and reliability.
To make this vision a reality, the sensor systems in these vehicles must function reliably regardless of the time of day, weather conditions, lighting, and road conditions.
This article examines the challenges posed by the requirements for ADAS and self-driving vehicles and explores how VVDN Technologies addresses these challenges with their expertise in camera technology.

Types of Automotive Cameras​

  • Rear-View Cameras: Installed at the back of the vehicle, these assist drivers in reversing and parking by providing a clear view of the area behind the vehicle.
  • Surround-View Cameras: These systems combine images from multiple cameras placed around the vehicle to create a 360-degree view, enabling park assistance and maneuvering in tight spaces.
  • Forward-Facing Cameras: Positioned at the front, these cameras are crucial for ADAS functionalities like lane-keeping assistance, traffic sign recognition, and collision avoidance.
  • Driver Monitoring Cameras: These cameras monitor the driver’s attention and alertness, detecting signs of drowsiness or distraction to enhance safety.
  • eMirror Cameras: These cameras replace traditional side mirrors with advanced digital displays, offering improved visibility and reduced blind spots, even in challenging weather conditions.
  • Night Vision Cameras: Utilizing infrared technology, these cameras improve visibility in low-light conditions, helping detect pedestrians, animals, and other obstacles not visible with standard headlights.

Challenges in Automotive Cameras​

  • Contamination: Camera lenses can be obstructed by dirt, water, or other contaminants. Solutions include protective covers that open only when necessary or positioning cameras behind windshield wipers.
  • Lane Detection: Recognizing lane markings is challenging due to similar-looking structures, regional differences, varying colors, and weather conditions.
  • Light Assistance: Distinguishing moving vehicles from static objects like streetlights and reflectors is complex, especially when dealing with partially defective lights or motorcycles.
  • High Dynamic Range: Cameras struggle with visibility in extreme lighting conditions, such as direct sunlight, tunnel exits, and oncoming headlights.
  • Flicker: Modern LED light sources flicker at different frequencies, complicating the continuous image analysis required for vehicle cameras.
  • Stray light: High contrasts can cause unwanted reflections and light scattering in the lens or camera housing, creating visibility issues.
  • Environmental Factors: Cameras must operate reliably in all temperatures (-40 to 100°C) and weather conditions (rain, fog, snow). Image noise increases with temperature, affecting performance.
  • Regional differences: Traffic signs and road markings vary by region, requiring cameras to recognize and adapt to these differences.
  • An infinite number of objects: The variety of objects and their changing perspectives pose a significant challenge for accurate classification and detection.

Limitations of Automotive Cameras​

Cameras have inherent limitations in three specific areas:
  • Limited field of view.
  • Struggles with accurate depth perception (>20m).
  • Reduced visibility in fog, rain, and low-light conditions.
To combat these limitations, vehicles with high automation incorporate additional sensors:
  • LiDAR: Offers precise 3D mapping and distance measurement, enhancing object detection and spatial awareness.
  • RADAR: Complements cameras by providing robust detection of objects and accurate distance measurements, especially in adverse weather conditions.
  • Ultrasonic Sensors: Used for close-range detection, particularly useful for parking assistance systems.
  • Thermal Imaging Cameras: Used to detect living and heated objects, enhancing safety by identifying pedestrians and animals in low-visibility conditions.

Solutions to Overcome Challenges​

  • Advanced Algorithms: Implementing sophisticated image processing algorithms, including machine learning and AI, to improve object detection and recognition under various conditions.
  • Sensor Fusion: Integrating data from multiple sensors (cameras, radar, LiDAR) to create a more accurate and comprehensive understanding of the vehicle’s environment (a minimal fusion sketch follows this list).
  • Robust Design: Designing camera systems with protective housings, self-cleaning mechanisms, and better optical design to prevent lens obstruction and enhance image quality.
  • Enhanced Calibration Techniques: Developing automated and dynamic calibration systems that adjust camera alignment in real-time, maintaining accuracy even after impacts or vibrations.
  • Data Integration: Combining data from multiple cameras and sensors to create a comprehensive understanding of the vehicle’s surroundings.
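As a rough illustration of the sensor-fusion point above, here is a minimal sketch that fuses a camera range estimate with a radar range estimate for one tracked object using inverse-variance weighting. The Measurement/fuse names and all of the numbers are hypothetical placeholders, not anything from VVDN or this article; real ADAS stacks use far more elaborate filters (Kalman and beyond).

```python
# Minimal sketch (hypothetical values): inverse-variance fusion of a camera
# range estimate with a radar range estimate for a single tracked object.
# Camera depth error grows with distance; radar error stays roughly constant.

from dataclasses import dataclass


@dataclass
class Measurement:
    range_m: float    # estimated distance to the object, metres
    variance: float   # measurement noise variance, metres^2


def fuse(camera: Measurement, radar: Measurement) -> Measurement:
    """Combine two independent range estimates by inverse-variance weighting."""
    w_cam = 1.0 / camera.variance
    w_rad = 1.0 / radar.variance
    fused_range = (w_cam * camera.range_m + w_rad * radar.range_m) / (w_cam + w_rad)
    fused_var = 1.0 / (w_cam + w_rad)
    return Measurement(fused_range, fused_var)


if __name__ == "__main__":
    # Beyond ~20 m the camera's depth estimate is noisy, so the radar dominates.
    cam = Measurement(range_m=27.4, variance=4.0)   # roughly +/- 2 m at this distance
    rad = Measurement(range_m=25.1, variance=0.25)  # roughly +/- 0.5 m
    out = fuse(cam, rad)
    print(f"fused range: {out.range_m:.2f} m (variance {out.variance:.3f})")
```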
In addition to these solutions, several other parameters need to be optimized to ensure high-quality camera output before installation in vehicles. These include (a minimal measurement sketch for the first two items follows this list):
  • Measurement of OECF (Opto-Electronic Conversion Function)
  • Noise
  • Resolution, including spherical aberrations
  • White Balance
  • Edge Darkening in Intensity and Color
  • Chromatic Aberration
  • Stray Light
  • Color Reproduction
  • Defective Pixels and Inclusions on the Sensor
  • Flicker
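For a rough sense of how the first two items above might be checked on a bench, the sketch below derives an OECF curve (mean output code value per known patch luminance) and per-patch temporal noise from repeated captures of a grey-scale test chart. The patch luminances and frame statistics are made-up placeholders rather than real lab data or the exact ISO-specified procedure.

```python
# Minimal sketch (hypothetical data): estimating an OECF curve and temporal
# noise from repeated captures of a grey-scale test chart.

import numpy as np

# Known patch luminances of the test chart (cd/m^2), dark to bright.
patch_luminance = np.array([1.0, 4.0, 16.0, 64.0, 250.0, 1000.0])

# Mean digital value of each patch in N repeated frames (shape: frames x patches).
# Replace with statistics sampled from real captures.
frames = np.array([
    [12.0, 35.1, 98.4, 240.2, 610.5, 1010.3],
    [12.4, 34.8, 99.1, 239.6, 611.2, 1009.8],
    [11.8, 35.3, 97.9, 241.0, 609.9, 1011.0],
])

# OECF: mean output code value as a function of input luminance.
oecf = frames.mean(axis=0)

# Temporal noise per patch: standard deviation of the code value across frames.
temporal_noise = frames.std(axis=0, ddof=1)

for lum, dv, sigma in zip(patch_luminance, oecf, temporal_noise):
    print(f"{lum:7.1f} cd/m^2 -> code {dv:7.1f}  (temporal noise {sigma:.2f})")
```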

VVDN Expertise in Camera Technology​

VVDN Technologies, with over a decade of experience in designing and developing cameras for various industries, including automotive, offers comprehensive in-house capabilities covering software design, hardware design, mechanical design, testing, validation, and manufacturing. Equipped with a world-class ISP tuning lab, VVDN performs rigorous objective and subjective testing to guarantee top-quality performance. Specializing in designing and integrating AI/ML models, VVDN has expertise in sensor fusion algorithms, combining data from LiDAR, RADAR, thermal, IR, UWB, and ultrasonic sensors.
By combining deep industry experience with comprehensive in-house capabilities and cutting-edge technology, VVDN Technologies delivers advanced automotive camera solutions that meet the highest standards of performance and reliability.
To learn more about our offerings and discuss how we can collaborate to meet your camera requirements, please contact us at info@vvdntech.com
 
  • Like
  • Fire
  • Love
Reactions: 14 users

rgupta

Regular
Did the Optimus satellite have Beacon on board? That's the big question..

ANT61 are encouraging organisations to use it, so it must be ready for implementation?

View attachment 64963

You would think they would've taken the opportunity to use the Optimus satellite as a first use case, and it's not a good look if they couldn't use it to establish a connection..

Or maybe it's all part of their plan..

"Hey, we just established contact with Optimus, using Beacon. Good thing we had that particular piece of hardware onboard!"..
No, not there. ANT61 confirmed that.
 
  • Like
Reactions: 5 users
Maaaaassssssive price sensitive announcement pushing sp to $3 +++++ pleeeeeeease..... pleeeeeeease oh infinity and beyond powers of the universe ....... Pleeease.....
Perhaps Zuckerberg, Elon or Bezos advocating the use of neuromorphic in their new tech ...............
Let's goooo!!!!!!! 🚀🚀🚀🚀🚀🚀👌🙏🙏🙏🙏

Edit: This week.......pleeeeeeease .....cmoooooon!!!! Yeaaaaaaah!! Woooo!!!
Dreamer
 

Kachoo

Regular
You know that as fact Kachoo?
It was confirmed to me that they did not have the Beacon installed. It would have been nice to trial.
 
  • Like
  • Fire
Reactions: 5 users

Oops if already posted ……

IMG_1973.jpeg
 
  • Like
  • Fire
Reactions: 8 users

Taproot

Regular
You know that as fact Kachoo?

Read the comments section of today's ANT61 post.
The Beacon wasn't ready for Optimus.
Unfortunately it's a little ironic that the company that sent our Brain into space is the same company that has just successfully tested and proven fail-proof satellite communication technology.
But don't stress, she'll be right. We just need to wait patiently for Labsy's blockbuster announcement, which should come any day now.
 
  • Like
  • Haha
  • Fire
Reactions: 7 users
Looks like plenty of funding going on for Australian space companies:


"Adelaide-based satellite tech company Myriota has confirmed a $1.5m investment from the Australian Space Agency will be used develop an off-world communications system.

The grant is part of a funding round under the Australian Space Agency’s Moon to Mars Initiative Demonstrator Mission program, first published in March."

"This project is another example of how Australian innovation can contribute to global space missions, while ultimately enhancing the critical technologies that can improve lives here on Earth,” says Enrico Palermo, head of the Australian Space Agency."

"the University of Western Australia ($4.4m) also received funding for a range of initiatives as part of the Moon to Mars program."
 
  • Like
  • Fire
  • Thinking
Reactions: 10 users

The Pope

Regular
Use the report button and stop arguing with the peanuts, people, ffs. Wasn't that the whole point of tsex?

Otherwise you might as well go and indulge them on the crapper
Yes because you would know better than them.......
View attachment 64962
 
  • Like
  • Love
  • Fire
Reactions: 9 users

Diogenese

Top 20
Regarding time-frames to market, I grabbed this from a post almost two years ago. In this video of Arm's Rene Haas being interviewed by Bloomberg business channel's Emily Chang, Mr Haas says that product timelines take 3-4 years. See specifically the 3 minute 40 second mark of the following video. I think Mr. Haas knows what he is talking about.

Although he states "3 to 4 years", I personally would think 4 years is likely more pertinent to BrainChip's situation, so you are in the ballpark, The Pope.

That would make Renesas or MegaChips, for example, producing something with Akida technology in it probably around 2026, I speculate. I hope I am wrong and that the revenue numbers Sean Hehir wants us all to watch start to increase exponentially before then.
As far as anything from Intel, and when.......who knows?


Thanks, ... dippY
Hi dippy,

The time-to-market will vary with the application.

For example, SynSense is already being used in toys, but Mercedes has indicated that AI in silicon is still some way off.

Even with automotive, I would think that something like the primary ADAS functions will require longer qualification times than something like "Hey Mercedes!". I would think that driver alertness would be considered more mission-critical as a legislated safety requirement in Europe. That said, Akida 2 has been available in software for a couple of years, and testing software may not be as difficult as testing hardware.

The good news is that we have been working with Valeo for 5 years, but the first 3 would have been with Akida 1. So the clock has been reset to 2022 for Akida 2, but I'm hoping there will be a degree of carry-over from Akida 1.

Similarly, we have been working with MB for several years, and Akida 2 simulation software has been available for 2 years.

It is necessary to develop model databases to run NNs, and, as is becoming clearer, we seem to be working closely with Edge Impulse on this, using ML to automate the adaptation of user data to NN models on-chip, bypassing the conventional CNN route of reformatting the whole model in the cloud.

I suspect that Mercedes with its software-defined vehicle is further advanced with the use of Akida simulation software in "Hey Mercedes!" and other in-cabin applications, and, as Magnus Ostberg advised, I'm "staying tuned!" I think the lure of TENNs will prove persuasive.

Indeed, we also know that Valeo's SCALA 3 lidar comes with image processing software.

There are, of course, many applications which are not safety-critical which may adopt Akida silicon earlier than Automotive.
 
  • Like
  • Fire
  • Love
Reactions: 45 users