BRN Discussion Ongoing

Diogenese

Top 20
I agree completely, and may I just add, I think most here quietly agree with you; I know I do.
Trump is the dickhead the world never knew it needed.
I'd move the "never" along a couple of slots.
 
  • Haha
  • Like
  • Fire
Reactions: 24 users

Rangersman

Member
  • Like
Reactions: 5 users

7für7

Top 20
In Germany -3% 😂😂
 
  • Like
Reactions: 1 user

equanimous

Norse clairvoyant shapeshifter goddess

View attachment 81975
Here are some innovative deployment ideas where this combination could make a significant impact:
  1. Wildfire Monitoring and Response
    Deploy drones equipped with this technology to monitor vast forested areas for early signs of wildfires. The event-based vision system could detect smoke or sudden temperature changes in real-time, while the low-power neuromorphic processing enables extended flight times. Emergency teams could receive precise coordinates of fire outbreaks, allowing faster containment efforts, especially in remote or rugged terrain where traditional surveillance struggles.
  2. Urban Search and Rescue (USAR)
    In disaster scenarios like earthquakes or building collapses, these drones could navigate debris-filled environments to locate survivors. The high-speed detection and tracking capabilities would identify subtle movements—such as a trapped person waving or shifting rubble—under challenging lighting conditions (e.g., dust, darkness). The lightweight, power-efficient design ensures prolonged operation without frequent recharging, critical in time-sensitive missions.
  3. Avalanche and Mountain Rescue
    Equip drones to patrol snowy mountain regions, detecting skiers or climbers caught in avalanches. The event-based sensors could spot rapid changes in the snow surface or human motion against a static background, even in blizzard conditions where conventional cameras fail due to blur or low visibility. This would enhance the speed and accuracy of rescue operations in harsh alpine environments.
  4. Flood Detection and Victim Assistance
    Use the technology to monitor flood-prone areas, identifying rising water levels or stranded individuals in real-time. The drones could track moving objects—like people or vehicles—across large, dynamic flood zones, providing emergency services with actionable data to prioritize rescue efforts. The low-power advantage allows continuous monitoring during prolonged flood events.
  5. Traffic Accident Response
    Deploy drones over highways or urban roads to detect accidents as they occur. The system’s ability to process high-speed events could identify sudden vehicle stops, collisions, or pedestrians in distress, relaying alerts to paramedics and police instantly. This rapid response capability could reduce secondary accidents and improve survival rates in critical situations.
  6. Hazardous Material Spill Detection
    In industrial or chemical emergencies, drones could scan for spills or leaks by detecting unusual motion (e.g., liquid spreading) or changes in infrared signatures. The neuromorphic system’s efficiency ensures safe, prolonged operation near hazardous sites, providing real-time updates to containment teams without risking human exposure.
  7. Crowd Safety at Large Events
    Monitor festivals, protests, or sports events for emergencies like stampedes or medical incidents. The event-based approach excels at tracking fast-moving crowds and spotting anomalies—like a person collapsing—across wide areas. Emergency responders could be directed to exact locations quickly, improving crowd management and safety.
  8. Border and Perimeter Security
    Enhance surveillance along borders or sensitive installations by detecting unauthorized movements in real-time. The technology’s low power consumption supports long-duration patrols, while its ability to function in low-light or adverse weather conditions ensures reliable detection of intruders or suspicious activity, aiding rapid response by security forces.
  9. Medical Delivery in Crisis Zones
    Equip drones to deliver critical supplies (e.g., defibrillators, medicine) to remote or disaster-stricken areas. The event-based sensing could guide navigation through chaotic environments—avoiding obstacles and identifying landing zones—while the neuromorphic processor optimizes battery life, ensuring more deliveries per charge.
  10. Wildlife Rescue and Poaching Prevention
    Use drones to monitor endangered species or detect poaching activities in vast nature reserves. The system could track animal movements or spot human intruders at night, providing rangers with real-time alerts. Its energy efficiency makes it ideal for extended missions in off-grid wilderness areas.

Grok.
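For anyone curious what the event-based detection running through these use cases could look like in code, here is a minimal sketch. It is illustrative only, not BrainChip's, Prophesee's or any vendor's actual API; the (x, y, timestamp, polarity) event format, the tile size and the thresholds are all assumptions made for the example.

```python
import numpy as np

# Minimal, illustrative sketch only: assumes events arrive as (x, y, timestamp_us, polarity)
# tuples from an event-based sensor. Not BrainChip's, Prophesee's, or any vendor's API.

WIDTH, HEIGHT = 1280, 720   # assumed sensor resolution
TILE = 16                   # coarse grid cell size in pixels
WINDOW_US = 10_000          # 10 ms accumulation window
ACTIVITY_THRESHOLD = 50     # events per tile before a tile is flagged as "motion"


def detect_activity(events, window_start_us):
    """Count events per coarse tile within one time window and return the pixel
    centres of tiles whose event count exceeds the threshold (candidate motion)."""
    grid = np.zeros((HEIGHT // TILE, WIDTH // TILE), dtype=np.int32)
    for x, y, t_us, _polarity in events:
        if window_start_us <= t_us < window_start_us + WINDOW_US:
            grid[y // TILE, x // TILE] += 1
    hot = np.argwhere(grid > ACTIVITY_THRESHOLD)
    return [(c * TILE + TILE // 2, r * TILE + TILE // 2) for r, c in hot]


# Usage with synthetic events: a burst of activity clustered around pixel (400, 300)
rng = np.random.default_rng(0)
events = [(400 + int(rng.integers(-8, 8)), 300 + int(rng.integers(-8, 8)), int(t), 1)
          for t in rng.integers(0, WINDOW_US, size=500)]
print(detect_activity(events, window_start_us=0))   # pixel centres of active tiles
```

In a real pipeline the flagged tiles would be handed to a tracker or classifier (the kind of workload a neuromorphic processor is pitched at), but the windowed event counting above is the basic idea behind spotting motion against a static background.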
 
  • Like
  • Fire
  • Love
Reactions: 27 users

Diogenese

Top 20
Meta (Facepalm)

Meta is being excoriated before the US Senate by a whistleblower:



So I thought I'd check on their AI tech:

Llama 4

Introduction
The Llama 4 Models are a collection of pretrained and instruction-tuned mixture-of-experts LLMs offered in two sizes: Llama 4 Scout & Llama 4 Maverick. These models are optimized for multimodal understanding, multilingual tasks, coding, tool-calling, and powering agentic systems. The models have a knowledge cutoff of August 2024.

"a knowledge cutoff of August 2024" - ie, no ML.
Not well said but said well lol.
Mutatis mutandis
 
  • Like
  • Wow
  • Fire
Reactions: 6 users

Frangipani

Top 20
Miguel Lopez, Head of Robotics at ARQUIMEA Research Center (ARC), posted this earlier today:



D8953B6C-E99D-4CAB-A32F-59ED1DA5A264.jpeg


62AE2D28-AFC5-41BE-8CA6-F3C6B3F01FCE.jpeg

87C03D35-F732-4303-BBB7-937D35E5B617.jpeg

99C0F62C-8F7D-435F-BEB6-B5A7839BFFDF.jpeg



His own name and the names of the researchers in his ARC robotics perception team, which he congratulates on their “fantastic work” in the above post, match with more than half of the co-authors of the research paper “Real-Time Beach Monitoring: Addressing Beach Safety with UAVs and Computer Vision” that I shared last month.
https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-454042

While the paper is not open access and I thus still cannot verify whether or not it already mentions the event-based Prophesee & Akida drone solution, it is at least quite likely that the other co-authors would already have been aware of Akida at the time of publication (which appears to be 1 January 2024; the paper was subsequently presented at the 2024 7th Iberian Robotics Conference at Universidad Politécnica de Madrid in early November 2024, cf. page 16 of the PDF programme accessible on https://eventos.upm.es/109808/detail/robot-2024-.html), given Miguel Lopez wrote that “This #research project has been in the making for quite some time now…” - and even if not, well, they are definitely now. 😊



696819D2-A687-4F1E-86B4-71CE4135DA18.jpeg



All co-authors not underlined are affiliated with the research group “Computer Vision and Aerial Robotics” (CVAR) within the Centre for Automation and Robotics (CAR) at Universidad Politécnica de Madrid (UPM), headed by Pascual Campoy, who has been a full professor at UPM for more than 42 years and has been involved in more than 40 R&D projects, including over 25 technical transfer projects directly contracted with the industry.

9E647F53-C352-4C44-A907-E603534DAF65.jpeg

1C77D690-4D57-45D7-8092-2790D602BA88.jpeg



The paper’s first author is Rodrigo Da Silva Gómez, a UPM student:

11B6F8E4-4640-4B99-B7E4-205DA84FD596.jpeg



The paper’s remaining three co-authors are two current PhD students and a former PhD student at CVAR-UPM who is now an AI consultant and freelancer:

A3FF964D-CD28-4323-8BCF-D3DEAC84330F.jpeg

1FC767C6-0662-4ED7-A0C3-4BDE09F95652.jpeg


The way I see it, that bodes well for the CVAR researchers at UPM (which ranks as Spain’s top technical university, by the way) potentially using our technology for current or future technical transfer projects with industry partners.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 38 users

JoMo68

Regular
Enough is Enough. I want everyone to stop turning this forum into a place to voice their opinions on Trump. We all have different political opinions but this is not the place for such a debate. Keep your opinions to yourself.
Trump is the leader that citizens from every country wish they had. A no nonsense president. A president who is an expert in the art of negotiation. Basically a legend.
Enough said. Let’s keep this thread free of political bias and opinions.
Twitch Quote GIF by Hyper RPG
 
  • Haha
  • Love
  • Like
Reactions: 11 users

7für7

Top 20
Here for my political friends! Good night

 
  • Haha
  • Fire
  • Like
Reactions: 9 users

FJ-215

Regular
Looks like most of yesterday's gains will be lost today.
 
  • Like
  • Sad
Reactions: 3 users

equanimous

Norse clairvoyant shapeshifter goddess
Just wanted to share a quick recommendation in this BRN Discussion—if you're in need of window glass repair in Holsworthy, there’s a local team that really knows their stuff. Fast, friendly, and reliable service every time I’ve used them. Definitely worth checking out—visit website for more details.
Most likely a scam, don't click the link.
 
  • Like
Reactions: 5 users

equanimous

Norse clairvoyant shapeshifter goddess

IDS Launches New Event Based Cameras

Cameras designed for machine vision applications performed at very high speeds, including vibration monitoring, bin picking, object tracking.
April 10, 2025
2 min read
Photo/IDS
IDS Imaging Development Systems GmbH has launched a new camera in its uEye camera line, the uEye EVS. The camera has a USB3 interface that achieves 5 Gbps speeds. It is equipped with a Prophesee-Sony IMX636 CMOS sensor, which is designed to capture only relevant motion in an image. The sensor has 0.92 MPixel, 1280 x 720 resolution, is light sensitive up to 40 mlux with 120 dB HDR, performs scene-controlled events with 1 μs time resolution, and utilizes less data—between 10x and 1000x less—than image-based sensors.
The uEye EVS cameras are especially designed for machine vision applications requiring very high-speed image processing, such as optical monitoring of vibrations, high-speed motion analyses, bin picking and depalletization, and object tracking.
To Learn More:
Contact: IDS Imaging Development Systems GmbH
Headquarters:
Obersulm, Germany
Product: uEye EVS camera
Key Features: Prophesee-Sony IMX636 CMOS sensor, 0.92 MPixel, 1280 x 720 resolution, light sensitive up to 40 mlux
What IDS says: View more information on uEye EVS camera series
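For readers wondering how such an event stream differs from ordinary frames in practice, here is a rough sketch. It is my own illustration, not the IDS or Prophesee SDK; the (x, y, polarity, timestamp) event format and the function below are assumptions made for the example.

```python
import numpy as np

# Rough illustration only: assumes a generic (x, y, polarity, timestamp_us) event format,
# not the actual IDS/Prophesee SDKs, whose APIs differ.

WIDTH, HEIGHT = 1280, 720   # IMX636 resolution quoted in the article


def events_to_frame(events, t_start_us, t_end_us):
    """Accumulate ON/OFF events within [t_start_us, t_end_us) into a signed 2D frame
    that a conventional frame-based pipeline can consume."""
    frame = np.zeros((HEIGHT, WIDTH), dtype=np.int16)
    for x, y, polarity, t_us in events:
        if t_start_us <= t_us < t_end_us:
            frame[y, x] += 1 if polarity else -1
    return frame


# Usage: render a 1 ms slice of events as an 8-bit image (mid-grey = no events)
events = [(640, 360, 1, 120), (641, 360, 0, 450), (10, 10, 1, 999)]
frame = events_to_frame(events, 0, 1_000)
image = np.clip(frame.astype(np.int32) * 64 + 128, 0, 255).astype(np.uint8)
print(image[360, 640], image[360, 641], image[10, 10])   # 192 64 192
```

The point of the exercise is that only the three changed pixels carry data at all, which is where the claimed 10x to 1000x data reduction over frame-based sensors comes from.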
 
  • Like
  • Love
  • Thinking
Reactions: 9 users

Papacass

Regular
Miguel Lopez, Head of Robotics at ARQUIMEA Research Center (ARC), posted this earlier today:



His own name and the names of the researchers in his ARC robotics perception team, which he congratulates on their “fantastic work” in the above post, match with more than half of the co-authors of the research paper “Real-Time Beach Monitoring: Addressing Beach Safety with UAVs and Computer Vision” that I shared last month.
https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-454042

Do you reckon this is why Brainchip employed the lifeguard from Huntington Beach to the sales team?
 
  • Like
  • Haha
Reactions: 8 users

Mccabe84

Regular
My memory could be wrong, but didn't Tony Lewis or someone say that there was big/exciting news coming for BRN this month?
 
  • Like
  • Thinking
  • Fire
Reactions: 15 users

Iseki

Regular
Do you reckon this is why Brainchip employed the lifeguard from Huntington Beach to the sales team?
Maybe. Do you reckon the drone can detect rips and do something when it identifies a swimmer in trouble, like deploy a flotation device? That would be spectacular.
 
  • Like
Reactions: 4 users

Esq.111

Fascinatingly Intuitive.
Morning Chippers,


& This article was interesting also, though unrelated.


Regards,
Esq.
 
  • Like
  • Fire
  • Love
Reactions: 15 users

TheDrooben

Pretty Pretty Pretty Pretty Good
Morning Chippers,


& This article was interesting also, though unrelated.


Regards,
Esq.
Cheers, @Esq.111. Here's the link to the Wevolver article.........

 
  • Like
  • Fire
  • Love
Reactions: 9 users
  • Like
  • Love
  • Fire
Reactions: 30 users

7für7

Top 20
  • Haha
  • Like
Reactions: 3 users