Prophesee, Partners Deliver Field-programmable Gate Array Platform

Neuromorphic vision systems developer Prophesee and partners Digital Media Professionals (DMP) and Restar Electronics Corp. (REC) are teaming up to develop an edge AI field-programmable gate array (FPGA) machine vision platform based on Prophesee’s Metavision event-based vision technology. The collaboration will also include integration and data collection services.

Traditional frame-based machine learning techniques require large data sets and substantial computing power, and their accuracy suffers under varied lighting conditions and backgrounds. Prophesee’s event-based vision solutions address these issues through sparse, continuous event input, improving how machine learning can be applied to computer vision.

The system will include an event camera built around an event-based vision sensor such as Sony’s IMX636ES, which was developed through a separate collaboration. Prophesee recently launched an ultralight, compact HD evaluation kit for the Sony IMX636ES stacked event-based vision sensor.
The DMP ZIA C3 and Prophesee's event-based sensor camera. Courtesy of Prophesee.
Event-based vision sensors feature pixels that are each powered by their own embedded intelligent processing, allowing each pixel to activate independently when it detects a relative change in illumination intensity. Because event-based vision relies on independent receptors to collect only essential information, it functions much like the human eye and brain working together, overcoming inherent limitations of conventional machine vision.
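Conceptually, each pixel compares the current log intensity against the level stored at its last event and fires only when the difference crosses a contrast threshold. The short Python/NumPy sketch below simulates that behavior on ordinary frames to show why a static background generates no data; it is an illustration only, not Prophesee's or Sony's actual pixel circuit or API, and the function name and the 0.15 threshold are assumptions made for the example.

import numpy as np

def events_from_frames(prev_log, frame, threshold=0.15):
    """Compare a new frame against each pixel's last-event level and return
    sparse (y, x, polarity) events plus the updated reference levels."""
    log_frame = np.log(frame.astype(np.float32) + 1e-3)
    diff = log_frame - prev_log

    on = diff >= threshold      # brightness increased enough -> ON event
    off = diff <= -threshold    # brightness decreased enough -> OFF event

    ys, xs = np.nonzero(on | off)
    polarity = np.where(on[ys, xs], 1, -1)

    # Only pixels that fired update their stored reference level.
    new_ref = prev_log.copy()
    new_ref[ys, xs] = log_frame[ys, xs]
    events = [(int(y), int(x), int(p)) for y, x, p in zip(ys, xs, polarity)]
    return events, new_ref

# A mostly static scene with one brightening pixel yields a single ON event.
frame0 = np.full((4, 4), 64, dtype=np.uint8)
frame1 = frame0.copy()
frame1[1, 2] = 200

ref = np.log(frame0.astype(np.float32) + 1e-3)
events, ref = events_from_frames(ref, frame1)
print(events)   # [(1, 2, 1)]

In a real event-based sensor this comparison happens asynchronously in circuitry at each pixel rather than between full frames, which is what keeps both latency and data rates low.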

The main target use cases for the jointly developed FPGA platform are smart city applications such as traffic monitoring, foot traffic analysis, customer flow inside buildings, subway station safety, and smart buildings.

Prophesee has also launched a third collaboration, with iCatch, a provider of security surveillance solutions, on event-based Metavision sensing projects that integrate iCatch’s V57 AI vision processor with the IMX636 stacked event-based vision sensor.

Published: April 2022