Sony Semiconductor Solutions has announced the upcoming release of two types of stacked event-based vision sensors. The industrial sensors, with a 4.86μm pixel pitch, are the result of work with Prophesee presented last year at the International Solid-State Circuits Conference.
The sensors have a dynamic range of 86dB, and resolutions of 1,280 x 720 pixels (IMX636) and 640 x 512 pixels (IMX637).
Event-based vision sensors asynchronously detect luminance changes in each pixel and output only the changed data, combined with the pixel position (x-y coordinates) and a timestamp, enabling high-speed, low-latency output.
This differs from standard frame-based imaging, where the entire image is output at fixed intervals determined by the frame rate.
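The per-pixel change detection described above can be sketched in a few lines. This is an illustrative simulation, not Sony or Prophesee code; the contrast threshold, event fields, and microsecond timestamps are assumptions chosen to show the idea of emitting events only for changed pixels.

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int          # pixel column
    y: int          # pixel row
    polarity: int   # +1 = brighter, -1 = darker
    t_us: int       # timestamp in microseconds (assumed unit)

THRESHOLD = 10  # hypothetical contrast threshold (sensor units)

def detect_events(prev_frame, curr_frame, t_us):
    """Compare two luminance frames and emit one event per changed pixel."""
    events = []
    for y, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for x, (p, c) in enumerate(zip(prev_row, curr_row)):
            diff = c - p
            if abs(diff) >= THRESHOLD:
                events.append(Event(x, y, 1 if diff > 0 else -1, t_us))
    return events

prev = [[100, 100], [100, 100]]
curr = [[100, 130], [ 80, 100]]
evts = detect_events(prev, curr, t_us=1_000)
# Only the two changed pixels produce events; unchanged pixels output nothing.
```

Unlike a frame-based readout, the output size here scales with scene activity rather than with resolution and frame rate.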
The sensors are based on Sony’s stacked structure, with Cu-Cu connections providing electrical conduction between the pixel chip and the logic chip. The logic chip carries a signal-processing circuit that detects luminance changes for each pixel.
The sensors are equipped with event filtering functions developed by Prophesee for eliminating unnecessary event data. Using these filters helps eliminate events that aren't needed for the recognition task, such as the LED flickering that can occur at certain frequencies (anti-flicker), as well as events that are highly unlikely to be the outline of a moving subject (event filter).
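One way such an anti-flicker filter could work is to discard events at a pixel whose inter-event interval matches a known flicker period. This is a hedged sketch of that idea, not Prophesee's implementation; the 100 Hz flicker period and tolerance are invented for illustration.

```python
# Assumed parameters: ~100 Hz LED flicker produces events roughly every
# 10,000 microseconds at an affected pixel.
FLICKER_PERIOD_US = 10_000
TOLERANCE_US = 500

def anti_flicker(events):
    """events: list of (x, y, t_us) tuples; drop periodic, flicker-like events."""
    last_t = {}   # (x, y) -> timestamp of the previous event at that pixel
    kept = []
    for x, y, t in events:
        prev = last_t.get((x, y))
        last_t[(x, y)] = t
        if prev is not None and abs((t - prev) - FLICKER_PERIOD_US) <= TOLERANCE_US:
            continue  # interval matches the flicker period: discard as flicker
        kept.append((x, y, t))
    return kept

stream = [(0, 0, 0), (0, 0, 10_000), (0, 0, 20_100), (5, 5, 3_000)]
filtered = anti_flicker(stream)
# The periodic events at pixel (0, 0) are dropped; the one-off event survives.
```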
The filters also make it possible to adjust the volume of data when necessary to ensure it falls below the event rate that can be processed in downstream systems (event rate control).
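Event rate control of this kind might be approximated by uniformly subsampling the stream once it exceeds a downstream budget. The sketch below assumes a simple per-window event cap and even drop distribution; the real filter's policy is not described in the source.

```python
def rate_control(events, max_events_per_window):
    """Keep at most max_events_per_window events, dropping uniformly."""
    n = len(events)
    if n <= max_events_per_window:
        return list(events)
    # Keep every (n / budget)-th event so output stays within the budget
    # while sampling evenly across the window.
    step = n / max_events_per_window
    return [events[int(i * step)] for i in range(max_events_per_window)]

burst = list(range(10_000))          # stand-in for a burst of 10,000 events
kept = rate_control(burst, 2_500)    # downstream can process 2,500 per window
# len(kept) == 2500, sampled evenly across the burst.
```

Uniform subsampling preserves the temporal spread of the burst, which matters more to downstream recognition than simply truncating the tail of the window.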
Prophesee’s Metavision Intelligence Suite is available for application development and provides solutions for various use cases. Prophesee has also released an evaluation kit for developing applications with the sensors.