Sony Unveiled IMX636 and IMX637: Smallest Two Types of Stacked Event-based Vision Sensors

Sony Semiconductor today released two stacked event-based vision sensors, the IMX636 and IMX637. Designed for industrial equipment, the sensors capture only object-specific changes in a scene and achieve the industry's smallest pixel size among comparable products, at 4.86 μm.

Left: IMX636 | Right: IMX637

Sony says the sensors mimic the working mechanism of the human eye to enable a new kind of machine vision: high-speed, high-precision detection of specific changes in moving objects across different environments and conditions. Applications include predictive maintenance of industrial equipment, motion capture and analysis, video capture, and smart cities.

According to the official description, an event-based vision sensor asynchronously detects the luminance change at each pixel (rather than sampling all pixels simultaneously) and outputs only the changed data, combined with the pixel position (XY coordinates) and a timestamp, achieving high-speed, low-latency data output.
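The output described above can be pictured as a stream of per-pixel change records rather than full frames. The sketch below is a hypothetical illustration of that data model; the field names and `Event` class are assumptions for the example, not Sony's or Prophesee's actual API.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One asynchronous change report from a single pixel (illustrative)."""
    x: int         # pixel column (X coordinate)
    y: int         # pixel row (Y coordinate)
    polarity: int  # +1 = luminance increase, -1 = luminance decrease
    t_us: int      # timestamp in microseconds

def changed_pixels(events):
    """Return the set of pixel coordinates that reported any change."""
    return {(e.x, e.y) for e in events}

# Only pixels that changed appear in the stream; static background pixels
# produce no data at all.
stream = [Event(10, 20, 1, 1000), Event(10, 20, -1, 1500), Event(3, 7, 1, 1600)]
print(changed_pixels(stream))
```

Because each event carries its own timestamp, temporal resolution is set by the event clock rather than by a frame rate.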

Sony Event-based Vision Sensors

The two new sensors use a stacked configuration with Sony's proprietary Cu-Cu connection technology to achieve the industry's smallest pixel size of 4.86 μm. Besides low power consumption and high-speed, low-latency, high-temporal-resolution data output, they combine a small form factor with high resolution. Together, these advantages enable immediate detection of moving objects across different environments and situations.

In these sensors, the pixel section and the underlying logic circuit section are electrically connected through copper pads. Compared with earlier through-silicon via (TSV) wiring, this approach offers greater design freedom, improves production efficiency, helps reduce size, and improves performance. The sensors were developed in collaboration with Prophesee, combining Sony's stacking technology with Prophesee's event-based vision sensing technology, which enables high-speed, high-accuracy data acquisition and helps improve productivity in industrial equipment. Below are some examples:

1. Vibration detection: frame-based image
2. Vibration detection: event-based sensing
3. Spark detection during metal cutting: frame-based image
4. Spark detection during metal cutting: event-based sensing

These sensors incorporate event filters developed by Prophesee that eliminate unnecessary event data, making the sensors suitable for a variety of applications. The filters remove events that the current recognition task should not capture, such as LED flicker occurring at certain frequencies (anti-flicker) and events that are highly unlikely to be the contours of moving objects.
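To make the anti-flicker idea concrete, here is a deliberately simplified sketch: events from a pixel whose changes repeat at a known flicker period (e.g. an LED driven at 100 Hz, i.e. 10,000 μs between changes) are discarded, while aperiodic events pass through. The period, tolerance, and tuple layout are assumptions for illustration; this is not Prophesee's actual filter algorithm.

```python
FLICKER_PERIOD_US = 10_000  # assumed 100 Hz LED flicker
TOLERANCE_US = 500          # assumed timing tolerance

def anti_flicker(events):
    """Drop events whose inter-event interval matches the flicker period.

    Each event is an (x, y, polarity, t_us) tuple, timestamps ascending.
    """
    last_seen = {}  # (x, y) -> timestamp of that pixel's previous event
    kept = []
    for x, y, polarity, t_us in events:
        prev = last_seen.get((x, y))
        last_seen[(x, y)] = t_us
        if prev is not None and abs((t_us - prev) - FLICKER_PERIOD_US) <= TOLERANCE_US:
            continue  # periodic change at the flicker rate: discard
        kept.append((x, y, polarity, t_us))
    return kept

# Pixel (5, 5) flickers every 10 ms and is suppressed after its first event;
# the aperiodic event at pixel (8, 8) is kept.
events = [(5, 5, 1, 0), (5, 5, -1, 10_000), (5, 5, 1, 20_000), (8, 8, 1, 3_000)]
print(anti_flicker(events))
```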

Below are images with event data accumulated for the equivalent of a single frame at 30 fps (approx. 33 ms). Picture B shows a data volume reduction of approximately 92% compared with Picture A.

Picture A: Event filter off
Picture B: Event filter on
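The quoted ~92% figure is a relative reduction in event data volume. The arithmetic can be illustrated as follows; the event counts here are invented for the example, not Sony's measured numbers.

```python
# Hypothetical event counts accumulated over one 30 fps frame (~33 ms).
events_filter_off = 1_000_000  # Picture A: event filter off (made-up count)
events_filter_on = 80_000      # Picture B: event filter on (made-up count)

# Relative data volume reduction: 1 - (filtered / unfiltered).
reduction = 1 - events_filter_on / events_filter_off
print(f"{reduction:.0%} data volume reduction")
```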
