Abstract

Image sensors for mobile devices have recently shrunk in pixel size and increased in resolution, while demand has grown for low-light photography and high-frame-rate video recording at low power. Because smaller pixels have reduced sensitivity, the exposure time must be increased; however, when capturing fast-moving objects with longer exposure times, motion blur is unavoidable. To solve this problem, bracketing and burst imaging, which combine multiple RGB frames into a single frame, have recently been reported [1]. To capture high-speed objects with the same exposure time, these techniques must increase the number of frames and the output frame rate, which significantly increases power consumption. In contrast, event-based vision sensors (EVSs) have been proposed as a more efficient means of capturing motion information than existing frame-based RGB sensors [2], [3]. Effective image enhancement, such as deblurring and video frame interpolation, using a deep neural network (DNN) with combined RGB and event data has been reported [4], [5]. To satisfy the required image quality, these image-enhancement techniques need RGB characteristics equivalent to those of advanced mobile RGB sensors and focal alignment between the RGB and event pixels on a single sensor. To address these issues, this paper proposes a hybrid-type sensor that embeds high-frame-rate event pixels alongside advanced mobile 1.22 µm RGB pixels in an existing mobile image sensor architecture. To allow image enhancement of the hybrid data in the mobile application processor, the data must be efficiently packaged into frames and the RGB and EVS data must be accurately synchronized; therefore, the EVS part of the sensor uses scan-type readout with global detection. A scan-readout EVS suffers from a frame rate that decreases as the pixel count grows, owing to the limited access speed of the event rows and the limited bandwidth of the sensor output interface. Therefore, the sensor adopts a variable frame rate, which adapts the frame rate to the number of detected events, together with an event drop filter and event compression that reduce and control the amount of event data.
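To make the readout concept concrete, the sketch below models a variable-frame-rate event readout with a drop filter in Python. It is only an illustrative sketch of the idea summarized in the abstract, not the paper's implementation: the names (EventFrameScheduler, FramePacket, max_events_per_frame, min_period_us, max_period_us) and all numeric limits are assumptions chosen for illustration.

from dataclasses import dataclass


@dataclass
class FramePacket:
    # One event frame emitted over the sensor output interface.
    duration_us: int      # frame period chosen for this packet
    events_kept: int      # events packed into the frame
    events_dropped: int   # events discarded by the drop filter


class EventFrameScheduler:
    # Illustrative variable-frame-rate scheduler with an event drop filter
    # (names and limits are assumptions, not the paper's design).

    def __init__(self, min_period_us=1000, max_period_us=16000,
                 max_events_per_frame=50000):
        self.min_period_us = min_period_us                  # fastest frame (1 kfps)
        self.max_period_us = max_period_us                  # slowest frame (~60 fps)
        self.max_events_per_frame = max_events_per_frame    # output-bandwidth budget

    def pack(self, detected_events: int) -> FramePacket:
        # Choose a frame period from the event count, then cap the payload.
        if detected_events >= self.max_events_per_frame:
            # High activity: run at the fastest rate and drop the overflow.
            period, kept = self.min_period_us, self.max_events_per_frame
        else:
            # Low activity: stretch the period in inverse proportion to the
            # event count, clamped between the fastest and slowest rates.
            if detected_events == 0:
                period = self.max_period_us
            else:
                period = self.min_period_us * self.max_events_per_frame // detected_events
                period = max(self.min_period_us, min(self.max_period_us, period))
            kept = detected_events
        return FramePacket(period, kept, detected_events - kept)


if __name__ == "__main__":
    scheduler = EventFrameScheduler()
    for count in (500, 20000, 80000):   # simulated per-scan event counts
        print(count, scheduler.pack(count))

In the actual sensor such decisions would be made by on-chip logic; the sketch only shows how the frame period and the per-frame event count could trade off against a fixed output bandwidth.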
