Abstract

Multidimensional photography can capture optical fields beyond the capability of conventional image sensors that measure only two-dimensional (2D) spatial distribution of light. By mapping a high-dimensional datacube of incident light onto a 2D image sensor, multidimensional photography resolves the scene along with other information dimensions, such as wavelength and time. However, the application of current multidimensional imagers is fundamentally restricted by their static optical architectures and measurement schemes—the mapping relation between the light datacube voxels and image sensor pixels is fixed. To overcome this limitation, we propose tunable multidimensional photography through active optical mapping. A high-resolution spatial light modulator, referred to as an active optical mapper, permutes and maps the light datacube voxels onto sensor pixels in an arbitrary and programmed manner. The resultant system can readily adapt the acquisition scheme to the scene, thereby maximising the measurement flexibility. Through active optical mapping, we demonstrate our approach in two niche implementations: hyperspectral imaging and ultrafast imaging.

Highlights

  • Multidimensional photography can capture optical fields beyond the capability of conventional image sensors that measure only two-dimensional (2D) spatial distribution of light

  • The primary challenge for multidimensional photography is to enable the measurement with only evolutionary changes to standard image sensors, whose pixels are typically arranged in a 2D format

  • The lack of tunability limits the range of applicability of current multidimensional imagers in demanding tasks. To address this unmet need, we developed a tunable snapshot multidimensional photography approach through active optical mapping



Introduction

Multidimensional photography can capture optical fields beyond the capability of conventional image sensors, which measure only the two-dimensional (2D) spatial distribution of light. By mapping a high-dimensional datacube of incident light onto a 2D image sensor, multidimensional photography resolves the scene along with other information dimensions, such as wavelength and time. In a conventional sensor, the colour and angular information of light are averaged in the photon–electron conversion, and the fast temporal information is lost during pixel exposure and readout. Breaking these limitations and measuring photon tags in parallel has been the holy grail of multidimensional photography over the past several decades. Rather than exchanging a spatial axis for other light information, snapshot multidimensional imagers adopt more complicated methods to map light datacube voxels to a 2D detector array for simultaneous measurement. However, the application of current multidimensional imagers is fundamentally restricted by their static optical architectures and measurement schemes: the mapping relation between the light datacube voxels and image sensor pixels is fixed. To overcome this limitation, we propose tunable multidimensional photography through active optical mapping.
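The voxel-to-pixel mapping described above can be sketched numerically. The following is a minimal illustration, not the authors' optical implementation: it treats the active optical mapper as a programmable permutation that scatters the voxels of a toy (x, y, wavelength) datacube onto a 2D sensor, and shows that a known mapping can be inverted to recover the datacube. The array sizes and the use of a random permutation are assumptions chosen purely for demonstration.

```python
import numpy as np

# Toy datacube: 8x8 spatial points, 4 spectral channels -> 256 voxels.
# (Sizes are illustrative only.)
rng = np.random.default_rng(0)
nx, ny, nl = 8, 8, 4
datacube = rng.random((nx, ny, nl))

# Sensor with exactly enough pixels to hold every voxel (16x16 = 256).
sensor_shape = (16, 16)
assert np.prod(sensor_shape) == datacube.size

# A programmable mapping is modelled as a permutation of voxel indices;
# in the paper's scheme, a spatial light modulator realises the mapping
# optically and can be reprogrammed to suit the scene.
mapping = rng.permutation(datacube.size)

# "Acquisition": scatter datacube voxels onto sensor pixels.
sensor = np.empty(datacube.size)
sensor[mapping] = datacube.ravel()
sensor = sensor.reshape(sensor_shape)

# "Reconstruction": invert the known mapping to recover the datacube.
recovered = sensor.ravel()[mapping].reshape(nx, ny, nl)
assert np.allclose(recovered, datacube)
```

Because the mapping is held in software rather than fixed by the optics, swapping in a different permutation (e.g. one tailored to a hyperspectral or ultrafast acquisition scheme) requires no hardware change, which is the flexibility the paper's active optical mapper provides.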

