Long-range horizontal-path imaging through atmospheric turbulence is hampered by spatiotemporally random shifting and blurring of scene points in the recorded imagery. Although existing software-based mitigation strategies can produce sharp and stable imagery of static scenes, it remains highly challenging to mitigate turbulence in scenes with moving objects such that those objects remain visible as moving objects in the output. In this work, we investigate whether and how event (also called neuromorphic) cameras can be used for this challenge. We explore how the high temporal resolution of the event stream can be used to distinguish the apparent motion caused by turbulence from the true motion of physical objects in the scene. Building on this distinction, we propose an algorithm that reconstructs output image sequences in which the static background of the scene is turbulence-mitigated while the moving objects are preserved. The algorithm is demonstrated on indoor experimental recordings of moving objects imaged through artificially generated turbulence.
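To make the core idea concrete, the following is a minimal sketch, not the authors' published algorithm, of one way the event stream's fine temporal resolution could separate turbulence jitter from genuine object motion. The function name `classify_motion`, the polarity-balance heuristic, and all threshold values are illustrative assumptions: turbulence makes static edges oscillate in place, so positive and negative events at a pixel roughly cancel over a short window, whereas an object sweeping past a pixel leaves a sustained, polarity-imbalanced event trail.

```python
# Hedged sketch: separating turbulence jitter from object motion in an
# event stream. The heuristic and all thresholds are assumptions for
# illustration, not the method described in the paper.
import numpy as np

def classify_motion(x, y, t, p, shape, win=0.05, balance_thr=0.6, rate_thr=5):
    """Label pixels as object-motion-dominated or turbulence-dominated.

    x, y  : event pixel coordinates (int arrays)
    t     : event timestamps in seconds (sorted float array)
    p     : event polarities in {-1, +1}
    shape : (H, W) sensor resolution
    win   : analysis window length in seconds (assumed value)
    """
    sel = t >= t[-1] - win                    # events in the latest window
    counts = np.zeros(shape)
    net = np.zeros(shape)
    np.add.at(counts, (y[sel], x[sel]), 1)    # event rate per pixel
    np.add.at(net, (y[sel], x[sel]), p[sel])  # signed polarity sum per pixel

    active = counts >= rate_thr               # ignore near-silent pixels
    balance = np.zeros(shape)
    # |net| / count near 1: coherent one-sided contrast change (object edge
    # passing through); near 0: balanced back-and-forth jitter (turbulence).
    balance[active] = np.abs(net[active]) / counts[active]

    object_mask = active & (balance > balance_thr)
    turbulence_mask = active & ~object_mask
    return object_mask, turbulence_mask
```

An output frame could then be composed by using a temporally averaged (hence turbulence-mitigated) background at the turbulence-labeled pixels and the most recent intensity estimate at the object-labeled pixels; the actual segmentation and fusion steps in the paper may differ from this sketch.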