Abstract

Optical Scanning Holography (OSH) is a powerful technique that employs a single-pixel sensor and a row-by-row scanning mechanism to capture the hologram of a wide-view, three-dimensional object. However, the time required to acquire a hologram with OSH is rather lengthy. In this paper, we propose an enhanced framework, referred to as Adaptive OSH (AOSH), to shorten the holographic recording process. We demonstrate that the AOSH method is capable of decreasing the acquisition time by up to an order of magnitude, while favorably preserving the content of the hologram.

Highlights

  • Optical Scanning Holography (OSH) is a powerful technique that employs a single-pixel sensor and a row-by-row scanning mechanism to capture the hologram of a wide-view, three-dimensional object

  • We applied our proposed Adaptive OSH (AOSH) method to capture the hologram of object “A”, based on ∆MIN = 6 and ∆S = 6, so that the separation between scan lines is kept within the range [∆MIN, ∆S + ∆MIN] = [6, 12]

  • As the name implies, AOSH is an enhancement of the classical optical scanning holography technique


Summary

Adaptive Optical Scanning Holography

An effective solution to these problems was envisioned by Poon and Korpel in the 1970s [6] with a method known as Optical Scanning Holography (OSH) [7], which is capable of capturing holograms of wide-view object scenes. A straightforward way to lower the hologram capture time in OSH is to increase the spacing between the scan lines along the vertical direction [13]. The factors ∆MIN and ∆S provide a tradeoff between the number of rows to be scanned (which determines the scanning speed proportionally) and the quality of the hologram. If these two factors are large, fewer hologram lines will be scanned, resulting in a faster capture time but more degradation of the hologram.

[Fig. 7a] Reconstructed image of AOSH hologram “A”. [Fig. 7b] AOSH “B”: focused on the character “Light”. [Fig. 7c] AOSH “B”: focused on the character “Electricity”.
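The tradeoff above can be sketched in code. The following is a minimal illustration, not the paper's actual algorithm: the adaptation rule (widening the spacing when consecutive scanned rows differ little) and the linear interpolation of skipped rows are assumptions chosen for clarity; `scan_line`, `adaptive_scan`, and `reconstruct` are hypothetical names. It only shows how a spacing constrained to [∆MIN, ∆MIN + ∆S] reduces the number of rows acquired.

```python
import numpy as np

def adaptive_scan(scan_line, num_rows, d_min=6, d_s=6, threshold=0.05):
    """Adaptively choose which rows to acquire.

    scan_line(r) returns one hologram row, standing in for the
    single-pixel, row-by-row OSH acquisition.  The spacing between
    consecutive scanned rows stays within [d_min, d_min + d_s],
    e.g. [6, 12] for d_min = d_s = 6.
    """
    rows = {0: scan_line(0)}
    r, step = 0, d_min
    while r + step < num_rows:
        r += step
        rows[r] = scan_line(r)
        # Hypothetical adaptation rule: if the new row differs little
        # from the previously scanned row, widen the spacing to the
        # maximum; otherwise fall back to the minimum spacing.
        prev_row = rows[sorted(rows)[-2]]
        change = np.mean(np.abs(rows[r] - prev_row))
        step = d_min + d_s if change < threshold else d_min
    return rows

def reconstruct(rows, num_rows, num_cols):
    """Fill unscanned rows by linear interpolation between scanned ones."""
    scanned = sorted(rows)
    out = np.empty((num_rows, num_cols))
    for lo, hi in zip(scanned, scanned[1:]):
        for r in range(lo, hi):
            t = (r - lo) / (hi - lo)
            out[r] = (1 - t) * rows[lo] + t * rows[hi]
    out[scanned[-1]:] = rows[scanned[-1]]  # replicate below last scan line
    return out
```

Because the spacing never drops below ∆MIN, at most num_rows / ∆MIN rows are acquired, which is where the order-of-magnitude reduction in acquisition time comes from; the quality cost appears in the interpolated rows.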

Experimental Results
Conclusion
Author Contributions
Additional Information

