Abstract
Ptychography is an imaging method in which a coherent beam is scanned across an object and an image is obtained by iteratively phasing the resulting set of diffraction patterns. It can image extended objects at a resolution limited by the scattering strength of the object and the detector geometry, rather than by an optics-imposed limit. As technical advances allow larger fields of view to be imaged, computational challenges arise in reconstructing the correspondingly larger data volumes; at the same time, reconstructed images must be delivered immediately so that one can evaluate the next steps to take in an experiment. Here we present a parallel method for real-time ptychographic phase retrieval. It uses a hybrid parallel strategy to divide the computation between multiple graphics processing units (GPUs) and employs novel techniques to merge sub-datasets into a single complex phase and amplitude image. Results are shown for a simulated specimen and for a real dataset from an X-ray experiment conducted at a synchrotron light source.
Highlights
Traditional microscopes measure changes in specimen optical response via a lens-based direct mapping to pixels in a detector
While one can exceed this limit by using special properties of certain fluorophores in visible light microscopy [2,3,4,5], another approach is to record far-field diffraction patterns without optics-imposed resolution limits and use them to recover the structure of the specimen
In addition to the simulated test case, we evaluate performance on real data acquired at a synchrotron radiation facility, both to validate the method's applicability and performance with domain scientists and because such data are usually compromised by noise arising from thermal drifts in the motorized stages, fluctuations in beam intensity, and distortions in the diffraction patterns caused by air scattering, changes in sample temperature, and bad or missing detector pixels
Summary
Traditional microscopes measure changes in specimen optical response via a lens-based direct mapping to pixels in a detector. Far-field diffraction data, by contrast, are the squared modulus of the Fourier transform of the sample's exit wave function. This gives rise to the well-known phase problem: the (unmeasured) Fourier phases are needed for direct Fourier inversion to produce an image. Iteratively applying a set of constraints in both real and reciprocal space typically leads to convergence on a solution that reconciles the acquired data with the a priori constraints [12]. In this regard, reciprocal space is the domain in which the Fourier transform of the specimen's two-dimensional spatial function (real space) is represented. Extending ptychography to 3D comes at the expense of an increase in the computational and memory requirements of the phase retrieval software.
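The alternating application of real- and reciprocal-space constraints described above can be illustrated with a toy error-reduction loop. This is a minimal sketch of the general alternating-projection idea, not the parallel algorithm presented in the paper: it assumes a single diffraction pattern, a known support mask as the real-space constraint, and the measured Fourier magnitudes as the reciprocal-space constraint.

```python
import numpy as np

def error_reduction(measured_mag, support, n_iter=200, seed=0):
    """Toy alternating-projection phase retrieval (error reduction).

    measured_mag : measured Fourier magnitudes |F(object)|
    support      : boolean mask where the object may be nonzero
                   (the real-space constraint)
    """
    rng = np.random.default_rng(seed)
    # Start from the measured magnitudes with random phases.
    phase = rng.uniform(0.0, 2.0 * np.pi, measured_mag.shape)
    g = np.fft.ifft2(measured_mag * np.exp(1j * phase))
    for _ in range(n_iter):
        # Reciprocal-space constraint: keep the current phases,
        # replace magnitudes with the measured data.
        G = np.fft.fft2(g)
        G = measured_mag * np.exp(1j * np.angle(G))
        g = np.fft.ifft2(G)
        # Real-space constraint: confine the object to the support.
        g = g * support
    return g

# Simulated test: a small square object inside a known support.
n = 64
support = np.zeros((n, n), dtype=bool)
support[24:40, 24:40] = True
obj = np.zeros((n, n))
obj[28:36, 28:36] = 1.0
measured = np.abs(np.fft.fft2(obj))

recon = error_reduction(measured, support)
# Relative mismatch between recovered and measured Fourier magnitudes.
err = (np.linalg.norm(np.abs(np.fft.fft2(recon)) - measured)
       / np.linalg.norm(measured))
```

Ptychography improves on this single-pattern setup by scanning the beam so that adjacent illumination spots overlap, which replaces the support constraint with a much stronger redundancy constraint across many diffraction patterns.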