Abstract
Ptychography is a form of coherent diffractive imaging that reconstructs an object from its far-field intensity patterns. In ptychography, an important factor limiting the reconstructed image quality is uncertainty in the probe positions. Here, we propose a new approach that combines the hybrid input–output (HIO) algorithm and cross-correlation in a way that can correct the estimates of the probe positions. The performance and limitations of the method in the presence of noise, varying overlap, and maximum recoverable error are studied using simulations. A brief comparison with other existing methods is also discussed.
Highlights
Imaging an object by replacing lenses with algorithms is becoming extremely popular
This will create a mismatch in the overlapping region between the parts of the object O_{j-1}(r) and O_j(r) that correspond to neighbouring probe positions
We have proposed an alternative method to correct the probe positions in ptychography which is significantly different from the CC method
Summary
Imaging an object by replacing lenses with algorithms is becoming extremely popular. We have proposed a new method, based on the gradient of intensity patterns [14], that can correct the probe positions with sub-pixel accuracy while being less computationally expensive than the CC method. The method combines the well-known techniques of HIO and cross-correlation in a way that can correct probe positions in ptychography. This is possible mainly due to one important property: if the probe positions were correct, the overlapping parts of the object corresponding to neighbouring probe positions would coincide with each other (see figure 2). The performance of this method with varying parameters, and its limitations, are presented here.
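The core idea above — that a residual shift between overlapping object patches reveals a probe-position error — can be illustrated with a minimal cross-correlation sketch. This is not the paper's algorithm (which combines HIO with cross-correlation and intensity gradients); it is only a generic FFT-based shift estimator, with all names and the test image chosen here for illustration:

```python
import numpy as np

def estimate_shift(patch_a, patch_b):
    """Estimate the integer-pixel shift between two overlapping object
    patches via FFT-based cross-correlation.  If the probe positions
    were exact, the correlation peak would sit at zero shift; any
    offset of the peak indicates a probe-position error."""
    # Cross-correlation computed through the Fourier convolution theorem.
    xcorr = np.fft.ifft2(np.fft.fft2(patch_a) * np.conj(np.fft.fft2(patch_b)))
    peak = np.unravel_index(np.argmax(np.abs(xcorr)), xcorr.shape)
    # Wrap indices so the shift is signed (centred around zero).
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, xcorr.shape))

# Usage: displace a test image and recover the displacement.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, (3, -5), axis=(0, 1))
print(estimate_shift(shifted, img))  # -> (3, -5)
```

Sub-pixel accuracy, as claimed for the proposed method, would require refining this estimate (e.g. by interpolating around the correlation peak), which is omitted here for brevity.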