Abstract

In single-particle electron microscopy, the cross correlation method has been widely used to align noisy images from a given class. In this method, the aligned images are averaged to generate a high signal-to-noise ratio (SNR) representative for the class. The traditional cross correlation method searches a large space of rotations and translations, since no information about the alignment is known a priori, and it requires the user to provide an initial reference image. In contrast, our new version of the cross correlation method reduces the size of the search space by preprocessing the class images, and this preprocessing step circumvents the need for a reference image. During preprocessing, the centers of mass and the principal axes of the images within a class are aligned; averaging these coarsely aligned images produces a blurred version of the underlying image. This blurry image is then used in place of an initial reference image. Even though the initial alignment is coarse, the statistics of the resulting misalignment can be estimated well from the ergodic properties of the additive background noise. Using these statistical properties, a targeted search over the translations and rotations of the images is performed, which reduces computational time and increases alignment accuracy. Using synthetic data, we compare the new method to both the classical cross correlation approach and the maximum likelihood method, and demonstrate the resulting improvement in performance. This work was supported by NIH Grant R01GM075310.
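To make the preprocessing step concrete, the sketch below (Python with NumPy/SciPy, not code from the paper) shows one way to coarsely align a single class image by its center of mass and principal axis. The function name `coarse_align`, the moment-based principal-axis estimate, and the interpolation settings are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np
from scipy import ndimage

def coarse_align(image):
    """Coarse, reference-free alignment of one class image: translate its
    center of mass to the image center and rotate its principal axis onto
    the horizontal. A sketch of the preprocessing idea; names and
    conventions are illustrative."""
    ny, nx = image.shape
    mass = np.clip(image, 0.0, None)           # treat intensity as a mass density

    # First moments: shift the center of mass to the geometric center.
    cy, cx = ndimage.center_of_mass(mass)
    centered = ndimage.shift(image, (ny / 2.0 - cy, nx / 2.0 - cx), order=1)

    # Second central moments: the intensity "covariance" matrix whose
    # leading eigenvector gives the principal axis.
    yy, xx = np.mgrid[0:ny, 0:nx]
    dy, dx = yy - ny / 2.0, xx - nx / 2.0
    m = np.clip(centered, 0.0, None)
    cov = np.array([[(dx * dx * m).sum(), (dx * dy * m).sum()],
                    [(dx * dy * m).sum(), (dy * dy * m).sum()]]) / m.sum()
    evals, evecs = np.linalg.eigh(cov)
    vx, vy = evecs[:, np.argmax(evals)]        # principal-axis direction (x, y)
    angle = np.degrees(np.arctan2(vy, vx))

    # Rotate the principal axis to the horizontal (fixed only up to a
    # 180-degree flip, since the eigenvector sign is arbitrary).
    return ndimage.rotate(centered, angle, reshape=False, order=1)
```

Averaging such coarsely aligned images yields the blurred stand-in for a user-supplied reference, and the spread of the residual misalignments suggests how narrow the subsequent cross correlation search over rotations and translations can be made.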
