Abstract

Automated registration algorithms are developed for pairs of 2D X-ray mammographic images acquired from the two standard imaging angles, namely the craniocaudal (CC) and mediolateral oblique (MLO) views. A fully convolutional network, a type of convolutional neural network (CNN), is employed to generate a pixel-level deformation field that provides a mapping between masses in the two views. A novel distance-based regularization is introduced and contributes significantly to performance. The developed techniques are tested on real 2D mammographic images, slices from real 3D mammographic images, and synthetic mammographic images. Architectural variations of the neural network are investigated, and performance is characterized with respect to image resolution, breast density, lesion size, lesion subtlety, and lesion Breast Imaging Reporting and Data System (BI-RADS) category. Our network outperformed state-of-the-art CNN-based and non-CNN-based registration techniques and showed robust performance across a range of tissue and lesion characteristics. The proposed methods provide a useful automated tool for co-locating lesions between the CC and MLO views, even in challenging cases. Our methods can aid clinicians in establishing lesion correspondence quickly and accurately in dual-view X-ray mammography, improving diagnostic capability.
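
To make the core idea concrete, the sketch below illustrates, in PyTorch-style pseudocode, how a fully convolutional network can predict a per-pixel displacement field between two views and how a distance-based penalty on that field can be combined with an image-similarity term. The network architecture, the warping routine, the use of mean-squared error as the similarity measure, and the weighting `lam` are all illustrative assumptions for exposition; they are not the paper's actual design or loss.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RegistrationFCN(nn.Module):
    """Toy fully convolutional network: predicts a 2-channel (dx, dy)
    displacement field from a concatenated pair of single-channel views.
    (Illustrative architecture, not the one used in the paper.)"""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 2, 3, padding=1),  # per-pixel displacement field
        )

    def forward(self, moving, fixed):
        return self.net(torch.cat([moving, fixed], dim=1))

def warp(moving, field):
    """Resample `moving` with the predicted displacement field (in pixels)."""
    b, _, h, w = moving.shape
    ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
    base = torch.stack([xs, ys], dim=0).float().to(moving.device)  # (2, H, W)
    coords = base.unsqueeze(0) + field                             # displaced coords
    # Normalize coordinates to [-1, 1] as required by grid_sample.
    gx = 2.0 * coords[:, 0] / (w - 1) - 1.0
    gy = 2.0 * coords[:, 1] / (h - 1) - 1.0
    grid = torch.stack([gx, gy], dim=-1)                           # (B, H, W, 2)
    return F.grid_sample(moving, grid, align_corners=True)

def registration_loss(moving, fixed, field, lam=0.1):
    """Image-similarity term plus a distance-based regularizer that
    penalizes large per-pixel displacement magnitudes (an assumed form
    of the regularization; the paper's exact term may differ)."""
    similarity = F.mse_loss(warp(moving, field), fixed)
    distance_penalty = field.pow(2).sum(dim=1).mean()
    return similarity + lam * distance_penalty
```

As a usage example under these assumptions, `RegistrationFCN()(cc_view, mlo_view)` would produce a displacement field for a batch of single-channel image pairs, and `registration_loss` would then be minimized over the network parameters during training.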
