Abstract

Breast cancer is detected by screening mammography, in which X-rays are used to produce images of the breast. Screening mammograms can detect breast cancer early. This research addresses the challenges of diagnosing breast cancer from multi-view mammography. By examining multiple projections of each breast, the proposed attention-based feature-integration mechanism (AFIM) model analyzes cross-view data, concentrates on local abnormal regions associated with cancer, and displays the essential features considered for evaluation. The model comprises two attention components. The bi-lateral attention module (BAM) integrates the left- and right-breast activation maps from the same projection to create a spatial attention map that highlights the impact of asymmetries; it operates on the mediolateral oblique (MLO) and craniocaudal (CC) views acquired for each breast. The AFIM model thus combines spatial attention maps derived from the corresponding image of the other breast, to identify bilaterally uneven areas, with a class activation map (CAM) generated from the two views of the same breast, to emphasize the feature channels associated with a single lesion. The AFIM model can easily be incorporated into ResNet-style architectures to build multi-view classification models.
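The abstract does not give implementation details, so the following is a minimal sketch of how such cross-view attention could be wired onto ResNet-style feature maps, assuming PyTorch tensors of shape (B, C, H, W). The module names BilateralSpatialAttention and IpsilateralChannelAttention, the horizontal mirroring of the opposite breast, and the pooling-based channel weighting are illustrative assumptions, not the authors' exact AFIM/BAM design.

```python
import torch
import torch.nn as nn


class BilateralSpatialAttention(nn.Module):
    """Compares left/right feature maps of the same projection (CC or MLO)
    and produces a spatial attention map highlighting asymmetric regions."""

    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(2 * channels, 1, kernel_size=1)

    def forward(self, feat_main, feat_opposite):
        # Mirror the opposite breast horizontally so anatomy roughly aligns
        # (an illustrative assumption; alignment is not described in the abstract).
        mirrored = torch.flip(feat_opposite, dims=[-1])
        attn = torch.sigmoid(self.conv(torch.cat([feat_main, mirrored], dim=1)))
        return feat_main * attn  # emphasize bilaterally uneven areas


class IpsilateralChannelAttention(nn.Module):
    """Uses the CC and MLO views of the same breast to re-weight feature
    channels, loosely analogous to a CAM-style channel emphasis."""

    def __init__(self, channels, reduction=8):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(2 * channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, feat_cc, feat_mlo):
        # Global-average-pool both views, then derive shared channel weights.
        pooled = torch.cat(
            [feat_cc.mean(dim=(2, 3)), feat_mlo.mean(dim=(2, 3))], dim=1
        )
        weights = self.fc(pooled).unsqueeze(-1).unsqueeze(-1)
        return feat_cc * weights, feat_mlo * weights


# Example usage with hypothetical backbone features (batch 2, 256 channels, 14x14).
left_cc, right_cc = torch.randn(2, 256, 14, 14), torch.randn(2, 256, 14, 14)
left_mlo = torch.randn(2, 256, 14, 14)

bam = BilateralSpatialAttention(256)
cam = IpsilateralChannelAttention(256)

left_cc_asym = bam(left_cc, right_cc)          # cross-breast (bilateral) attention
left_cc_w, left_mlo_w = cam(left_cc, left_mlo)  # cross-view (CC/MLO) channel weighting
```

Under these assumptions, the attended feature maps could be concatenated or summed before the ResNet classification head, which is consistent with the abstract's claim that AFIM drops into ResNet-style architectures.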
