Abstract

Breast cancer is one of the most frequently diagnosed solid cancers, and mammography is the most commonly used screening technology for detecting it. Traditional machine learning methods for mammographic image classification or segmentation rely on hand-crafted features and require a large amount of manually annotated segmentation data to train and evaluate the model. Manual labeling, however, is expensive, time-consuming, and laborious, and greatly increases the cost of system construction. To reduce this cost and the workload of radiologists, we propose an end-to-end full-image mammogram classification method based on deep neural networks that can be trained without bounding-box or mask ground-truth labels. The only label required is the image-level classification, which is relatively easy to collect from diagnostic reports. Because breast lesions usually occupy only a small fraction of the area visualized in a mammogram, we propose pooling structures for convolutional neural networks (CNNs) that replace common pooling methods: the image is divided into regions, and the few regions with the highest probability of malignancy are selected to represent the whole mammographic image. The proposed pooling structures can be applied to most CNN-based models and may greatly improve their performance on mammographic image data with the same input. Experimental results on the publicly available INbreast and CBIS datasets indicate that the proposed pooling structures perform satisfactorily compared with previous state-of-the-art mammographic image classifiers and detection algorithms that use segmentation annotations.
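The region-based pooling described above can be sketched as follows: the backbone's feature map is treated as a grid of image regions, each region is scored for malignancy, and only the top-scoring regions are aggregated into the image-level prediction. Below is a minimal PyTorch-style sketch under these assumptions; the module name, scoring layer, and parameter values are hypothetical illustrations, not the paper's exact architecture.

```python
import torch
import torch.nn as nn


class TopKRegionPooling(nn.Module):
    """Hypothetical sketch of region-based top-k pooling.

    Each spatial cell of the CNN feature map is taken as an image region.
    Regions are scored for malignancy, and only the k highest-scoring
    regions are averaged to represent the whole mammographic image.
    """

    def __init__(self, in_channels: int, k: int = 4):
        super().__init__()
        self.k = k
        # A 1x1 convolution maps each spatial cell to a malignancy logit.
        self.region_scorer = nn.Conv2d(in_channels, 1, kernel_size=1)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # features: [batch, channels, height, width] from the CNN backbone.
        logits = self.region_scorer(features)        # [B, 1, H, W]
        logits = logits.flatten(start_dim=1)         # [B, H*W]
        # Keep only the k regions most indicative of malignancy.
        top_scores, _ = logits.topk(self.k, dim=1)   # [B, k]
        # Average the selected regions into one image-level logit.
        return top_scores.mean(dim=1)                # [B]


# Usage: pool a (fake) backbone feature map into per-image malignancy scores.
backbone_features = torch.randn(2, 512, 16, 16)
pooling = TopKRegionPooling(in_channels=512, k=4)
image_logits = pooling(backbone_features)            # shape: [2]
probabilities = torch.sigmoid(image_logits)
```

Because such a pooling head only consumes the backbone's feature map, it can be attached to most CNN classifiers in place of global average pooling without changing the input pipeline.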
