Abstract

We evaluate the use of convolutional neural networks (CNNs) to optimally fuse parenchymal complexity measurements generated by texture analysis into discriminative meta-features relevant for breast cancer risk prediction. With Institutional Review Board approval and Health Insurance Portability and Accountability Act compliance, we retrospectively analyzed "For Processing" contralateral digital mammograms (GE Healthcare 2000D/DS) from 106 women with unilateral invasive breast cancer and 318 age-matched controls. We coupled established texture features (histogram, co-occurrence, run-length, structural), extracted using a previously validated lattice-based strategy, with a multichannel CNN into a hybrid framework in which a multitude of texture feature maps are reduced to meta-features predicting case or control status. We evaluated the framework in a randomized split-sample setting, using the area under the curve (AUC) of the receiver operating characteristic (ROC) to assess case-control discriminatory capacity. We also compared the framework to CNNs fed directly with mammographic images, as well as to conventional texture analysis, in which texture feature maps are summarized via simple statistical measures that are then used as inputs to a logistic regression model. Strong case-control discriminatory capacity was demonstrated on the basis of the meta-features generated by the hybrid framework (AUC = 0.90), outperforming both CNNs applied directly to raw image data (AUC = 0.63, P < .05) and conventional texture analysis (AUC = 0.79, P < .05). Our results suggest that informative interactions between patterns exist in texture feature maps derived from mammographic images, and that these can be extracted and summarized via a multichannel CNN architecture, thereby leveraging the associations of textural measurements with breast cancer risk.
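
To illustrate the kind of architecture the abstract describes, the following is a minimal, hypothetical sketch (not the authors' code) of a multichannel CNN that takes lattice-based texture feature maps as input channels and reduces them to learned meta-features feeding a case-control classifier. The channel count, layer sizes, lattice-grid resolution, and meta-feature dimension are illustrative assumptions.

```python
# Hypothetical sketch of a hybrid texture-map / multichannel CNN framework.
# All sizes (8 texture maps, 63 x 63 lattice grid, 16 meta-features) are
# assumptions for illustration, not values reported in the study.
import torch
import torch.nn as nn

class TextureMetaFeatureCNN(nn.Module):
    def __init__(self, n_texture_maps: int = 8, n_meta_features: int = 16):
        super().__init__()
        # Each precomputed texture feature map (histogram, co-occurrence,
        # run-length, structural, ...) enters as one input channel.
        self.features = nn.Sequential(
            nn.Conv2d(n_texture_maps, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global pooling over the lattice grid
        )
        self.meta = nn.Linear(64, n_meta_features)        # learned meta-features
        self.classifier = nn.Linear(n_meta_features, 1)   # case vs. control logit

    def forward(self, x):
        h = self.features(x).flatten(1)
        meta = torch.relu(self.meta(h))
        return self.classifier(meta), meta

# Usage example: a batch of 4 subjects, each with 8 texture maps on a 63 x 63 grid.
model = TextureMetaFeatureCNN()
logits, meta_features = model(torch.randn(4, 8, 63, 63))
```

In this sketch the meta-features are simply the penultimate fully connected layer, trained end to end against case-control labels; the reported AUC comparison would then be computed on the held-out split of a randomized split-sample design.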
