Abstract

Lung cancer is the leading cause of cancer mortality in both men and women. Computer-aided detection (CAD) and diagnosis systems can play an important role in helping physicians with cancer diagnosis and treatment. This study proposes a hierarchical deep-fusion learning scheme in a CAD framework for the detection of nodules from computed tomography (CT) scans. In the proposed hierarchical approach, a decision is made at each level using the decisions from the previous level. Furthermore, individual decisions are computed for several perspectives of a volume of interest. This study explores three approaches to obtaining decisions in a hierarchical fashion. The first model uses raw images. The second model uses a single type of feature image capturing salient content. The third model employs multiple types of feature images. All models learn their parameters through supervised learning. The proposed CAD frameworks are tested on lung CT scans from the LIDC/IDRI database. The experimental results show that the proposed multi-perspective hierarchical fusion approach significantly improves classification performance. The proposed hierarchical deep-fusion learning model achieved a sensitivity of 95% with only 0.4 false positives per scan.
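
To make the multi-perspective hierarchical fusion idea concrete, the following is a minimal PyTorch sketch, not the paper's actual architecture: a hypothetical level-1 learner scores each 2-D perspective of a candidate volume of interest, and a hypothetical level-2 learner fuses those per-perspective decisions into a single nodule score. All module names, layer sizes, and input dimensions are illustrative assumptions.

```python
# Hypothetical sketch of hierarchical, multi-perspective decision fusion.
# Layer sizes, slice size (32x32), and number of perspectives are assumptions.
import torch
import torch.nn as nn

class PerspectiveCNN(nn.Module):
    """Level 1: scores a single 2-D perspective (e.g., an axial, coronal,
    or sagittal slice) of a volume of interest (VOI)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 8 * 8, 1)  # assumes 32x32 input slices

    def forward(self, x):
        h = self.features(x).flatten(1)
        return torch.sigmoid(self.classifier(h))  # per-perspective nodule score

class HierarchicalFusion(nn.Module):
    """Level 2: fuses the level-1 per-perspective decisions into a single
    nodule / non-nodule decision for the VOI."""
    def __init__(self, n_perspectives=3):
        super().__init__()
        self.perspective_nets = nn.ModuleList(
            [PerspectiveCNN() for _ in range(n_perspectives)]
        )
        self.fusion = nn.Linear(n_perspectives, 1)

    def forward(self, perspectives):  # list of (B, 1, 32, 32) tensors
        level1 = torch.cat(
            [net(p) for net, p in zip(self.perspective_nets, perspectives)],
            dim=1,
        )
        return torch.sigmoid(self.fusion(level1))  # fused VOI-level decision

# Example: three perspectives of a batch of 4 candidate VOIs
model = HierarchicalFusion(n_perspectives=3)
views = [torch.randn(4, 1, 32, 32) for _ in range(3)]
print(model(views).shape)  # torch.Size([4, 1])
```

In this sketch the second model variant described above would feed a single type of feature image (rather than raw slices) to each perspective network, and the third variant would add one perspective network per feature type before fusion.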
