Lung cancer is a leading cause of death worldwide, making early and accurate diagnosis essential for improving patient outcomes. Recently, deep learning (DL) has proven to be a powerful tool, significantly enhancing the accuracy of computer-aided pulmonary nodule detection (PND). In this study, we introduce a novel approach called the Omni-dimension Dynamic Residual 3D Net (ODR3DNet) for PND, which utilizes omni-dimensional (full-dimensional) dynamic 3D convolution, along with a specialized machine learning algorithm for detecting lung nodules in 3D point clouds. The primary goal of ODR3DNet is to overcome the limitations of conventional 3D Convolutional Neural Networks (CNNs), which often struggle with adaptability and have limited feature extraction capabilities. Our ODR3DNet algorithm achieves a high CPM (Competition Performance Metric) score of 0.885, outperforming existing mainstream PND algorithms and demonstrating its effectiveness. Through detailed ablation experiments, we confirm that the OD3D (omni-dimensional dynamic 3D convolution) module plays a crucial role in this performance boost, and we identify the optimal configuration for the algorithm. Moreover, we developed a dedicated machine learning detection algorithm tailored for lung 3D point cloud data. We outline the key steps for reconstructing the lungs in 3D and establish a comprehensive process for building a lung point cloud dataset, including data preprocessing, 3D point cloud conversion, and 3D volumetric box annotation. Experimental results validate the feasibility and effectiveness of our proposed approach.
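The abstract does not specify how the OD3D module is built; the sketch below is only a rough, hypothetical illustration of what an omni-dimensional dynamic 3D convolution could look like, following the general idea of learning attention weights over the spatial, input-channel, output-channel, and kernel dimensions of a bank of candidate kernels. The class name `OD3DConv` and hyperparameters such as `num_kernels` and `reduction` are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of an omni-dimensional dynamic 3D convolution block.
# Assumes the OD3D module mixes a bank of candidate 3D kernels using four
# attention heads (spatial, input channel, output channel, kernel index).
import torch
import torch.nn as nn
import torch.nn.functional as F


class OD3DConv(nn.Module):
    """Dynamic 3D convolution with attentions over four kernel dimensions (illustrative)."""

    def __init__(self, in_ch, out_ch, k=3, num_kernels=4, reduction=16):
        super().__init__()
        self.in_ch, self.out_ch, self.k, self.num_kernels = in_ch, out_ch, k, num_kernels
        # Bank of candidate 3D kernels; the forward pass mixes them per sample.
        self.weight = nn.Parameter(torch.randn(num_kernels, out_ch, in_ch, k, k, k) * 0.02)
        hidden = max(in_ch // reduction, 4)
        self.gap = nn.AdaptiveAvgPool3d(1)
        self.fc = nn.Sequential(nn.Linear(in_ch, hidden), nn.ReLU(inplace=True))
        # One attention head per dimension of the kernel space.
        self.attn_spatial = nn.Linear(hidden, k * k * k)
        self.attn_in = nn.Linear(hidden, in_ch)
        self.attn_out = nn.Linear(hidden, out_ch)
        self.attn_kernel = nn.Linear(hidden, num_kernels)

    def forward(self, x):
        b, c, d, h, w = x.shape
        ctx = self.fc(self.gap(x).flatten(1))  # global context vector, shape (B, hidden)
        a_sp = torch.sigmoid(self.attn_spatial(ctx)).view(b, 1, 1, 1, self.k, self.k, self.k)
        a_in = torch.sigmoid(self.attn_in(ctx)).view(b, 1, 1, self.in_ch, 1, 1, 1)
        a_out = torch.sigmoid(self.attn_out(ctx)).view(b, 1, self.out_ch, 1, 1, 1, 1)
        a_k = torch.softmax(self.attn_kernel(ctx), dim=1).view(b, self.num_kernels, 1, 1, 1, 1, 1)
        # Aggregate the kernel bank with all four attentions, per sample.
        w_dyn = (a_k * a_out * a_in * a_sp * self.weight.unsqueeze(0)).sum(dim=1)
        # Grouped-conv trick: fold the batch into groups so each sample uses its own kernel.
        x = x.reshape(1, b * c, d, h, w)
        w_dyn = w_dyn.reshape(b * self.out_ch, self.in_ch, self.k, self.k, self.k)
        y = F.conv3d(x, w_dyn, padding=self.k // 2, groups=b)
        return y.reshape(b, self.out_ch, d, h, w)


if __name__ == "__main__":
    # Toy CT patch: batch of 2, single channel, 32^3 voxels.
    block = OD3DConv(in_ch=1, out_ch=8)
    print(block(torch.randn(2, 1, 32, 32, 32)).shape)  # torch.Size([2, 8, 32, 32, 32])
```

In this sketch the per-sample dynamic kernels make the convolution input-adaptive, which is the property the abstract attributes to the OD3D module relative to conventional static 3D CNNs; the exact attention formulation used in ODR3DNet may differ.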