Background and purpose: Convolutional neural networks (CNNs) have achieved performance comparable to that of humans in automatic segmentation. An important challenge CNNs face in segmentation is catastrophic forgetting: they lose performance on previously learned tasks when trained on a new task. In this study, we propose a lifelong learning method to learn multiple segmentation tasks sequentially without forgetting previous tasks.

Materials and methods: The cohort comprised patients with three tumor types: 800 patients with nasopharyngeal cancer (NPC), 800 with breast cancer, and 800 with rectal cancer. The tasks were segmentation of the clinical target volume (CTV) of these three cancers. The proposed lifelong learning network adopted a dilation adapter to learn the three segmentation tasks one by one. Only the newly added dilation adapter (seven layers) was fine-tuned for each incoming task, whereas all previously learned layers were frozen.

Results: Compared with single-task, multi-task, or transfer learning, the proposed lifelong learning achieved better or comparable segmentation accuracy, with a DSC of 0.86 for NPC, 0.89 for breast cancer, and 0.87 for rectal cancer. Lifelong learning avoided forgetting in sequential learning and yielded good performance with less training data. Furthermore, it was more efficient than single-task or transfer learning, reducing the number of parameters, model size, and training time by ~58.8%, ~55.6%, and ~25.0%, respectively.

Conclusion: The proposed method preserves the knowledge of previous tasks while learning a new one using a dilation adapter. It can yield comparable performance with much less training data, fewer model parameters, and shorter training time.
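The core mechanism described in the methods (freezing all previously learned layers and fine-tuning only a newly added dilation adapter for each incoming task) could look roughly like the PyTorch sketch below. This is a minimal illustration only: the channel widths, dilation rates, and helper names (DilationAdapter, add_task) are assumptions for clarity, not the authors' implementation; the abstract specifies only that the adapter has seven layers.

```python
import torch
import torch.nn as nn

class DilationAdapter(nn.Module):
    """Small task-specific adapter built from dilated convolutions.

    The abstract states the adapter has seven layers; the channel width,
    kernel size, and dilation rates here are illustrative assumptions.
    """
    def __init__(self, channels: int = 64, num_layers: int = 7):
        super().__init__()
        layers = []
        for i in range(num_layers):
            dilation = 2 ** (i % 3)  # assumed cycling dilation rates 1, 2, 4
            layers.append(nn.Conv2d(channels, channels, kernel_size=3,
                                    padding=dilation, dilation=dilation))
            layers.append(nn.ReLU(inplace=True))
        self.body = nn.Sequential(*layers)

    def forward(self, x):
        return self.body(x)


def add_task(backbone: nn.Module, channels: int = 64) -> nn.Module:
    """Freeze all previously learned parameters and return a new adapter
    whose parameters are the only ones trained for the incoming task."""
    for p in backbone.parameters():
        p.requires_grad = False          # previously learned layers stay fixed
    return DilationAdapter(channels)


# Usage sketch: only the new adapter's parameters reach the optimizer,
# so earlier tasks are not overwritten (no catastrophic forgetting).
backbone = nn.Sequential(nn.Conv2d(1, 64, 3, padding=1), nn.ReLU())  # stand-in backbone
adapter = add_task(backbone, channels=64)
optimizer = torch.optim.Adam(adapter.parameters(), lr=1e-4)

x = torch.randn(2, 1, 128, 128)          # dummy image batch
features = backbone(x)                   # frozen shared features
segmentation_logits = adapter(features)  # task-specific output
```

Because only the adapter's parameters are updated per task, the shared backbone grows by a fixed, small amount for each new task, which is consistent with the reported savings in parameters, model size, and training time.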