Abstract

Cross-site datasets expand the amount of available data and can improve the disease-diagnosis capabilities of machine learning models. However, differences in data distribution across sites can lead to poor model generalizability. Transfer learning is the mainstream approach to this issue, but most transfer learning studies assume that all target samples are available during training, which does not hold in clinical applications where target samples arrive sequentially. Online transfer learning (OTL) addresses clinical diagnostic tasks by adaptively updating an ensemble model that combines source and target classifiers. However, OTL suffers from zero initialization and requires a large number of samples to converge, so it underperforms in clinical applications where only a few samples can be obtained. In this article, we propose a new framework, few-shot synthetic OTL (FSOTL), to address this issue. FSOTL uses synthetic data to warm up the model in an online fashion, which not only alleviates the scarcity of target-domain samples but also allows the model to acquire additional knowledge. Our experiments show that FSOTL is more stable and more accurate with few target samples, thereby offering a promising cross-site online computer-aided diagnosis system for large-scale applications.
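To make the general idea concrete, the following is a minimal Python sketch of an online-transfer-learning ensemble of the kind the abstract describes: a frozen source classifier and an incrementally trained target classifier are combined with Hedge-style multiplicative weights, and the target side is warmed up on a few synthetic samples before real target samples arrive. All names, the synthetic-data generator, and the update rule are illustrative assumptions and do not represent the authors' FSOTL algorithm.

```python
# Hypothetical sketch of an OTL ensemble with synthetic warm-up.
# Assumes scikit-learn; the data, generator, and update rule are toy choices.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
classes = np.array([0, 1])

# Frozen "source" classifier, pre-trained on source-site data.
X_src = rng.normal(size=(500, 5))
y_src = (X_src[:, 0] + 0.3 * rng.normal(size=500) > 0).astype(int)
source_clf = SGDClassifier(random_state=0).fit(X_src, y_src)

# Online "target" classifier, trained incrementally as samples arrive.
target_clf = SGDClassifier(random_state=0)

# Warm-up: a few synthetic target-like samples (here just shifted source
# samples, standing in for a real generator) so neither the target classifier
# nor the ensemble weights start from scratch.
X_syn = X_src[:50] + 0.5
y_syn = y_src[:50]
target_clf.partial_fit(X_syn, y_syn, classes=classes)

w = np.array([0.5, 0.5])   # ensemble weights: [source, target]
beta = 0.8                 # multiplicative penalty for a wrong expert

def ensemble_predict(x):
    """Weighted vote of the source and target classifiers on one sample."""
    votes = np.array([source_clf.predict(x)[0], target_clf.predict(x)[0]])
    score = np.dot(w, votes == 1) / w.sum()
    return int(score >= 0.5), votes

# Online loop over target-site samples arriving one at a time.
X_tgt = rng.normal(size=(200, 5)) + 0.5
y_tgt = (X_tgt[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(int)

correct = 0
for x, y in zip(X_tgt, y_tgt):
    x = x.reshape(1, -1)
    y_hat, votes = ensemble_predict(x)
    correct += int(y_hat == y)
    w *= np.where(votes == y, 1.0, beta)  # down-weight whichever expert erred
    target_clf.partial_fit(x, [y])        # update the target classifier online

print(f"online accuracy: {correct / len(y_tgt):.3f}")
```

In this toy setup the warm-up step is what distinguishes a few-shot-friendly variant from plain OTL: without it, the ensemble weights and the target classifier would start uninformed and need many streamed samples before contributing useful predictions.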
