Abstract

As the community progresses towards automated Universal Lesion Detection (ULD), it is vital that the techniques developed are robust and easily adaptable across datasets coming from different scanners, hospitals, and acquisition protocols. In practice, this remains a challenge due to the complexity of the different types of domain shift. In this paper, we address domain shift by proposing a novel domain adaptation framework for ULD. The proposed model transfers lesion knowledge from a large labeled source domain to detect lesions in a new target domain with minimal labeled samples. The method first aligns the feature distributions of the two domains by training a detector on the source domain with a supervised loss, and a discriminator on both the source and the unlabeled target domains with an adversarial loss. Subsequently, a few labeled target samples, together with labeled source samples, are used to adapt the detector through an overfitting-aware, periodic-gradient-update joint few-shot fine-tuning technique. Further, we use a self-supervision scheme to obtain high-confidence pseudo-labels on the unlabeled target domain, which are then used to train the detector in a semi-supervised manner and improve detection sensitivity. We evaluate the proposed approach on domain adaptation for lesion detection in CT scans, adapting a ULD network trained on the DeepLesion dataset to three target-domain datasets: LiTS, KiTS, and 3Dircadb. By combining adversarial, few-shot, and incremental semi-supervised training, our method achieves detection sensitivity comparable to previous few-shot and semi-supervised methods, as well as to an Oracle model trained directly on the labeled target domain.
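To make the two training signals concrete, the following is a minimal PyTorch sketch of an adversarial feature-alignment step and of confidence-thresholded pseudo-label selection of the kind described above. It is an illustration under assumed interfaces, not the authors' implementation: names such as DomainDiscriminator and det_head, the prediction format (dictionaries of boxes and scores), and the 0.9 confidence threshold are all assumptions, and the gradient-reversal layer stands in for whichever adversarial training scheme the paper actually uses.

    # Illustrative sketch only: module names, the detection-head interface,
    # and the prediction format are assumptions, not code from the paper.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class GradReverse(torch.autograd.Function):
        """Identity in the forward pass; negates the gradient in the backward
        pass, so the backbone is trained to fool the domain discriminator."""
        @staticmethod
        def forward(ctx, x):
            return x.view_as(x)

        @staticmethod
        def backward(ctx, grad_output):
            return -grad_output

    class DomainDiscriminator(nn.Module):
        """Predicts whether a feature map comes from the source or target domain."""
        def __init__(self, in_ch):
            super().__init__()
            self.net = nn.Sequential(
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(in_ch, 128), nn.ReLU(), nn.Linear(128, 1))

        def forward(self, feat):
            return self.net(feat)

    def alignment_loss(backbone, det_head, disc, src_imgs, src_targets, tgt_imgs):
        """Supervised detection loss on labeled source images plus an
        adversarial domain-confusion loss on source and target features."""
        src_feat = backbone(src_imgs)               # labeled source batch
        tgt_feat = backbone(tgt_imgs)               # unlabeled target batch
        det_loss = det_head(src_feat, src_targets)  # assumed: returns scalar loss

        src_logits = disc(GradReverse.apply(src_feat))
        tgt_logits = disc(GradReverse.apply(tgt_feat))
        adv_loss = (
            F.binary_cross_entropy_with_logits(src_logits, torch.ones_like(src_logits))
            + F.binary_cross_entropy_with_logits(tgt_logits, torch.zeros_like(tgt_logits)))
        return det_loss + adv_loss

    @torch.no_grad()
    def select_pseudo_labels(detector, tgt_imgs, conf_thresh=0.9):
        """Keep only high-confidence target-domain detections as pseudo-labels
        for the semi-supervised stage (threshold value is an assumption)."""
        detector.eval()
        preds = detector(tgt_imgs)  # assumed: list of {'boxes': Nx4, 'scores': N} dicts
        pseudo = []
        for p in preds:
            keep = p["scores"] > conf_thresh
            pseudo.append({"boxes": p["boxes"][keep],
                           "labels": torch.ones(int(keep.sum()), dtype=torch.long)})
        return pseudo

In a semi-supervised stage of this kind, the pseudo-labeled target images would be mixed into the detector's training batches alongside the few labeled target samples, with the confidence threshold trading off pseudo-label precision against the number of usable examples.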
