Abstract

In contrast to previous studies that focused on classical machine learning algorithms and hand-crafted features, we present an end-to-end neural network classification method able to accommodate lesion heterogeneity for improved oral cancer diagnosis using multispectral autofluorescence lifetime imaging (maFLIM) endoscopy. Our method uses an autoencoder framework jointly trained with a classifier and designed to mitigate overfitting on small datasets, which are common in healthcare applications. The autoencoder guides feature extraction through its reconstruction loss and enables the potential use of unlabeled data for domain adaptation and improved generalization. The classifier ensures that the extracted features are task-specific, providing discriminative information for the classification task. This data-driven feature extraction method automatically generates task-specific features directly from fluorescence decays, eliminating the need for iterative signal reconstruction. We validate our proposed neural network method against support vector machine (SVM) baselines, with our method showing a 6.5%-8.3% increase in sensitivity. Our results show that neural networks implementing data-driven feature extraction provide superior results and offer the capacity needed to target specific issues, such as inter-patient variability and the heterogeneity of oral lesions.

Clinical relevance: We improve standard classification algorithms for in vivo diagnosis of oral cancer lesions from maFLIM for clinical use in cancer screening, reducing unnecessary biopsies and facilitating early detection of oral cancer.
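To make the joint training described above concrete, the following is a minimal sketch (assuming PyTorch) of an autoencoder trained jointly with a classifier: the encoder maps a raw fluorescence decay to a latent feature vector, the decoder's reconstruction loss regularizes those features, and the classifier head keeps them discriminative. The layer sizes, the loss weight `lambda_rec`, and the input length are illustrative assumptions, not the authors' published configuration.

```python
import torch
import torch.nn as nn

class JointAEClassifier(nn.Module):
    """Autoencoder with a classification head sharing the latent features."""
    def __init__(self, decay_len=256, latent_dim=32, n_classes=2):
        super().__init__()
        # Encoder: raw fluorescence decay -> compact latent feature vector.
        self.encoder = nn.Sequential(
            nn.Linear(decay_len, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
        # Decoder: reconstructs the decay; its loss guides feature extraction.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, decay_len),
        )
        # Classifier head: keeps the latent features task-specific.
        self.classifier = nn.Linear(latent_dim, n_classes)

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), self.classifier(z)

def joint_loss(x, labels, model, lambda_rec=0.5):
    # lambda_rec (assumed value) balances the two objectives.
    recon, logits = model(x)
    rec = nn.functional.mse_loss(recon, x)              # reconstruction term
    clf = nn.functional.cross_entropy(logits, labels)   # discriminative term
    return clf + lambda_rec * rec

# Usage: one optimization step on a dummy batch of decays.
model = JointAEClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(8, 256)        # batch of 8 decays, 256 time samples each
y = torch.randint(0, 2, (8,))  # benign/malignant labels
loss = joint_loss(x, y, model)
loss.backward()
opt.step()
```

Note that the reconstruction term alone requires no labels, which is what makes the abstract's suggested use of unlabeled data for domain adaptation possible: unlabeled decays can be passed through only the encoder-decoder path and contribute `mse_loss` gradients.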
