Abstract

Purpose: Visual reading of 18F-florbetapir positron emission tomography (PET) scans is used in the diagnostic work-up of patients with cognitive disorders to assess amyloid-β (Aβ) depositions. However, this can be time-consuming, and difficult in cases of borderline amyloid pathology. Computer-aided pattern recognition can be helpful in this process but needs to be validated. The aim of this work was to develop, train, validate and test a convolutional neural network (CNN) for discriminating between Aβ-negative and Aβ-positive 18F-florbetapir PET scans in patients with subjective cognitive decline (SCD).

Methods: 18F-florbetapir PET images were acquired and visually assessed. The SCD cohort consisted of 133 patients from the SCIENCe cohort and 22 patients from the ADNI database. From the SCIENCe cohort, standardized uptake value ratio (SUVR) images were computed; from the ADNI database, SUVR images were extracted. 2D CNNs (axial, coronal and sagittal) were built to capture features of the scans. The SCIENCe scans were randomly divided into training and validation sets (5-fold cross-validation), and the ADNI scans were used as the test set. Performance was evaluated based on average accuracy, sensitivity and specificity from the cross-validation. Next, the best-performing CNN was evaluated on the test set.

Results: The sagittal 2D CNN classified the SCIENCe scans with the highest average accuracy of 99% ± 2 (SD), a sensitivity of 97% ± 7 and a specificity of 100%. The ADNI scans were classified with 95% accuracy, 100% sensitivity and 92.3% specificity.

Conclusion: The 2D CNN algorithm can classify Aβ-negative and Aβ-positive 18F-florbetapir PET scans with high performance in SCD patients.
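The SUVR images mentioned in the Methods are obtained by normalizing each voxel's uptake to the mean uptake in a reference region. A minimal numpy sketch of that voxel-wise normalization, with a toy volume and a hypothetical reference mask (not the cohort's actual processing pipeline):

```python
import numpy as np

def compute_suvr(pet_volume, reference_mask):
    """Convert a PET uptake volume to a standardized uptake value
    ratio (SUVR) image by dividing every voxel by the mean uptake
    in a reference region (e.g. cerebellar grey matter)."""
    ref_mean = pet_volume[reference_mask].mean()
    return pet_volume / ref_mean

# Toy 3D volume and reference mask (illustrative values only)
rng = np.random.default_rng(0)
pet = rng.uniform(0.5, 2.0, size=(4, 4, 4))
ref_mask = np.zeros_like(pet, dtype=bool)
ref_mask[0] = True  # pretend the first slab is the reference region

suvr = compute_suvr(pet, ref_mask)
# By construction, the mean SUVR inside the reference region is 1.0
```

Normalizing to a reference region removes inter-subject differences in injected dose and global tracer kinetics, so SUVR images from different patients become directly comparable.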

Highlights

  • Patients with subjective cognitive decline (SCD) are at increased risk for developing mild cognitive impairment (MCI), Alzheimer’s disease (AD) or other types of dementia [1, 2]

  • Convolutional neural networks (CNNs) have been effectively applied in 18F-FDG positron emission tomography (PET) neurodegeneration studies to discriminate between diagnostic groups and to identify patterns related to AD progression without the use of pre-defined volumes of interest (VOIs) [9]

  • There were no differences between the two qualified readers in the visual assessment of the Alzheimer’s Disease Neuroimaging Initiative (ADNI) test data. For this dataset, the sagittal model classified with an accuracy of 95.0%, a sensitivity of 100.0% and a specificity of 92.3%. The two qualified 18F-florbetapir readers gave an average confidence score of 4.6 ± 0.6, and the sagittal CNN scored the scans with an average probability of 0.95 ± 0.04
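The accuracy, sensitivity and specificity reported in the highlights follow directly from the binary confusion matrix. A minimal sketch of that computation; the example counts below are illustrative only, chosen to produce values of the same order as the ADNI result, and do not reproduce the paper's actual data:

```python
import numpy as np

def classification_metrics(y_true, y_pred):
    """Accuracy, sensitivity and specificity for a binary task
    (1 = amyloid-positive, 0 = amyloid-negative)."""
    y_true = np.asarray(y_true, dtype=bool)
    y_pred = np.asarray(y_pred, dtype=bool)
    tp = np.sum(y_true & y_pred)    # true positives
    tn = np.sum(~y_true & ~y_pred)  # true negatives
    fp = np.sum(~y_true & y_pred)   # false positives
    fn = np.sum(y_true & ~y_pred)   # false negatives
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn)    # true-positive rate
    specificity = tn / (tn + fp)    # true-negative rate
    return accuracy, sensitivity, specificity

# Hypothetical example: 20 scans, 7 positive, one false positive
y_true = [1] * 7 + [0] * 13
y_pred = [1] * 7 + [1] + [0] * 12
acc, sens, spec = classification_metrics(y_true, y_pred)
# → accuracy 0.95, sensitivity 1.0, specificity 12/13 ≈ 0.923
```

Reporting sensitivity and specificity separately matters here because the cohorts are imbalanced: with far more amyloid-negative than amyloid-positive scans, a high accuracy alone could hide poor detection of the positive class.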

Introduction

Patients with subjective cognitive decline (SCD) are at increased risk for developing mild cognitive impairment (MCI), Alzheimer’s disease (AD) or other types of dementia [1, 2]. Various computer-aided pattern recognition algorithms have been developed to evaluate and identify PET patterns associated with specific disease stages, based on 18F-fluoro-deoxyglucose (18F-FDG) brain PET images [7, 8]. These studies applied machine learning approaches using atlas-based anatomical volumes of interest (VOIs) for feature extraction to classify AD progression in PET images. In patients with SCD, feature extraction can be more difficult than in AD patients, since Aβ depositions can be subtle relative to non-specific background uptake. It is therefore of interest whether CNNs can be effectively applied in 18F-florbetapir PET studies in patients with SCD.
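Unlike the atlas-based VOI approaches above, a 2D CNN extracts features directly from the image slices through repeated 2D convolutions. A minimal numpy sketch of that core operation; the kernel here is a hand-picked gradient filter for illustration, not a learned filter from the paper's network:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation — the basic operation a
    CNN layer applies to each slice to produce a feature map."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Dot product of the kernel with the local image patch
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 4x4 "slice" with a uniform horizontal intensity ramp
slice_ = np.arange(16.0).reshape(4, 4)
edge_kernel = np.array([[1.0, -1.0]])  # horizontal gradient filter
features = conv2d(slice_, edge_kernel)
# Every response is -1.0: the filter reports the constant ramp slope
```

In a trained CNN these kernels are learned from the data, which is what lets the network pick up subtle Aβ-related uptake patterns without any pre-defined VOIs.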
