Abstract

Autism spectrum disorder (ASD) is a neurodevelopmental condition typically identified in children who display atypical patterns of social interaction, behavior, and communication. Despite extensive research efforts, its underlying causes and reliable biomarkers remain unknown. However, advances in artificial intelligence and machine learning have improved clinicians' ability to diagnose ASD. This review paper first investigates various MRI modalities to identify distinct features that characterize individuals with ASD compared to typical control subjects. The review then explores deep learning models for ASD diagnosis, including convolutional neural networks (CNNs), autoencoders, graph convolutional networks, attention networks, and other models. CNNs and their variants are particularly effective because of their capacity to learn structured image representations and identify reliable biomarkers for brain disorders. Modern computer vision models, including transformers, often build on CNN architectures and apply transfer learning techniques such as fine-tuning and layer freezing to enhance image classification performance, surpassing traditional machine learning models. This review contributes in three main ways. Firstly, it provides a comprehensive overview of a recommended architecture for using vision transformers in a systematic ASD diagnostic process; to this end, it examines pre-trained vision architectures such as VGG, ResNet, Inception, InceptionResNet, DenseNet, and Swin models that have been fine-tuned for ASD diagnosis and classification. Secondly, it discusses vision transformers of the 2020s, such as BiT, ViT, MobileViT, and ConvNeXt, together with the transfer learning methods applied to them, in relation to their prospective practicality for ASD classification. Thirdly, it explores brain transformers that are pre-trained on medically rich data and MRI neuroimaging datasets, and recommends a systematic architecture for ASD diagnosis using brain transformers. It also reviews recently developed brain transformer-based models, such as METAFormer, Com-BrainTF, Brain Network, ST-Transformer, STCAL, BolT, and BrainFormer, discussing their deep transfer learning architectures and their results in ASD detection. Additionally, the paper summarizes and discusses brain-related transformers for other brain disorders, such as MSGTN, STAGIN, and MedTransformer, in relation to their potential usefulness for ASD. The study suggests that developing specialized transformer-based models, following the success of transformers in natural language processing (NLP), can offer new directions for image classification problems in ASD brain biomarker learning and classification. By incorporating the attention mechanism, treating MRI modalities as sequence prediction tasks, pre-training on brain disorder classification problems, and fine-tuning on ASD datasets, brain transformers show great promise for ASD diagnosis.
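
As a concrete illustration of the transfer learning recipe discussed above (fine-tuning a pre-trained vision transformer with layer freezing for binary ASD versus control classification), the following PyTorch/torchvision sketch is provided. It is a minimal sketch, not an implementation from any of the reviewed papers: the ViT-B/16 backbone, the use of 2-D MRI slices replicated to three channels, the optimizer settings, and the dummy tensors are all illustrative assumptions.

```python
import torch
import torch.nn as nn
from torchvision.models import vit_b_16, ViT_B_16_Weights

# Load a ViT-B/16 backbone pre-trained on ImageNet (illustrative choice).
model = vit_b_16(weights=ViT_B_16_Weights.IMAGENET1K_V1)

# Layer freezing: keep the pre-trained backbone fixed during fine-tuning.
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head with a new 2-way output
# (ASD vs. typical control); only this head is trained.
model.heads = nn.Linear(model.hidden_dim, 2)

optimizer = torch.optim.AdamW(model.heads.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# Hypothetical batch: 2-D MRI slices resized to 224x224 and replicated
# to 3 channels so they match the pre-trained input format.
images = torch.randn(4, 3, 224, 224)
labels = torch.tensor([0, 1, 0, 1])  # 0 = typical control, 1 = ASD

model.train()
logits = model(images)           # shape: (4, 2)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()
```

Unfreezing the last few encoder blocks instead of only the head is a common middle ground between full fine-tuning and pure feature extraction, and is one of the layer-freezing variants the reviewed transfer learning approaches explore.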
