Abstract

Building upon pre-trained Vision Transformer (ViT) models, many advanced methods have achieved significant success in COVID-19 classification. Many studies pursue better performance by increasing model complexity and parameter counts. While such methods can enhance performance, they also require extensive computational resources and extended training times. In addition, overfitting caused by the limited size of COVID-19 datasets remains a persistent hurdle. To address these challenges, we propose a novel method to optimize pre-trained transformer models for efficient COVID-19 classification with stochastic configuration networks (SCNs), referred to as OPT-CO. We introduce two optimization strategies, sequential optimization (SeOp) and parallel optimization (PaOp), which combine optimizers in a sequential and a parallel manner, respectively. Our method enhances model performance without a significant increase in parameters. Additionally, we introduce OPT-CO-SCN, which mitigates overfitting by adopting random projection for head augmentation. Experiments on two publicly available datasets were carried out to evaluate the proposed model. The evaluation results show that our method achieves superior performance, surpassing other state-of-the-art methods.
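The abstract names two concrete mechanisms: the SeOp/PaOp optimizer combinations and the random-projection head of OPT-CO-SCN. A minimal sketch may help fix the idea; the choice of Adam and SGD, the switch epoch, the parameter split, and the `RandomProjectionHead` class below are illustrative assumptions, not the paper's exact recipe.

```python
import torch
import torch.nn as nn

# Stand-in for the classification head of a pre-trained ViT backbone;
# the feature dimension and class count are assumptions for illustration.
model = nn.Linear(768, 3)
criterion = nn.CrossEntropyLoss()

# SeOp (assumed form): run one optimizer for the early epochs, then switch.
def train_seop(model, loader, epochs=10, switch_epoch=5):
    opt_a = torch.optim.Adam(model.parameters(), lr=1e-4)
    opt_b = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
    for epoch in range(epochs):
        optimizer = opt_a if epoch < switch_epoch else opt_b
        for x, y in loader:
            optimizer.zero_grad()
            criterion(model(x), y).backward()
            optimizer.step()

# PaOp (assumed form): apply different optimizers to disjoint parameter
# groups within the same training step.
def train_paop(model, loader, epochs=10):
    opt_a = torch.optim.Adam([model.weight], lr=1e-4)
    opt_b = torch.optim.SGD([model.bias], lr=1e-3, momentum=0.9)
    for epoch in range(epochs):
        for x, y in loader:
            opt_a.zero_grad()
            opt_b.zero_grad()
            criterion(model(x), y).backward()
            opt_a.step()
            opt_b.step()

# OPT-CO-SCN head (assumed form): an SCN-style head whose random projection
# weights stay fixed, with only the final linear layer trained.
class RandomProjectionHead(nn.Module):
    def __init__(self, in_dim=768, proj_dim=256, num_classes=3):
        super().__init__()
        self.proj = nn.Linear(in_dim, proj_dim)
        for p in self.proj.parameters():
            p.requires_grad = False  # frozen random projection
        self.act = nn.Tanh()
        self.fc = nn.Linear(proj_dim, num_classes)

    def forward(self, x):
        return self.fc(self.act(self.proj(x)))
```

Freezing the projection keeps the number of trainable head parameters small, which is consistent with the abstract's goal of improving performance without a significant parameter expansion.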
