Brain disorders pose a substantial global health challenge and remain a leading cause of mortality worldwide. Electroencephalogram (EEG) analysis is crucial for diagnosing brain disorders, yet interpreting complex EEG signals and reaching accurate diagnoses can be challenging for medical practitioners. To address this, our study focuses on visualizing complex EEG signals in a format readily interpretable by both medical professionals and deep learning algorithms. We propose a novel time–frequency (TF) transform called the Forward–Backward Fourier transform (FBFT) and use convolutional neural networks (CNNs) to extract meaningful features from TF images and classify brain disorders. We also introduce the concept of naked-eye classification, which integrates domain-specific knowledge and clinical expertise into the classification process. Our study demonstrates the effectiveness of the FBFT method: CNN-based classification achieves accuracies of 99.82% for epilepsy, 95.91% for Alzheimer’s disease (AD), 85.1% for murmur, and 100% for mental stress. Furthermore, naked-eye classification achieves accuracies of 78.6%, 71.9%, 82.7%, and 91.0% for epilepsy, AD, murmur, and mental stress, respectively. Additionally, we incorporate a mean correlation coefficient (mCC) based channel selection method to further enhance classification accuracy. By combining these approaches, our study enhances the visualization of EEG signals, providing medical professionals with a deeper understanding of TF medical images. This research has the potential to bridge the gap between image classification and visual medical interpretation, leading to better disease detection and improved patient care in the field of neuroscience.
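
The abstract does not spell out how the mCC-based channel selection operates. As an illustration only, the minimal sketch below assumes that "mean correlation coefficient" refers to ranking each EEG channel by its average absolute Pearson correlation with all other channels and retaining the top-ranked channels; the function name `mcc_channel_ranking` and the `top_k` parameter are hypothetical and not taken from the paper.

```python
import numpy as np

def mcc_channel_ranking(eeg, top_k=8):
    """Rank EEG channels by mean correlation coefficient (mCC).

    eeg : ndarray of shape (n_channels, n_samples)
    Returns the indices of the top_k channels with the highest mean
    absolute Pearson correlation to all other channels, plus the scores.
    """
    corr = np.corrcoef(eeg)                 # channel-by-channel Pearson correlations
    np.fill_diagonal(corr, np.nan)          # exclude self-correlation from the mean
    mcc = np.nanmean(np.abs(corr), axis=1)  # mean |r| per channel
    ranked = np.argsort(mcc)[::-1][:top_k]  # highest-mCC channels first
    return ranked, mcc

# Usage on synthetic data: a 19-channel recording, 10 s at 256 Hz
rng = np.random.default_rng(0)
eeg = rng.standard_normal((19, 2560))
selected, scores = mcc_channel_ranking(eeg, top_k=8)
print(selected, scores[selected])
```

Under this reading, the selected channels would feed the FBFT/TF imaging stage, so that only the most mutually consistent electrodes contribute to the images passed to the CNN.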