Abstract
In recent years, depression recognition using physiological signals has made notable progress, but mild depression recognition is still in its infancy. Early detection can prevent the progression of depression, and combining multiple modalities has proved effective for detecting mental disorders. Electroencephalography (EEG) and eye movements (EM) are widely used to identify depression. However, models that detect mental illness from physiological signals often generalize poorly because of individual differences. To address these problems, this paper proposes a hybrid fusion model based on a deep belief network (DBN) and secondary classifiers, called HFMBDSC. The model first uses an unsupervised DBN to fuse EEG features (linear features, nonlinear features, and network features) at the feature level; the DBN transforms the EEG features into another representation that mitigates the effects of individual differences and yields DBN features that describe the EEG information more comprehensively. In the subsequent decision-level fusion step, the DBN features and EM features jointly make the final decision using the three classifiers that perform best on each single modality. The results show that the DBN improves classification accuracy, and visual analysis of the feature space confirms that it effectively reduces the influence of individual differences in the EEG features. The classification performance of HFMBDSC is significantly better than the traditional single-modality results, with a highest accuracy of 89.54% under 10-fold cross-validation. These results suggest that mild depression recognition based on HFMBDSC is promising.
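The following is a minimal sketch of the two-stage fusion pipeline summarized above, not the authors' implementation. It assumes scikit-learn's BernoulliRBM as a stand-in for the unsupervised DBN, assumes KNN, SVM, and logistic regression as the three secondary classifiers, uses averaged class probabilities for the decision-level fusion, and runs on synthetic feature matrices in place of real EEG/EM data.

```python
# Hedged sketch of HFMBDSC-style hybrid fusion (illustrative assumptions only).
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 200
eeg = rng.normal(size=(n, 60))   # concatenated linear, nonlinear, and network EEG features (synthetic)
em = rng.normal(size=(n, 20))    # eye-movement (EM) features (synthetic)
y = rng.integers(0, 2, size=n)   # 0 = control, 1 = mild depression (synthetic labels)

# Feature-level fusion: an unsupervised DBN (approximated here by two stacked RBMs)
# re-encodes the EEG features into a representation less sensitive to individual differences.
dbn = Pipeline([
    ("scale", MinMaxScaler()),   # RBMs expect inputs in [0, 1]
    ("rbm1", BernoulliRBM(n_components=40, learning_rate=0.05, n_iter=20, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=20, learning_rate=0.05, n_iter=20, random_state=0)),
])
eeg_dbn = dbn.fit_transform(eeg)  # "DBN features"

X_tr_eeg, X_te_eeg, X_tr_em, X_te_em, y_tr, y_te = train_test_split(
    eeg_dbn, em, y, test_size=0.2, random_state=0)

# Decision-level fusion: three classifiers (assumed: KNN, SVM, logistic regression)
# are trained per modality and their predicted probabilities are averaged.
def modality_proba(X_train, X_test, y_train):
    clfs = [
        KNeighborsClassifier(n_neighbors=5),
        SVC(kernel="rbf", probability=True, random_state=0),
        LogisticRegression(max_iter=1000),
    ]
    probas = [clf.fit(X_train, y_train).predict_proba(X_test)[:, 1] for clf in clfs]
    return np.mean(probas, axis=0)

p_fused = (modality_proba(X_tr_eeg, X_te_eeg, y_tr)
           + modality_proba(X_tr_em, X_te_em, y_tr)) / 2
y_pred = (p_fused >= 0.5).astype(int)
print(f"fused accuracy: {accuracy_score(y_te, y_pred):.3f}")
```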