Abstract
Deep learning-based approaches for automatic depression recognition offer advantages of low cost and high efficiency. However, depression symptoms are challenging to detect and vary significantly between individuals. Traditional deep learning methods often struggle to capture and model these nuanced features effectively, leading to lower recognition accuracy. This paper introduces a novel multimodal depression recognition method, HYNMDR, which utilizes hypergraphs to represent the complex, high-order relationships among patients with depression. HYNMDR comprises two primary components: a temporal embedding module and a hypergraph classification module. The temporal embedding module employs a temporal convolutional network and a negative sampling loss function based on Euclidean distance to extract feature embeddings from unimodal and cross-modal long time-series data. To capture the unique ways in which depression may manifest in certain feature elements, the hypergraph classification module introduces a threshold segmentation-based hyperedge construction method. To the best of our knowledge, HYNMDR is the first attempt to apply hypergraph neural networks to multimodal depression recognition. Experimental evaluations on the DAIC-WOZ and E-DAIC datasets demonstrate that HYNMDR outperforms existing methods in automatic depression monitoring, achieving an F1 score of 91.1% and an accuracy of 94.0%.
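The abstract mentions a threshold segmentation-based hyperedge construction method but does not spell out its mechanics. A minimal sketch of one plausible reading, assuming that for each feature element a threshold splits the patient samples into groups and each group becomes a hyperedge (the function name, the per-feature median threshold, and the two-group split are illustrative assumptions, not the paper's actual procedure):

```python
import numpy as np

def build_hyperedges(X, thresholds=None):
    """Hypothetical threshold-based hyperedge construction.

    For each feature dimension j, samples whose value exceeds a
    per-feature threshold form one hyperedge and the remaining
    samples form another. Returns an incidence matrix H of shape
    (n_samples, n_hyperedges), the standard input to a hypergraph
    neural network layer.
    """
    n_samples, n_features = X.shape
    if thresholds is None:
        # assumption: use the per-feature median as the segmentation threshold
        thresholds = np.median(X, axis=0)
    edges = []
    for j in range(n_features):
        high = X[:, j] > thresholds[j]
        low = ~high
        for mask in (high, low):
            if mask.sum() >= 2:  # a hyperedge should connect at least two nodes
                edges.append(mask.astype(float))
    return np.stack(edges, axis=1)

# toy example: 5 samples, 2 feature elements
X = np.array([[0.1, 2.0],
              [0.9, 1.5],
              [0.2, 0.3],
              [0.8, 0.4],
              [0.5, 1.0]])
H = build_hyperedges(X)  # incidence matrix, one column per hyperedge
```

The resulting incidence matrix groups patients who share extreme values on the same feature element, which is consistent with the abstract's motivation that depression manifests distinctively in certain feature elements.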