Abstract

Accurate detection of exercise fatigue from physiological signals is vital for safe and effective physical activity. Existing studies widely use electrocardiogram (ECG) signals for exercise monitoring; however, ECG signals can be corrupted by sweat or loose electrode contact. As a non-invasive alternative, phonocardiogram (PCG) signals strongly reflect cardiovascular information, which is closely related to physical state. Therefore, a novel PCG-based detection method is proposed, in which the fusion of deep-learning features and linear features is the key to improving fatigue-detection performance. Specifically, the Short-Time Fourier Transform (STFT) is employed to convert 1D PCG signals into 2D images, which are fed into a pre-trained convolutional neural network (VGG-16) for feature learning. The fused features are then constructed by concatenating the VGG-16 output features with PCG linear features. Finally, the concatenated features are passed to a Support Vector Machine (SVM) and Linear Discriminant Analysis (LDA) to distinguish six levels of exercise fatigue. Experimental results on two datasets show that the proposed method achieves, at best, 91.47% and 99.00% accuracy, 91.49% and 99.09% F1-score, and 90.99% and 99.07% sensitivity, which is comparable to an ECG-based system serving as the gold standard (94.32% accuracy, 94.33% F1-score, 94.52% sensitivity).
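The pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the sampling rate, STFT window, choice of linear features, and the VGG-16 output (replaced here by a random stand-in vector) are all assumptions, since the abstract does not specify them.

```python
import numpy as np
from scipy.signal import stft

def pcg_to_spectrogram(pcg, fs=2000, nperseg=256):
    """Convert a 1D PCG segment into a 2D log-magnitude STFT image
    (the image that would be resized and fed to VGG-16)."""
    _, _, Z = stft(pcg, fs=fs, nperseg=nperseg)
    return np.log1p(np.abs(Z))  # shape: (freq bins, time frames)

def linear_features(pcg):
    """Hypothetical time-domain linear features; the paper's exact
    feature set is not given in the abstract."""
    return np.array([pcg.mean(), pcg.std(), np.abs(pcg).max(),
                     np.mean(np.abs(np.diff(pcg)))])

def fuse(deep_features, lin_features):
    """Feature fusion by simple concatenation, as described in the text."""
    return np.concatenate([deep_features, lin_features])

# Demo on a synthetic signal standing in for a real heart-sound recording.
rng = np.random.default_rng(0)
pcg = rng.standard_normal(4000)            # 2 s at the assumed 2 kHz
img = pcg_to_spectrogram(pcg)              # 2D STFT image for VGG-16
deep = rng.standard_normal(4096)           # stand-in for a VGG-16 fc-layer output
fused = fuse(deep, linear_features(pcg))   # vector sent on to SVM / LDA
print(fused.shape)  # (4100,)
```

In practice the fused vector would be passed to a trained SVM or LDA classifier (e.g. scikit-learn's `SVC` or `LinearDiscriminantAnalysis`) to predict one of the six fatigue levels.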
