Abstract
The conventional diagnostic process and tools for cardiovascular autonomic neuropathy (CAN) can readily identify the two extreme categories of the condition: severe/definite CAN and normal/healthy without CAN. However, conventional techniques face significant challenges in identifying CAN at its early or atypical stages because the collected clinical multimodal data, including electrocardiogram (ECG) data from ECG sensors and blood chemistry, podiatry, and endocrinology features, are inherently imbalanced and incomplete. As a result, most detection tools and techniques are limited to binary CAN classification. Yet early diagnosis of CAN, or diagnosis of its atypical stages, is more important than the diagnosis of severe CAN, which can be identified readily from a few diagnostic reports. In this paper, we propose a novel multi-class classification approach for timely CAN detection. The proposed algorithm builds a multistage fusion model that combines feature selection with multimodal feature fusion. A performance criterion-based feature selection technique is developed to ensure that only highly significant features are retained, and a multimodal feature fusion technique combines deep-learning-derived features with the selected original features. Experimental results on a large CAN dataset indicate that the proposed algorithm significantly improves diagnostic accuracy compared with conventional Ewing battery features, and identifies the early or atypical stages of CAN with an AUC score of 0.931 under leave-one-out cross-validation.
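To illustrate the multistage pipeline described above, the following is a minimal sketch, not the authors' implementation: the accuracy threshold, the random-forest classifier, the autoencoder-style feature extractor, and the simulated three-class CAN data are all assumptions introduced here for illustration. It shows the three stages in order: performance-criterion-based feature selection, fusion of learned features with selected original features, and multi-class evaluation with leave-one-out cross-validation and AUC.

```python
# Hypothetical sketch of a multistage fusion pipeline (assumptions: thresholds,
# model choices, and simulated data are illustrative, not the paper's method).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 20))     # simulated multimodal clinical features
y = rng.integers(0, 3, size=120)   # 3 classes: no CAN / early-atypical / definite

# Stage 1: performance-criterion-based feature selection --
# keep features whose single-feature classifier clears an accuracy threshold (assumed criterion).
def select_features(X, y, threshold=0.40):
    keep = []
    for j in range(X.shape[1]):
        score = cross_val_score(
            RandomForestClassifier(n_estimators=50, random_state=0),
            X[:, [j]], y, cv=3).mean()
        if score >= threshold:
            keep.append(j)
    return keep

selected = select_features(X, y)

# Stage 2: multimodal feature fusion --
# concatenate learned (autoencoder-style) features with the selected original features.
scaler = StandardScaler().fit(X)
Xs = scaler.transform(X)
encoder = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(Xs, Xs)
deep_feats = np.maximum(Xs @ encoder.coefs_[0] + encoder.intercepts_[0], 0)  # hidden activations
X_fused = np.hstack([deep_feats, X[:, selected]])

# Stage 3: multi-class classification, evaluated with leave-one-out CV and multi-class AUC.
probs = np.zeros((len(y), 3))
for train, test in LeaveOneOut().split(X_fused):
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_fused[train], y[train])
    probs[test] = clf.predict_proba(X_fused[test])
print("LOO multi-class AUC:", roc_auc_score(y, probs, multi_class="ovr"))
```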