Abstract

Diagnosis of esophageal motility disorders in clinical practice relies on the traditional method of high-resolution manometry (HRM). However, the large volume of raw swallow data produced by HRM makes it difficult for physicians to interpret and classify patients with esophageal symptoms. Modeling the propagation of contractile vigor is therefore useful for recognizing esophageal contraction patterns in large-scale HRM recordings. In this paper, we learn a discriminative vigor propagation representation using deep learning. Furthermore, we design an efficient contractile vigor propagation (CVP) graph that captures both contraction and pressure propagation between vigor regions in HRM images. Using a graph attention network (GAT), the edges of the CVP graph automatically learn contraction-pattern trends (i.e., local temporal information) in the time series and propagation (i.e., global information) through attention units. The attention layer leverages short-term trends to improve prediction accuracy. Quantitative experiments show that the proposed method achieves high accuracy in esophageal contraction pattern recognition and outperforms existing traditional methods. We also visualize the learned vigor-specific propagation patterns in the contractile features, showing that the proposed CVP-GAT produces interpretable propagation information for esophageal contraction pattern recognition.
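
As a rough illustration of the kind of model the abstract describes, and not the authors' CVP-GAT implementation, the sketch below shows a minimal graph-attention classifier over per-node vigor features using PyTorch Geometric. The graph construction, feature dimensions, class count, and the `VigorGAT` name are all assumptions for illustration only.

```python
# Minimal sketch of a graph-attention classifier for contraction pattern
# recognition. NOT the paper's CVP-GAT: the graph construction, feature
# sizes, and number of classes below are placeholder assumptions.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GATConv, global_mean_pool


class VigorGAT(torch.nn.Module):
    def __init__(self, in_dim=16, hidden_dim=32, num_classes=4, heads=4):
        super().__init__()
        # Attention heads on the edges learn how contractile vigor
        # propagates between neighboring nodes of the graph.
        self.gat1 = GATConv(in_dim, hidden_dim, heads=heads)
        self.gat2 = GATConv(hidden_dim * heads, hidden_dim, heads=1)
        self.classifier = torch.nn.Linear(hidden_dim, num_classes)

    def forward(self, x, edge_index, batch):
        x = F.elu(self.gat1(x, edge_index))
        x = F.elu(self.gat2(x, edge_index))
        x = global_mean_pool(x, batch)      # one embedding per swallow graph
        return self.classifier(x)           # logits over contraction patterns


# Toy example: 5 nodes (e.g., pressure segments along the esophagus) chained
# to mimic proximal-to-distal propagation.
x = torch.randn(5, 16)                      # per-node vigor features (assumed)
edge_index = torch.tensor([[0, 1, 2, 3],
                           [1, 2, 3, 4]], dtype=torch.long)
batch = torch.zeros(5, dtype=torch.long)    # all nodes belong to one graph
model = VigorGAT()
logits = model(x, edge_index, batch)        # shape: [1, num_classes]
```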
