Abstract
Background and Objective: Physiological time series are a common data source in many health applications. Mining physiological time series is crucial for promoting healthy living and reducing governmental medical expenditure. Recently, research on and applications of deep learning methods for physiological time series have developed rapidly, because such data can be recorded continuously by smart wristbands or smartwatches. However, existing deep learning methods suffer from excessive model complexity and a lack of explainability. This paper aims to address these issues.

Methods: We propose TEG-net, a novel deep learning method for accurately diagnosing and explaining physiological time series. TEG-net constructs T-net (a multi-scale bi-directional temporal convolutional neural network) to model physiological time series directly, E-net (a personalized linear model) to model expert features extracted from physiological time series, and G-net (a gating neural network) to combine T-net and E-net for diagnosis. Combining T-net and E-net through G-net improves diagnostic accuracy, and E-net can be used for explanation.

Results: Experimental results demonstrate that TEG-net outperforms the second-best baseline by 13.68% in terms of area under the receiver operating characteristic curve and by 11.49% in terms of area under the precision-recall curve. In addition, intuitive justifications can be provided to explain model predictions.

Conclusions: This paper develops an ensemble method that combines expert features with deep learning for modeling physiological time series. The improvements in diagnostic accuracy and explainability make TEG-net applicable to many real-world health applications.
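For intuition only, the sketch below shows one common way a gating network can combine the predictions of two branches (one scoring the raw series, one scoring expert features) into a single per-sample prediction. It is a minimal PyTorch illustration under assumed shapes and a single-layer sigmoid gate, not the TEG-net architecture itself; the class and argument names are hypothetical.

```python
import torch
import torch.nn as nn


class GatedEnsemble(nn.Module):
    """Minimal sketch of gating two diagnostic branches (illustrative only).

    Assumes each sample's raw series is already flattened to a vector of
    length `series_dim` and its expert features to a vector of length
    `feature_dim`; both branch networks output a probability-like score.
    """

    def __init__(self, t_net: nn.Module, e_net: nn.Module,
                 series_dim: int, feature_dim: int):
        super().__init__()
        self.t_net = t_net  # branch over the raw time series
        self.e_net = e_net  # branch over expert features
        # Gate maps the concatenated inputs to a weight in (0, 1).
        self.gate = nn.Sequential(
            nn.Linear(series_dim + feature_dim, 1),
            nn.Sigmoid(),
        )

    def forward(self, series: torch.Tensor, features: torch.Tensor) -> torch.Tensor:
        p_t = self.t_net(series)      # prediction from the raw series branch
        p_e = self.e_net(features)    # prediction from the expert-feature branch
        g = self.gate(torch.cat([series, features], dim=-1))
        return g * p_t + (1.0 - g) * p_e  # per-sample convex combination


# Example usage with toy linear branches (purely illustrative sizes):
t_net = nn.Sequential(nn.Linear(512, 1), nn.Sigmoid())
e_net = nn.Sequential(nn.Linear(20, 1), nn.Sigmoid())
model = GatedEnsemble(t_net, e_net, series_dim=512, feature_dim=20)
scores = model(torch.randn(8, 512), torch.randn(8, 20))  # shape (8, 1)
```

Because the gate produces a per-sample weight, the ensemble can lean on the raw-series branch when its signal is informative and fall back to the expert-feature branch otherwise, which is the general idea behind combining the two models through a gating network.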