Abstract

Respiratory conditions such as asthma and sleep apnea affect a wide range of people. Smartwatches equipped with photoplethysmogram (PPG) sensors can monitor breathing, but current methods are limited by manual parameter tuning and pre-defined features. To address this challenge, we propose PPG2RespNet, a deep-learning framework inspired by the UNet and UNet++ models. It uses three publicly available PPG datasets (VORTAL, BIDMC, Capnobase) to autonomously and efficiently extract respiratory signals. These datasets contain PPG data from different groups, including intensive care unit patients, pediatric patients, and healthy subjects. Unlike conventional U-Net architectures, PPG2RespNet introduces layered skip connections, establishing hierarchical and dense connections for robust signal extraction. The bottleneck layer is also modified to improve the extraction of latent features. We evaluated PPG2RespNet on both respiratory-signal reconstruction and respiration-rate estimation. The model outperformed other models in signal-to-signal synthesis, achieving high Pearson correlation coefficients (PCCs) with ground-truth respiratory signals: 0.94 for BIDMC, 0.95 for VORTAL, and 0.96 for Capnobase. With mean absolute errors (MAEs) of 0.69, 0.58, and 0.11 on the respective datasets, the model was also highly accurate in estimating respiration rates. Regression and Bland-Altman plots were used to compare the model's predictions against the ground truth. PPG2RespNet can thus obtain high-quality respiratory signals non-invasively, making it a valuable tool for calculating respiration rates.
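To make the "layered skip connections" idea concrete, the sketch below shows a minimal 1D UNet++-style encoder-decoder for signal-to-signal regression (PPG in, respiratory signal out). This is an illustrative assumption only: the framework (PyTorch), depth, channel widths, and layer names are not taken from the paper and do not reproduce PPG2RespNet's actual configuration or its modified bottleneck.

```python
# Hedged sketch: a tiny 1D nested (UNet++-style) encoder-decoder with dense
# skip connections, illustrating the general architecture family the abstract
# describes. All sizes and names are illustrative assumptions, not PPG2RespNet.
import torch
import torch.nn as nn


def conv_block(in_ch: int, out_ch: int) -> nn.Sequential:
    """Two 1D convolutions, each followed by batch norm and ReLU."""
    return nn.Sequential(
        nn.Conv1d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm1d(out_ch),
        nn.ReLU(inplace=True),
        nn.Conv1d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm1d(out_ch),
        nn.ReLU(inplace=True),
    )


class TinyNestedUNet1D(nn.Module):
    """Two-level nested U-Net for PPG-to-respiration signal synthesis (sketch)."""

    def __init__(self, channels: int = 16):
        super().__init__()
        c = channels
        self.pool = nn.MaxPool1d(2)
        self.up = nn.Upsample(scale_factor=2, mode="linear", align_corners=False)

        # Encoder column x_{i,0}; the deepest node plays the bottleneck role here.
        self.x00 = conv_block(1, c)
        self.x10 = conv_block(c, 2 * c)
        self.x20 = conv_block(2 * c, 4 * c)

        # Nested decoder nodes: each concatenates every earlier node at its depth
        # with the upsampled node from one level below (dense skip connections).
        self.x01 = conv_block(c + 2 * c, c)
        self.x11 = conv_block(2 * c + 4 * c, 2 * c)
        self.x02 = conv_block(c + c + 2 * c, c)

        self.head = nn.Conv1d(c, 1, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x00 = self.x00(x)
        x10 = self.x10(self.pool(x00))
        x20 = self.x20(self.pool(x10))

        x01 = self.x01(torch.cat([x00, self.up(x10)], dim=1))
        x11 = self.x11(torch.cat([x10, self.up(x20)], dim=1))
        x02 = self.x02(torch.cat([x00, x01, self.up(x11)], dim=1))
        return self.head(x02)


if __name__ == "__main__":
    model = TinyNestedUNet1D()
    ppg = torch.randn(2, 1, 1024)   # batch of 1024-sample PPG windows
    resp = model(ppg)               # reconstructed respiratory waveform
    print(resp.shape)               # torch.Size([2, 1, 1024])
```

The dense skips let every decoder node see all earlier nodes at the same depth, which is the mechanism the abstract credits for robust signal extraction; the paper's additional bottleneck modification is not attempted here.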
