Abstract

Contact methods for measuring respiration rate (RR) can cause discomfort to the person being measured, yet RR is a vital sign that must be monitored in clinical medicine. To overcome the limitations of contact measurement, a non-contact RR measurement method based on an infrared thermal camera is proposed. The method exploits the phenomenon that human breathing causes periodic temperature changes around the nostrils. First, an infrared thermal camera is used to collect image sequences of the human face. Then, the You Only Look Once v3 (YOLOv3) deep learning method is used to track the nostril region of interest (ROI) across the recorded facial image sequences, and the performance of the deep learning model is analyzed through experiments. The average temperature of the tracked nostril ROI is computed for each frame, yielding a temperature time series whose variation reflects the respiratory process. A Butterworth low-pass filter is applied to this series to improve the signal-to-noise ratio and obtain a robust respiratory signal. Finally, RR is calculated by both time-domain and frequency-domain methods under different conditions: normal respiration, fast respiration, and slow respiration. After training, the model reaches a training accuracy of 97.9%. Experimental results show that the method achieves high precision, with a maximum error of no more than 2%. These results indicate that the proposed method can effectively measure RR, so the proposed non-contact RR measurement method can serve as a useful reference for clinical RR measurement and other applications.
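The signal-processing stage described above (Butterworth low-pass filtering of the ROI temperature trace, then frequency-domain RR estimation) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the frame rate, cutoff frequency, filter order, and respiration band are assumed values chosen for the example.

```python
# Sketch of the abstract's signal-processing stage: low-pass filter a
# per-frame mean-temperature series, then estimate respiration rate from
# the dominant spectral peak. All parameter values are illustrative
# assumptions, not taken from the paper.
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_rr(temps, fs=30.0, cutoff_hz=1.0, order=4):
    """Estimate respiration rate (breaths/min) from a nostril-ROI
    mean-temperature series sampled at fs frames per second."""
    temps = np.asarray(temps, dtype=float)
    temps = temps - temps.mean()                  # remove DC offset
    b, a = butter(order, cutoff_hz / (fs / 2))    # low-pass Butterworth
    filtered = filtfilt(b, a, temps)              # zero-phase filtering
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fs)
    # Search only a plausible respiration band (~0.1-0.7 Hz,
    # i.e. roughly 6-42 breaths/min) for the dominant peak.
    band = (freqs >= 0.1) & (freqs <= 0.7)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

# Synthetic check: a 0.25 Hz temperature oscillation with additive noise
# should be recovered as approximately 15 breaths/min.
np.random.seed(0)
t = np.arange(0, 60, 1 / 30.0)
trace = 0.3 * np.sin(2 * np.pi * 0.25 * t) + 0.05 * np.random.randn(len(t))
rr = estimate_rr(trace)
```

A time-domain estimate, as also mentioned in the abstract, could instead count zero crossings or peaks of the filtered signal over a known time window.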
