Abstract

The aim of electrocardiogram (ECG) compression is to reduce the amount of data as much as possible while preserving the information that is significant for diagnosis. Objective metrics derived directly from the signal are suitable for controlling the quality of compressed ECGs in practical applications. Many approaches have employed figures of merit based on the percentage root mean square difference (PRD) for this purpose. The benefits and drawbacks of the PRD measures, along with other metrics for quality assessment in ECG compression, are analysed in this work. We propose the use of the root mean square error (RMSE) for quality control because it provides a clearer and more stable indication of how much the retrieved ECG waveform, which is the reference signal for establishing diagnosis, deviates from the original. For this reason, the RMSE is applied here as the target metric in a thresholding algorithm that relies on the retained energy. A state-of-the-art compressor based on this approach, and its PRD-based counterpart, are implemented to test the actual capabilities of the proposed technique. Both compression schemes are employed in several experiments over the whole MIT–BIH Arrhythmia Database to assess both global and local signal distortion. The results show that, using the RMSE for quality control, the distortion of the reconstructed signal is better controlled without reducing the compression ratio.
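The two metrics contrasted in the abstract have standard definitions that can be stated directly. The following sketch computes them for a pair of original and reconstructed signals; the function names and the use of the raw-signal energy in the PRD denominator (rather than a mean-removed variant such as PRDN) are illustrative assumptions, not taken from the paper.

```python
import math

def rmse(original, reconstructed):
    # Root mean square error: average deviation of the reconstructed
    # waveform from the original, expressed in the signal's own units.
    n = len(original)
    return math.sqrt(sum((o - r) ** 2 for o, r in zip(original, reconstructed)) / n)

def prd(original, reconstructed):
    # Percentage root mean square difference: reconstruction error energy
    # relative to the energy of the original signal, in percent.
    # (Illustrative form; some variants subtract the signal mean first.)
    num = sum((o - r) ** 2 for o, r in zip(original, reconstructed))
    den = sum(o ** 2 for o in original)
    return 100.0 * math.sqrt(num / den)
```

Note that the PRD is normalised by the energy of the original signal, so its value shifts with baseline level and amplitude even when the absolute waveform error is unchanged, whereas the RMSE reports that error directly. This is consistent with the abstract's argument that the RMSE gives a more stable handle for quality control.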
