Abstract

Autoregressive processes play a major role in speech processing (linear prediction), seismic signal processing, biological signal processing, and many other applications. We consider a quantity defined by Shannon in 1948, the entropy rate power, and show that the log ratio of entropy powers equals the difference between the differential entropies of the two processes. Furthermore, we use the log ratio of entropy powers to analyze the change in mutual information as the model order is increased for autoregressive processes. We examine when the minimum mean squared prediction error can be substituted for the entropy power in the log ratio of entropy powers, which greatly simplifies the calculation of the differential entropy and the change in mutual information and therefore increases the utility of the approach. Applications to speech processing and coding are given, and potential applications to seismic signal processing, EEG classification, and ECG classification are described.
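As an illustration of the central identity (a sketch of our own, not code from the paper; the function names are ours), the following verifies numerically that half the log ratio of entropy powers equals the difference in differential entropies. Two Gaussian sources are used because, for a Gaussian, the entropy power reduces to the variance.

```python
import math

def gaussian_entropy(sigma2):
    """Differential entropy (in nats) of a Gaussian with variance sigma2."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma2)

def entropy_power(h):
    """Entropy power Q = exp(2h) / (2*pi*e) for differential entropy h."""
    return math.exp(2 * h) / (2 * math.pi * math.e)

# Two Gaussian sources with different variances.
h1, h2 = gaussian_entropy(4.0), gaussian_entropy(1.0)
Q1, Q2 = entropy_power(h1), entropy_power(h2)

# Half the log ratio of entropy powers equals the difference in
# differential entropies: (1/2) ln(Q1/Q2) = h1 - h2.
print(0.5 * math.log(Q1 / Q2))  # equals ln(2) here
print(h1 - h2)                  # also ln(2)
```

For Gaussians the identity is exact with Q equal to the variance; the paper's point is that the same log-ratio relation holds for the entropy powers of arbitrary distributions.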

Highlights

  • In time series analysis, the autoregressive (AR) model, also called the linear prediction model, has received considerable attention, with a host of results on fitting AR models, AR model prediction performance, and decision-making using time series analysis based on AR models [1,2,3]

  • We suggest that the log ratio of entropy powers as the model order is increased, as presented in Section 5.3 of the current paper, can serve as an additional indicator of a suitable AR model order through Equations (33) and (34), with the entropy powers replaced by the mean squared prediction error (MSPE) as in Equation (40)

  • We present a new quantity, the log ratio of entropy powers, for investigating the changes in mutual information and differential entropy as the predictor order is incremented in autoregressive models, or for evaluating the overall change in differential entropy and mutual information for a selected AR model order

Introduction

The autoregressive (AR) model, also called the linear prediction model, has received considerable attention, with a host of results on fitting AR models, AR model prediction performance, and decision-making using time series analysis based on AR models [1,2,3]. We consider the applications of the AR model in geophysical exploration [7,8], electrocardiogram (ECG) classification [9], and electroencephalogram (EEG) classification [10,11], and set up how the log ratio of entropy powers can provide new insights and results for these fields. For each of these applications, the interpretation of the change in mean squared prediction error for different predictor orders in terms of changes in differential entropy and changes in mutual information opens up new analysis and classification paradigms. In his landmark 1948 paper [12], Shannon defined the entropy power (also called the entropy rate power) to be the power of a Gaussian white noise limited to the same band as the original ensemble and having the same entropy. Later, we develop a closely related result for any distribution using this definition of entropy power.

Autoregressive Models
The Power Spectral Density
The Levinson-Durbin Recursion
Minimum MSPE and AR Models
Log Ratio of Entropy Powers
Gaussian Distributions
Laplacian Distributions
Increasing Predictor Order
Maximum Entropy Spectral Estimate
Orthogonal Decompositions and Whitened Prediction Errors
Experimental Examples
Application to Speech Coding
Speech Waveform Coding
Code-Excited Linear Prediction
Other Possible Applications
ECG Classification
EEG Classification
Geophysical Exploration
Conclusions
