Abstract

Linear prediction is extensively used in the modeling, compression, coding, and generation of speech signals. Various formulations of linear prediction exist, in both the time and frequency domains, which start from different assumptions but lead to the same solution. In this letter, we propose a novel, generalized formulation of optimal low-order linear prediction based on fractional (non-integer) derivatives. The proposed fractional-derivative formulation allows the definition of a predictor with versatile behavior governed by the order of the fractional derivative. We derive closed-form expressions for the optimal fractional linear predictor with restricted memory, and prove that the optimal first-order and optimal second-order linear predictors are special cases of it. Furthermore, we show empirically that the optimal order of the fractional derivative can be approximated by the inverse of the predictor memory, and is therefore known a priori. As a result, complexity is reduced because only one predictor coefficient needs to be optimized and transmitted, i.e., one parameter fewer than for the second-order linear predictor, at the same level of performance.
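
The letter derives the closed-form predictor itself; purely as a rough illustration of the idea, the sketch below builds a one-coefficient predictor from a truncated Grünwald–Letnikov fractional difference and sets the fractional order to the inverse of the predictor memory, following the empirical finding stated above. The exact predictor form, the toy signal, the coefficient search range, and the helper names (`gl_weights`, `fractional_predict`) are illustrative assumptions, not the formulation derived in the letter.

```python
import numpy as np

def gl_weights(alpha, memory):
    """Truncated Grunwald-Letnikov weights w_j = (-1)^j * binom(alpha, j),
    computed with the standard recursion w_j = w_{j-1} * (j - 1 - alpha) / j."""
    w = np.empty(memory + 1)
    w[0] = 1.0
    for j in range(1, memory + 1):
        w[j] = w[j - 1] * (j - 1 - alpha) / j
    return w

def fractional_predict(x, coeff, alpha, memory):
    """Illustrative one-coefficient fractional predictor (an assumption, not the
    letter's exact form): x_hat[n] = coeff * sum_j w_j * x[n - 1 - j], j = 0..memory.
    With memory = 0 it collapses to the first-order predictor coeff * x[n - 1]."""
    w = gl_weights(alpha, memory)
    x_hat = np.zeros_like(x)
    for n in range(memory + 1, len(x)):
        past = x[n - 1 - np.arange(memory + 1)]  # x[n-1], x[n-2], ..., x[n-1-memory]
        x_hat[n] = coeff * np.dot(w, past)
    return x_hat

# Toy usage: an AR(1)-like signal, memory M = 2, and alpha = 1/M, i.e. the
# fractional order approximated by the inverse of the predictor memory.
rng = np.random.default_rng(0)
x = np.zeros(2000)
for n in range(1, len(x)):
    x[n] = 0.9 * x[n - 1] + 0.1 * rng.standard_normal()

memory = 2
alpha = 1.0 / memory
candidates = np.linspace(0.5, 1.5, 201)  # brute-force search over the single coefficient
errors = [np.mean((x[memory + 1:] -
                   fractional_predict(x, c, alpha, memory)[memory + 1:]) ** 2)
          for c in candidates]
best = int(np.argmin(errors))
print(f"coefficient = {candidates[best]:.3f}, prediction MSE = {errors[best]:.5f}")
```

Only the single coefficient is searched here; the fractional order is fixed in advance, which is exactly the parameter saving the abstract claims over a second-order predictor.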
