Abstract

In many practical applications of linear system identification with adaptive filters, performance is intrinsically limited by missing data, which may arise because input or output samples are lost at random time instants or because entire subsets of the input or output data are unavailable. This paper studies linear system identification when only the input data is missing at random time instants, while the output data is observed correctly at every instant. The missing-data process is modeled by expressing the available input sequence as the product of the actual (unknown) input sequence and an i.i.d. sequence of Bernoulli random variables with known mean. A new sequence, called the imputed data sequence, is constructed from the available data: at any time instant, the imputed value is the available sample at that instant if it is nonzero, and otherwise a constant factor times the imputed value at the previous instant. This imputed sequence is then used to propose an LMS-type algorithm called the Imputation-based missing data LMS (ImdLMS). A performance analysis of the algorithm is carried out in which the severe complications, and possible intractability, of an exact mean-square analysis are avoided by deriving an upper bound on the mean square deviation of the algorithm. Simulation results demonstrate the efficacy of the proposed ImdLMS algorithm, especially when the actual (unknown) input sequence is correlated.
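
As a concrete illustration of the imputation rule described above, the following is a minimal sketch in Python. The Bernoulli observation model and the "hold a scaled copy of the previous imputed value" rule follow the abstract; the AR(1) input, the decay factor gamma, the step size mu, and the plain LMS update on the imputed regressor are illustrative assumptions, not the paper's exact ImdLMS recursion or analysis setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative setup: unknown FIR system and a correlated input ---
M = 8                                   # adaptive filter length
w_true = rng.standard_normal(M)         # unknown system to be identified
N = 20000                               # number of samples
p = 0.7                                 # known mean of the Bernoulli observation process
gamma = 0.9                             # assumed imputation decay factor
mu = 0.01                               # assumed LMS step size

# Correlated AR(1) input sequence (the actual, unknown input)
x = np.zeros(N)
for n in range(1, N):
    x[n] = 0.8 * x[n - 1] + rng.standard_normal()

# Output observed correctly at all instants (system output plus noise)
d = np.convolve(x, w_true)[:N] + 0.01 * rng.standard_normal(N)

# Missing-data model: available input = actual input times an i.i.d. Bernoulli(p) sequence
mask = rng.random(N) < p
x_avail = x * mask

# --- Imputation: keep a nonzero available sample, otherwise decay the previous imputed value ---
x_imp = np.zeros(N)
for n in range(N):
    x_imp[n] = x_avail[n] if x_avail[n] != 0 else gamma * x_imp[n - 1]

# --- LMS-type adaptation driven by the imputed regressor (stand-in for ImdLMS) ---
w = np.zeros(M)
for n in range(M, N):
    u = x_imp[n - M + 1:n + 1][::-1]    # imputed regressor vector [x_imp[n], ..., x_imp[n-M+1]]
    e = d[n] - w @ u                    # a priori estimation error
    w = w + mu * e * u                  # plain LMS step on the imputed data

print("relative deviation:", np.linalg.norm(w - w_true) / np.linalg.norm(w_true))
```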
