Abstract

Hidden Markov models are a well-known probabilistic graphical model for time series of discrete, partially observable stochastic processes. We consider a method that extends hidden Markov models to non-Gaussian continuous distributions by embedding the a priori probability distribution of the state space into a reproducing kernel Hilbert space. We propose corresponding regularization techniques, namely Nyström subsampling and a general regularization family for the inversion of feature and kernel matrices, to reduce the tendency to overfit and the computational complexity of the algorithm. The method may be applied to various statistical inference and learning problems, including classification, prediction, identification, and segmentation, and, as an online algorithm, it may be used for dynamic data mining and data stream mining. We investigate, both theoretically and empirically, the regularization and approximation bounds of the discrete regularization method. Furthermore, we discuss applications of the method to real-world problems, comparing the approach to several state-of-the-art algorithms.
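To make the two regularization ingredients named above concrete, the following is a minimal sketch (not the authors' implementation) of a Nyström low-rank approximation of a kernel matrix followed by a Tikhonov-regularized inversion, one member of the general spectral regularization family. All names (rbf_kernel, nystrom_approx, regularized_inverse) and parameters (gamma, m, lam) are illustrative assumptions.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def nystrom_approx(X, m, gamma=1.0, rng=None):
    """Rank-m Nyström approximation K ~ C W^+ C^T from m sampled landmarks."""
    rng = np.random.default_rng(rng)
    idx = rng.choice(len(X), size=m, replace=False)
    C = rbf_kernel(X, X[idx], gamma)        # n x m cross-kernel block
    W = rbf_kernel(X[idx], X[idx], gamma)   # m x m landmark block
    return C @ np.linalg.pinv(W) @ C.T

def regularized_inverse(K, lam=1e-3):
    """Tikhonov-regularized inverse (K + lam*I)^{-1} of a kernel matrix."""
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n), np.eye(n))

if __name__ == "__main__":
    X = np.random.default_rng(0).normal(size=(200, 3))
    K_approx = nystrom_approx(X, m=30, gamma=0.5, rng=0)
    K_inv = regularized_inverse(K_approx, lam=1e-2)
    print(K_inv.shape)  # (200, 200)

In this sketch the Nyström step reduces the cost of forming the kernel matrix from O(n^2) kernel evaluations to O(nm) with m landmarks, and the added ridge term lam*I keeps the inversion well conditioned, which is the role the regularization plays in the method described in the abstract.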
