Abstract

In this paper, we investigate the convergence theory of large margin unified machines (LUMs) under non-i.i.d. sampling. We decompose the total error into sample error, regularization error, and drift error; the drift error arises from the non-identical sampling. Under suitable mixing conditions, sequences of independent blocks are constructed so that the analysis of the dependent sample sequence reduces to the analysis of independent blocks. To further handle the non-identical sampling, we assume polynomial convergence of the marginal distributions. A novel projection operator is introduced to overcome the technical difficulty caused by the unboundedness of the target function. Explicit learning rates are derived under mild conditions on the approximation ability and capacity of the reproducing kernel Hilbert space.
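
A minimal sketch of the setup and decomposition described above, assuming the standard regularized kernel scheme; the symbols below ($V$, $f_{\mathbf z}$, $f_\rho$, $\pi$, $\mathcal{H}_K$, $\lambda$, $T$) are illustrative conventions and need not match the paper's own notation:

\[
f_{\mathbf z} \;=\; \arg\min_{f \in \mathcal{H}_K} \Big\{ \frac{1}{T} \sum_{t=1}^{T} V\big(y_t f(x_t)\big) \;+\; \lambda \|f\|_K^2 \Big\},
\]

where $V$ is the LUM loss and the sample $\mathbf z = \{(x_t, y_t)\}_{t=1}^{T}$ is neither independent nor identically distributed. The total error of the truncated estimator $\pi(f_{\mathbf z})$ then splits as

\[
\mathcal{E}\big(\pi(f_{\mathbf z})\big) - \mathcal{E}(f_\rho)
\;\le\;
\underbrace{\mathcal{S}(\mathbf z, \lambda)}_{\text{sample error}}
\;+\;
\underbrace{\mathcal{D}(\lambda)}_{\text{regularization error}}
\;+\;
\underbrace{\Delta(T)}_{\text{drift error}},
\]

with $\mathcal{D}(\lambda) = \inf_{f \in \mathcal{H}_K}\big\{ \mathcal{E}(f) - \mathcal{E}(f_\rho) + \lambda \|f\|_K^2 \big\}$. In this sketch, the sample error is controlled via the independent-block construction under the mixing conditions, while the drift error collects the gap between the time-varying marginal distributions and their limit, bounded through the assumed polynomial convergence.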
