Abstract

A method for Bayesian factor analysis (FA) of large matrices is proposed. It is assumed that a small number of matrix elements are initially observed, and the statistical FA model is employed to actively and sequentially select which new matrix entries would be most informative, in order to estimate the remaining missing entries, i.e., complete the matrix. The model inference and active learning are performed within an online variational Bayes (VB) framework. A fast and provably near-optimal greedy algorithm is used to sequentially maximize the mutual information contribution from new observations, taking advantage of submodularity properties. Additionally, a simple alternative procedure is proposed, in which the posterior parameters learned by the Bayesian approach are used directly. This alternative procedure is shown to incur slightly higher prediction error, but it requires far fewer computational resources. The methods are demonstrated on a very large matrix factorization problem, namely the Yahoo! Music ratings dataset.
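The following is a minimal sketch (not the authors' implementation) of the greedy, information-driven entry selection the abstract describes, assuming a mean-field Gaussian posterior over row and column factors. The toy sizes, the variable names (`U_mean`, `V_var`, `noise_var`), and the rank-K posterior construction are illustrative assumptions; for a Gaussian likelihood, the per-entry mutual information reduces to a log-ratio of predictive to noise variance, which is what the sketch maximizes.

```python
# Illustrative sketch of greedy active entry selection for a Gaussian
# matrix-factorization model.  All quantities below are hypothetical
# placeholders, not objects defined in the paper.

import numpy as np

rng = np.random.default_rng(0)

# Toy problem: an I x J ratings matrix with latent dimension K.
I, J, K = 30, 40, 5
noise_var = 0.5

# Hypothetical variational posteriors over row factors u_i and column
# factors v_j: means plus diagonal covariances (mean-field assumption).
U_mean = rng.normal(size=(I, K))
V_mean = rng.normal(size=(J, K))
U_var = np.full((I, K), 0.3)
V_var = np.full((J, K), 0.3)

# A small set of initially observed entries.
observed = np.zeros((I, J), dtype=bool)
observed[rng.integers(0, I, 100), rng.integers(0, J, 100)] = True


def predictive_variance(i, j):
    """Variance of the prediction for entry (i, j) under the factorized
    Gaussian posterior: Var[u_i^T v_j] plus the noise variance."""
    m_u, m_v = U_mean[i], V_mean[j]
    s_u, s_v = U_var[i], V_var[j]
    var = np.sum(s_u * s_v + s_u * m_v**2 + s_v * m_u**2)
    return var + noise_var


def entry_information_gain(i, j):
    """Mutual information between observing y_ij and the latent factors
    under a Gaussian likelihood: 0.5 * log(predictive_var / noise_var)."""
    return 0.5 * np.log(predictive_variance(i, j) / noise_var)


def greedy_select(budget):
    """Greedily pick `budget` unobserved entries with the largest
    information gain.  A faithful implementation would re-run the online
    VB update after each pick so that later gains are conditioned on
    earlier selections; that step is omitted here for brevity."""
    candidates = {(i, j) for i in range(I) for j in range(J) if not observed[i, j]}
    picks = []
    for _ in range(budget):
        best = max(candidates, key=lambda ij: entry_information_gain(*ij))
        picks.append(best)
        candidates.remove(best)
    return picks


print(greedy_select(budget=10))
```

Because mutual information is submodular in the set of selected entries, this kind of greedy selection enjoys the usual (1 - 1/e) near-optimality guarantee, which is the property the abstract appeals to.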
