Classical matrix completion methods are ineffective at recovering the missing entries of data drawn from multiple subspaces because such matrices are often of high rank. Recently, a few advanced matrix completion methods have been proposed to address this problem, but they do not scale to large matrices and big-data problems. This paper proposes a sparse factorization method for matrix completion on multiple-subspace data. The method factorizes the given incomplete matrix into a dense matrix and a sparse matrix while minimizing the factorization error on the observed entries. To solve the optimization problem, an accelerated proximal alternating linearized minimization (APALM) algorithm is proposed. The convergence of APALM is proved, a non-trivial task owing to the alternation, linearization, nonconvexity, and extrapolation. APALM can solve a large class of optimization problems, such as matrix factorization with nonsmooth regularizations. In addition, we show that, to recover an $m\times n$ matrix consisting of data drawn from $k$ subspaces of dimension $r_0$, the number of observed entries required by our matrix completion method is $O(nr_0\,\log k\,\log n)$, whereas that of conventional methods is $O(nr_0k\,\log n)$, which theoretically establishes the superiority of our method on multiple-subspace data and high-rank matrices. The proposed matrix completion method is compared with the state of the art on synthetic data and real collaborative filtering problems. The experimental results corroborate that the proposed method handles large matrices efficiently and provides high recovery accuracy.
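The dense-times-sparse factorization model described above can be sketched as follows. This is a minimal, non-accelerated PALM-style illustration (plain alternating proximal-gradient steps, without the extrapolation that makes the paper's APALM accelerated); the rank `d`, the $\ell_1$ weight `lam`, the iteration count, and the use of spectral-norm-based step sizes are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def soft_threshold(X, t):
    # Proximal operator of t * ||.||_1: elementwise shrinkage toward zero.
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def sparse_factor_complete(M_obs, mask, d=5, lam=0.01, iters=2000, seed=0):
    """Sketch: minimize 0.5 * ||mask * (U @ V - M_obs)||_F^2 + lam * ||V||_1
    over a dense factor U (m x d) and a sparse factor V (d x n), by
    alternating a gradient step on U with a proximal-gradient
    (soft-thresholding) step on V. Step sizes are 1/L with L an upper
    bound on the Lipschitz constant of the partial gradient, as in PALM.
    No extrapolation is used, so this is NOT the paper's APALM."""
    rng = np.random.default_rng(seed)
    m, n = M_obs.shape
    U = 0.1 * rng.standard_normal((m, d))
    V = 0.1 * rng.standard_normal((d, n))
    for _ in range(iters):
        # Gradient step on the dense factor U.
        tU = 1.0 / max(np.linalg.norm(V, 2) ** 2, 1e-8)
        R = mask * (U @ V - M_obs)          # residual on observed entries only
        U = U - tU * (R @ V.T)
        # Proximal-gradient step on the sparse factor V.
        tV = 1.0 / max(np.linalg.norm(U, 2) ** 2, 1e-8)
        R = mask * (U @ V - M_obs)
        V = soft_threshold(V - tV * (U.T @ R), tV * lam)
    return U, V
```

For example, masking out 30% of the entries of a random rank-3 matrix and running the sketch recovers the observed entries to small residual, since the rank budget `d` exceeds the true rank.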