Concept factorization (CF) is a powerful tool for subspace learning. Recently, graph-based CF and local coordinate CF have been proposed to exploit the intrinsic geometric structure of the data, and have proven quite successful in improving performance. However, these methods have limited robustness and can be sensitive to noise and disturbances in practical applications. In this paper, we propose a novel robust sparse CF framework (RSCF) for subspace learning. Specifically, we present a robust loss function that effectively eliminates the impact of large outliers. The local coordinate constraint and the graph regularization term are incorporated into RSCF to simultaneously guarantee the sparsity of the coefficient matrix and preserve the local structure of the data. We prove that the local coordinate constraint implies the orthogonality of the coefficient matrix. Using the half-quadratic optimization technique, we transform the objective function of RSCF into a quadratic form, and we derive the iterative updating rules together with a convergence analysis. Extensive experiments demonstrate the robustness and superiority of the proposed RSCF over state-of-the-art CF methods.
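To make the pipeline concrete, the following is a minimal illustrative sketch, not the paper's actual algorithm: it combines the classic CF factorization X ≈ XWVᵀ (with the standard multiplicative updates of Xu and Gong) with a half-quadratic reweighting step that down-weights large-residual samples. The Welsch-style weight p_i = exp(−r_i²/2σ²) with a self-tuned σ², and the function name `robust_cf`, are assumptions for illustration; the sparsity (local coordinate) and graph regularization terms of RSCF are omitted here.

```python
import numpy as np

def robust_cf(X, k, n_iter=200, seed=0):
    """Illustrative sketch (not the paper's RSCF): robust CF, X ~= X W V^T.

    Each iteration computes Welsch-style half-quadratic weights p_i from the
    per-sample residuals (down-weighting outliers), then applies multiplicative
    updates derived from the weighted objective sum_i p_i ||x_i - X W v_i||^2,
    with K = X^T X and P = diag(p):
        W <- W * (K P V) / (K W V^T P V)
        V <- V * (K W)   / (V W^T K W)      # P cancels in V's update
    """
    rng = np.random.default_rng(seed)
    n = X.shape[1]                 # samples are columns of X
    W = rng.random((n, k))         # nonnegative basis coefficients
    V = rng.random((n, k))         # nonnegative representation coefficients
    K = X.T @ X                    # sample kernel matrix
    eps = 1e-10                    # guards against division by zero
    for _ in range(n_iter):
        # Per-sample squared residuals and self-tuned Welsch weights
        # (sigma^2 set to the mean residual, an assumption for this sketch).
        R = X - X @ W @ V.T
        r2 = np.sum(R * R, axis=0)
        p = np.exp(-r2 / (2.0 * r2.mean() + eps))
        pV = p[:, None] * V        # P V
        W *= (K @ pV) / (K @ W @ (V.T @ pV) + eps)
        V *= (K @ W) / (V @ (W.T @ K @ W) + eps)
    return W, V
```

Because P is diagonal and positive, it cancels between numerator and denominator of V's update, so only W's update carries the robust weights in this formulation.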