We study the problem of low-rank and sparse decomposition from possibly noisy observations, known as Robust PCA, where the sparse component can be viewed as outliers. We first propose a modified objective function in which the nuclear norm captures the low-rank term, the $\ell_0$-“norm” addresses the sparse outlier term, and an $\ell_1$-norm deals with the additive noise term. The associated algorithm, termed sparsity regularized principal component pursuit (SRPCP), is shown to converge. Under certain model and algorithm parameter settings, SRPCP is shown to recover the low-rank and sparse components exactly in the noiseless case. In the noisy case, we first prove that the widely used principal component pursuit (PCP) method, although designed for the noiseless case, is in fact stable to dense noise. We then show that SRPCP enjoys a smaller estimation error bound and can identify the outlier entries without any false alarm. An important byproduct of our analysis is that PCP with missing entries is also stable to dense noise. We further propose another objective function that replaces the nuclear norm above with the log-determinant. The associated algorithm, termed iterative reweighted sparsity regularized principal component pursuit, is also shown to converge; in each iteration, it solves a weighted nuclear norm regularized robust matrix completion problem, for which we propose an alternating direction method of multipliers (ADMM) algorithm that also converges. Empirical studies demonstrate the efficacy of the proposed $\ell_0$-$\ell_1$ regularization framework in dealing with outliers, as well as its advantage over existing state-of-the-art methods.
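For concreteness, given observations $M = L_0 + S_0 + N_0$ with a low-rank component $L_0$, sparse outliers $S_0$, and dense noise $N_0$, the first objective described above can be sketched as

$$\min_{L,\,S}\; \|L\|_{*} \;+\; \lambda \,\|S\|_{0} \;+\; \mu\, \|M - L - S\|_{1},$$

where the exact placement of the trade-off parameters $\lambda$ and $\mu$ is an illustrative assumption rather than the paper's precise formulation. In a scheme of this kind, the $\ell_0$ term admits a closed-form proximal operator, entrywise hard thresholding, $\operatorname{prox}_{\lambda\|\cdot\|_{0}}(x) = x\,\mathbf{1}\{|x| > \sqrt{2\lambda}\}$, while the nuclear norm term is handled by singular value thresholding, which soft-thresholds the singular values.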