Semisupervised learning (SSL) has been widely used in numerous practical applications where labeled training examples are scarce while unlabeled examples are abundant. Because of this scarcity, the performance of existing SSL methods is often degraded by outliers in the labeled data, resulting in an imperfectly trained classifier. To enhance the robustness of SSL methods to outliers, this article proposes a novel SSL algorithm called Laplacian Welsch regularization (LapWR). Specifically, apart from the conventional Laplacian regularizer, we introduce a bounded, smooth, and nonconvex Welsch loss that suppresses the adverse effect of labeled outliers. To handle the model nonconvexity caused by the Welsch loss, an iterative half-quadratic (HQ) optimization algorithm is adopted, in which each subproblem admits a closed-form solution. To handle large datasets, we further propose an accelerated model that employs the Nyström method to reduce the computational complexity of LapWR. Theoretically, we derive the generalization bound of LapWR by analyzing its Rademacher complexity, which indicates that the proposed algorithm is theoretically guaranteed to achieve satisfactory performance. Experiments comparing LapWR with representative SSL algorithms on various benchmark and real-world datasets show that LapWR is robust to outliers and consistently achieves top-level results.
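
The abstract does not give the exact loss formulation, so the following is a minimal sketch of how a Welsch loss and the half-quadratic reweighting it induces typically look, using one common parameterization; the scale parameter sigma, the ridge term lam, and the plain least-squares data term are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def welsch_loss(r, sigma=1.0):
    """Welsch loss: bounded, smooth, and nonconvex.
    Behaves like r^2/2 near zero and saturates at sigma^2/2,
    which caps the influence of large (outlier) residuals."""
    return (sigma**2 / 2.0) * (1.0 - np.exp(-r**2 / sigma**2))

def hq_weights(r, sigma=1.0):
    """Half-quadratic auxiliary weights: large residuals (outliers)
    receive weights near 0 and are effectively suppressed."""
    return np.exp(-r**2 / sigma**2)

def hq_fit(X, y, sigma=1.0, lam=1e-2, n_iter=20):
    """Alternate between fixing the HQ weights and solving the
    resulting weighted ridge regression, which has a closed-form
    solution at each iteration."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        r = X @ w - y                      # residuals on labeled data
        a = hq_weights(r, sigma)           # HQ step 1: update weights
        W = np.diag(a)
        # HQ step 2: weighted least squares in closed form
        w = np.linalg.solve(X.T @ W @ X + lam * np.eye(d), X.T @ W @ y)
    return w
```

Because each weighted subproblem is quadratic, every HQ iteration only requires one linear solve, which mirrors the closed-form subproblems described above.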
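
Similarly, the Nyström acceleration can be illustrated with the standard low-rank kernel approximation; the RBF kernel, the uniform landmark sampling, and the landmark count m below are assumptions for illustration rather than the paper's specific construction:

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between row sets A and B."""
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def nystrom_approx(X, m=50, gamma=0.5, seed=None):
    """Nystrom approximation K ~ C @ pinv(W) @ C.T, where C holds
    kernel values between all n points and m sampled landmarks and
    W is the m x m kernel among the landmarks. This replaces the
    O(n^2) kernel matrix with O(n*m) storage and O(n*m^2) work."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)  # uniform landmarks
    C = rbf_kernel(X, X[idx], gamma)                 # n x m block
    W = C[idx]                                       # m x m landmark block
    return C, np.linalg.pinv(W)                      # K ~ C @ W_pinv @ C.T
```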