Graph total variation (GTV) is a widely employed regularization in graph-based semi-supervised learning (GSSL), which enforces piecewise smoothness of the label values with respect to the underlying graph structure. However, because it is an inherently biased estimator, GTV regularization tends to underestimate the label values on boundary nodes. In this paper, we propose a novel GSSL model with non-convex GTV regularization. Specifically, we construct the non-convex GTV regularization by subtracting a generalized Moreau envelope from the original GTV regularization term, which reduces the estimation bias and thus yields more accurate estimates of the label values on boundary nodes. We demonstrate that, under certain conditions on the non-convexity control parameter, the proposed GSSL model remains globally convex, thereby providing a theoretical guarantee for algorithm convergence. Additionally, we present an efficient alternating direction method of multipliers (ADMM) algorithm to solve the proposed model. Finally, we validate the effectiveness of the proposed model on both synthetic data and a background-subtraction task. The proposed GSSL model outperforms the state-of-the-art model with a 17.97% improvement in SNR on synthetic data, and increases average recall, average precision, and average F-measure by 4.92%, 1.19%, and 1.57%, respectively, on the pan-tilt-zoom (PTZ) challenge of background subtraction.
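The Moreau-envelope subtraction described above can be sketched in the style of the generalized minimax-concave (GMC) penalty; the symbols below (the graph difference operator $D$, the tuning matrix $B$, and the parameters $\lambda$, $\gamma$) are illustrative assumptions rather than the paper's exact notation:
\[
\psi_B(Dx) \;=\; \|Dx\|_1 \;-\; \min_{v}\left\{ \|v\|_1 + \tfrac{1}{2}\|B(Dx - v)\|_2^2 \right\},
\]
where $\|Dx\|_1$ is the usual GTV term and the subtracted minimum is the generalized Moreau envelope. In GMC-type constructions, global convexity of an objective of the form $\tfrac{1}{2}\|y - x\|_2^2 + \lambda\,\psi_B(Dx)$ is typically guaranteed when $B$ is chosen so that $\lambda\, D^{\!\top} B^{\!\top} B D \preceq \gamma I$ for some $\gamma \in [0,1)$, which is presumably the kind of condition on the non-convexity parameter referred to in the abstract.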