Abstract

Graph total variation (GTV) is a widely used regularizer in graph-based semi-supervised learning (GSSL); it enforces piecewise smoothness of the label values with respect to the underlying graph structure. However, because GTV is an inherently biased estimator, GTV regularization tends to underestimate label values on boundary nodes. In this paper, we propose a novel GSSL model with non-convex GTV regularization. Specifically, we construct the non-convex GTV regularizer by subtracting a generalized Moreau envelope from the original GTV term, which reduces the estimation bias and thus allows the label values of boundary nodes to be estimated effectively. We show that, under certain conditions on the non-convexity control parameter, the proposed GSSL model remains globally convex, providing a theoretical guarantee of algorithm convergence. We also present an efficient alternating direction method of multipliers (ADMM) to solve the proposed model. Finally, we validate the model's effectiveness on both synthetic data and background subtraction. The proposed GSSL model outperforms the state-of-the-art model with a 17.97% improvement in SNR on synthetic data, and increases average recall, average precision, and average F-measure by 4.92%, 1.19%, and 1.57%, respectively, on the pan-tilt-zoom (PTZ) challenge of background subtraction.
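To make the regularizer concrete, the following is a minimal illustrative sketch (not the paper's implementation) of the standard GTV term the abstract builds on: for a label vector f on a weighted undirected graph, GTV(f) = Σ_{(i,j)∈E} w_ij |f_i − f_j|. The paper's non-convex variant subtracts a generalized Moreau envelope from this quantity; that envelope is not reproduced here. The edge-list representation and function name are illustrative choices.

```python
def graph_total_variation(edges, f):
    """Standard (convex) graph total variation.

    edges: iterable of (i, j, w_ij) for an undirected weighted graph
    f:     sequence of label values indexed by node id
    """
    return sum(w * abs(f[i] - f[j]) for i, j, w in edges)

# Toy path graph 0-1-2 with unit weights; the label jump across edge (1, 2)
# is the kind of boundary that biased GTV shrinkage tends to underestimate.
edges = [(0, 1, 1.0), (1, 2, 1.0)]
f = [0.0, 0.0, 1.0]
print(graph_total_variation(edges, f))  # 1.0
```

In a GSSL objective, this term is weighted against a data-fidelity term on the labeled nodes; minimizing it encourages piecewise-constant labels that change only across a few edges.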
