In this paper, we introduce a novel approach that improves the accuracy and convergence behavior of the Self-Organizing Map (SOM) by incorporating a reweighted zero-attracting term into its loss function. We consider two baseline variants: the conventional SOM and the robust adaptive SOM (RASOM). Their enhanced counterparts, the reweighted zero-attracting SOM (RZA-SOM) and the reweighted zero-attracting RASOM (RZA-RASOM), add an ℓ1-norm penalty to the error function, which introduces a zero-attractor term into the weight update and improves the adjustment of the weight coefficients while preserving the map topology. We analyzed the convergence speed and misadjustment of the models under the assumption that the true coefficient matrix is sparse, and tested their robustness as the number of non-zero taps increased. On six datasets, we compared RZA-SOM and RZA-RASOM against the conventional SOM and RASOM in terms of accuracy, quantization error, and topology preservation. The experimental results consistently showed that RZA-SOM and RZA-RASOM outperformed the conventional SOM and RASOM.
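To make the zero-attractor idea concrete, the following is a minimal sketch of one RZA-SOM training step, assuming a Gaussian neighborhood with exponentially decaying learning rate and radius, and an RZA-LMS-style attractor of the form ρ·sgn(w)/(1 + ε|w|); the function name and the parameters eta0, sigma0, tau, rho, and eps are illustrative choices, not values taken from the paper.

```python
import numpy as np

def rza_som_step(weights, x, t, grid, eta0=0.5, sigma0=2.0,
                 tau=1000.0, rho=1e-4, eps=10.0):
    """One RZA-SOM update: a conventional SOM step plus a reweighted
    zero-attractor term derived from an l1 penalty on the weights.

    weights: (n_units, n_features) codebook matrix
    x:       (n_features,) input sample
    t:       current iteration index
    grid:    (n_units, 2) unit coordinates on the map lattice
    """
    # Best-matching unit (BMU): the unit whose weight vector is closest to x.
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))

    # Exponentially decaying learning rate and neighborhood radius
    # (assumed schedules; the paper may use different ones).
    eta = eta0 * np.exp(-t / tau)
    sigma = sigma0 * np.exp(-t / tau)

    # Gaussian neighborhood function over lattice distance to the BMU.
    d2 = np.sum((grid - grid[bmu]) ** 2, axis=1)
    h = np.exp(-d2 / (2.0 * sigma ** 2))[:, None]

    # Conventional SOM term: pull neighborhood weights toward the input.
    weights = weights + eta * h * (x - weights)

    # Reweighted zero-attractor: strongly shrinks near-zero coefficients
    # toward zero while barely perturbing large ones, via 1/(1 + eps*|w|).
    weights = weights - rho * np.sign(weights) / (1.0 + eps * np.abs(weights))
    return weights, bmu
```

The reweighting factor 1/(1 + ε|w|) is what distinguishes the reweighted variant from a plain zero-attractor: the shrinkage is concentrated on coefficients near zero, which is consistent with the sparsity assumption on the true coefficient matrix and leaves large, informative coefficients essentially untouched.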