This paper addresses the problem of efficiently condensing the basis vectors used in the decision function of support vector machines (SVMs). Most existing methods learn the basis vectors and the corresponding coefficients by maximizing the margin between classes. They ignore the key fact that condensing the solution of an SVM is equivalent to constructing the SVM model in the transformed space determined by the basis vectors; consequently, the radius-margin bound, which is related to the generalization ability of the SVM, changes with those vectors. In this paper, we propose a novel method, called sparse support vector machine guided by the radius-margin bound (RMB-SSVM), to learn a condensed solution for the SVM. The key characteristic of RMB-SSVM is that it employs a criterion that jointly exploits the margin and the radius of the minimum hypersphere enclosing the data to guide both the selection of the basis vectors and the learning of the corresponding coefficients. The learning criterion of RMB-SSVM is thus directly related to the generalization ability of the SVM and can therefore yield better performance. We develop an effective method to solve the proposed optimization model and propose a heuristic scheme to select the basis vectors used in the decision function. Furthermore, we explore how to approximate the radius with the maximum pairwise distance over pairs of data points; this simplifies the proposed model and reduces the training time without drastically impairing performance. Finally, we conduct comprehensive experiments that demonstrate the effectiveness and superiority of the proposed methods in comparison with their counterparts.
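To make the radius approximation concrete: the abstract proposes replacing the radius of the minimum enclosing hypersphere in feature space with a quantity derived from the maximum pairwise distance. The sketch below is a minimal illustration of that idea, not the paper's implementation; the RBF kernel, the function names, and the factor of one half (half the feature-space diameter, which lower-bounds the true radius) are our assumptions for illustration only.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def radius_from_max_pairwise_distance(K):
    """Approximate the radius of the minimum enclosing hypersphere
    in feature space by half the maximum pairwise distance.

    Feature-space distances follow from the kernel trick:
        ||phi(x_i) - phi(x_j)||^2 = K_ii + K_jj - 2 * K_ij.
    Half the diameter lower-bounds the true radius R, so this is a
    cheap O(n^2) surrogate for solving the enclosing-ball problem.
    """
    diag = np.diag(K)
    d2 = diag[:, None] + diag[None, :] - 2.0 * K
    return 0.5 * np.sqrt(np.max(np.maximum(d2, 0.0)))

# Usage: estimate the radius term that, together with the margin,
# enters a radius-margin style criterion.
X = np.random.RandomState(0).randn(100, 5)
K = rbf_kernel(X, gamma=0.5)
R_tilde = radius_from_max_pairwise_distance(K)
print(f"Approximate enclosing-sphere radius: {R_tilde:.3f}")
```

The appeal of this surrogate, as the abstract suggests, is that it requires only kernel evaluations over pairs of points rather than an auxiliary optimization for the minimum enclosing ball, which simplifies the overall model and shortens training.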