In recent years, feature selection methods based on sparse regression have attracted considerable attention, and selecting the most representative features is the key challenge. In this paper, an unsupervised feature selection method based on redundancy learning and sparse regression (RSUFS) is proposed. First, to make the model robust to outliers, the l2,1-norm regression model is used as the loss function to learn the feature weight matrix. Second, to select exactly the top k features, an l2,0-norm constraint is introduced. At the same time, the cosine similarity between features is taken into account, so that more valuable features are selected by reducing inter-feature redundancy. Finally, an efficient algorithm based on the Augmented Lagrangian method is derived to solve the resulting optimization problem. Comparative experiments on benchmark datasets against seven well-known unsupervised feature selection algorithms show that the proposed algorithm is effective.
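As an illustration only (not the paper's actual solver), the two quantities named above can be sketched in a few lines: the l2,1-norm of a feature weight matrix, and the cosine similarity between feature columns used to measure redundancy. The function names and the assumption that samples are rows of X are hypothetical choices for this sketch.

```python
import numpy as np

def l21_norm(W):
    # l2,1-norm: sum of the Euclidean (l2) norms of the rows of W.
    # Used as a row-sparsity-inducing loss/regularizer in sparse regression.
    return float(np.sum(np.linalg.norm(W, axis=1)))

def feature_cosine_similarity(X):
    # X: (n_samples, n_features), one feature per column.
    # Returns a (d, d) matrix S with S[i, j] = cosine similarity between
    # feature i and feature j; large off-diagonal values indicate redundancy.
    norms = np.linalg.norm(X, axis=0, keepdims=True)
    Xn = X / (norms + 1e-12)  # guard against zero-variance columns
    return Xn.T @ Xn
```

A redundancy-aware selector would then penalize choosing feature pairs whose similarity S[i, j] is high, keeping only one representative per group of correlated features.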