With the rapid growth of label spaces and high-dimensional data, multi-label feature selection has attracted increasing attention. However, existing multi-label feature selection methods overlook two common issues: 1) the pseudo-label matrix is constructed directly from the logical label matrix; 2) label density is imbalanced in multi-label data. To tackle these issues, we propose a new method named Label relaxation and Shared information for Multi-label Feature Selection (LSMFS). Specifically, LSMFS combines the logical label matrix with a non-negative label relaxation matrix to fit a pseudo-label matrix, which is used to learn the correlations among class labels. LSMFS further uses the feature weight matrix to capture the shared information of related labels, which mitigates the influence of low-density labels on feature selection. These principles are formulated as the objective function of LSMFS, and an alternating iterative optimization algorithm is developed to solve it. Experiments on multi-label datasets from different domains demonstrate the effectiveness of the proposed LSMFS. Code is released at https://github.com/HQUF/LSMFS.
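To make the described pipeline concrete, the following is a minimal sketch of a generic label-relaxation scheme with alternating updates, not the exact LSMFS objective (which the abstract does not state). The ridge-style update for the feature weight matrix W, the element-wise non-negative update for the relaxation matrix M, and the names label_relaxation_fs, lam, and n_iter are illustrative assumptions.

```python
import numpy as np

# Sketch only: a generic label-relaxation + alternating-optimization loop,
# NOT the authors' exact LSMFS formulation. Regularizer and update rules
# below are common choices assumed for illustration.

def label_relaxation_fs(X, Y, lam=1.0, n_iter=20):
    """X: (n, d) feature matrix; Y: (n, c) logical {0,1} label matrix.
    Returns a feature ranking derived from the row norms of W."""
    n, d = X.shape
    B = 2.0 * Y - 1.0                   # relaxation directions: +1 for relevant labels, -1 otherwise
    M = np.zeros_like(Y, dtype=float)   # non-negative label relaxation matrix
    XtX = X.T @ X + lam * np.eye(d)     # cached for the ridge-style W update

    for _ in range(n_iter):
        P = Y + B * M                                # pseudo-label matrix fitted from the logical labels
        W = np.linalg.solve(XtX, X.T @ P)            # fix M, update the feature weight matrix W
        M = np.maximum(0.0, B * (X @ W - Y))         # fix W, update M under the non-negativity constraint
    scores = np.linalg.norm(W, axis=1)               # weight magnitude shared across labels
    return np.argsort(-scores), W                    # features ranked by importance

# Toy usage on random data.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
Y = (rng.random((50, 3)) > 0.7).astype(float)
ranking, W = label_relaxation_fs(X, Y)
print(ranking[:5])
```

In this sketch the row norms of W act as feature scores, so labels that share discriminative features reinforce the same rows of W; this mirrors, in simplified form, the idea of using shared information across related labels, though the actual LSMFS optimization should be taken from the paper and released code.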