In multi-label learning, label enhancement (LE) aims to recover label distributions from logical labels, thereby enriching the supervision information in the training set. Existing LE algorithms mainly leverage pairwise similarities between instances to recover label distributions. However, the widely used symmetric similarities cannot reflect individual differences between instances. In this paper, we propose exploiting asymmetric influence between instances to fully exploit the information contained in logical labels. First, we build a bipartite network and run the mass diffusion algorithm on it to calculate the asymmetric influence between instances; this asymmetry distinguishes the influence an instance exerts on others from the influence it receives from them. Second, to incorporate this asymmetric influence, we construct a graph with self-connections, which allows each instance to be influenced by itself. Extensive experiments on thirteen benchmark datasets demonstrate the superiority of the proposed method over six state-of-the-art LE algorithms.
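To illustrate the core idea, the following is a minimal sketch of two-step mass diffusion on an instance-label bipartite network, assuming the standard formulation (each instance spreads unit resource evenly to its labels, then each label spreads the received resource evenly back to its instances). The function name and the exact normalization are illustrative assumptions, not the paper's exact algorithm; the point is that the resulting influence matrix is asymmetric because the two diffusion steps normalize by different node degrees.

```python
import numpy as np

def mass_diffusion_influence(L):
    """Asymmetric influence via two-step mass diffusion on an
    instance-label bipartite network (an illustrative sketch).

    L: (n, q) binary logical-label matrix, L[i, a] = 1 iff instance i
       is annotated with label a.
    Returns W of shape (n, n), where W[i, j] is the influence that
    instance j exerts on instance i:
        W[i, j] = (1 / deg(j)) * sum_a L[i, a] * L[j, a] / deg(a)
    with deg(j) = number of labels of instance j and
    deg(a) = number of instances carrying label a.
    """
    L = np.asarray(L, dtype=float)
    inst_deg = L.sum(axis=1)   # labels per instance
    lab_deg = L.sum(axis=0)    # instances per label
    # Step 1: each label a collects resource L[j, a] / deg(j) from instance j.
    # Step 2: label a redistributes its resource evenly among its instances.
    with np.errstate(divide="ignore", invalid="ignore"):
        W = (L / lab_deg) @ (L / inst_deg[:, None]).T
    return np.nan_to_num(W)  # isolated nodes contribute zero influence
```

Note that `W[i, j]` and `W[j, i]` generally differ whenever instances i and j have different numbers of labels, which is exactly the individual difference a symmetric similarity cannot express; each column of `W` sums to one, so every instance's outgoing influence is conserved.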