In real-world applications, a single instance may be associated with more than one label, and multi-label learning methods have emerged in recent years to address this setting. The problem is challenging for several reasons, including complex label correlations, long-tail label distributions, and data scarcity. In general, these challenges can be mitigated and learning performance improved by collecting more training samples and modeling label correlations. However, both remedies are expensive and inflexible: large-scale, well-labeled datasets are difficult to obtain, and building label correlation maps requires task-specific semantic information as prior knowledge. To address these limitations, we propose a general and compact Multi-Label Correlation Learning (MUCO) framework. MUCO explicitly and effectively learns latent label correlations by updating a label correlation tensor, which yields accurate and interpretable predictions. In addition, a multi-label generative strategy is deployed to handle the long-tail label distribution: it borrows visual clues from the limited available samples and synthesizes more diverse ones. All networks in our model are optimized simultaneously. Extensive experiments demonstrate the effectiveness and efficiency of MUCO, and ablation studies further verify the contribution of each module.
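To make the correlation-learning idea concrete, the following is a minimal PyTorch sketch rather than the paper's actual architecture: it assumes the correlation tensor can be approximated, for illustration, by a learnable pairwise matrix that mixes per-label logits; the class name `CorrelationRefinedClassifier`, the feature dimension, and the label count are all hypothetical.

```python
import torch
import torch.nn as nn

class CorrelationRefinedClassifier(nn.Module):
    """Hypothetical sketch: refine per-label logits with a learnable
    label correlation matrix (the paper's correlation tensor may be
    higher-order; a pairwise matrix is used here for illustration)."""

    def __init__(self, feat_dim: int, num_labels: int):
        super().__init__()
        self.classifier = nn.Linear(feat_dim, num_labels)
        # Learnable pairwise label correlation, initialized to identity
        # so the refined logits equal the raw logits before training.
        self.correlation = nn.Parameter(torch.eye(num_labels))

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        logits = self.classifier(features)   # (batch, num_labels)
        refined = logits @ self.correlation  # propagate label correlations
        return refined

if __name__ == "__main__":
    model = CorrelationRefinedClassifier(feat_dim=512, num_labels=20)
    x = torch.randn(4, 512)                # dummy image features
    y = (torch.rand(4, 20) > 0.8).float()  # dummy multi-hot labels
    loss = nn.BCEWithLogitsLoss()(model(x), y)
    loss.backward()  # classifier and correlation matrix update together
    print(loss.item())
```

Because the correlation matrix starts at the identity, any deviation from independent per-label predictions is learned entirely from data, and the classifier and the correlation structure are optimized in one backward pass, mirroring the simultaneous optimization described above.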