Multi-Label Classification (MLC) extends standard classification by allowing an instance to belong to several labels simultaneously. Many lazy approaches to MLC have been proposed; to classify an instance, most of them compute statistical estimators from its neighboring instances based on classical probability theory. In this work, we propose lazy MLC algorithms that instead employ the Non-Parametric Predictive Inference Model (NPI-M) to obtain these estimators from the neighboring instances. We show that the proposed algorithms are better suited to handle the class-imbalance problem that commonly arises in MLC, especially when the data contain label noise. This claim is corroborated via an exhaustive experimental analysis.
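The abstract does not spell out the algorithms, but the contrast between classical and NPI-M estimators can be illustrated with a sketch. The following is a minimal, hypothetical ML-kNN-style scheme: for a query instance, it finds the k nearest neighbors and, for each label, compares the precise relative-frequency estimator c/k against an NPI-M-style interval, here assumed to take the commonly used form [max(0, (c-1)/k), min(1, (c+1)/k)]. All function names and the toy data are illustrative, not the authors' implementation.

```python
import math

def npi_m_interval(count, k):
    # Assumed NPI-M-style lower/upper probability bounds for a label
    # observed `count` times among k neighbours:
    # lower = max(0, (count-1)/k), upper = min(1, (count+1)/k).
    return max(0.0, (count - 1) / k), min(1.0, (count + 1) / k)

def lazy_mlc_predict(query, train_x, train_y, k=3, num_labels=2):
    # Lazy step: defer all work to query time and find the k nearest
    # neighbours of the query instance under Euclidean distance.
    order = sorted(range(len(train_x)),
                   key=lambda i: math.dist(query, train_x[i]))
    neighbours = order[:k]
    predictions = {}
    for label in range(num_labels):
        count = sum(1 for i in neighbours if label in train_y[i])
        predictions[label] = {
            "classical": count / k,            # precise estimator
            "npi_m": npi_m_interval(count, k)  # imprecise interval
        }
    return predictions

# Toy data: two features per instance, label set drawn from {0, 1}.
train_x = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (0.9, 1.1), (0.2, 0.1)]
train_y = [{0}, {0}, {1}, {0, 1}, {0}]

print(lazy_mlc_predict((0.05, 0.05), train_x, train_y, k=3))
```

The interval estimator is what makes the approach cautious on rare labels: a label seen 0 times among k neighbours still gets a nonzero upper probability of 1/k, which is one intuition for why such estimators can behave more robustly under class imbalance and label noise.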