Machine learning (ML) applications often require large datasets, a requirement that can pose a major challenge in fields where data is sparse or inconsistent. To address this issue, we propose a novel approach that combines prior knowledge with data-driven methods to significantly reduce data dependency. This study represents disentangled knowledge of system compositionality through Component-Based Machine Learning (CBML) in the context of energy-efficient building engineering; in this way, CBML embeds semantic domain knowledge within the structure of a data-driven model. To assess the advantages of CBML, we conducted a case experiment comparing this knowledge-encoded ML approach against several typical ML methods under sparse data input (sampling rates from 1% down to 0.0125%). Our findings reveal three key advantages of this approach over traditional ML methods: 1) it significantly improves the robustness of ML models trained on extremely small and inconsistent datasets; 2) it allows efficient use of data drawn from diverse record collections; 3) it can handle incomplete data while maintaining high interpretability and reducing training time. These features offer a promising solution to the challenges of deploying data-intensive methods and contribute to more efficient use of real-world data. Additionally, we outline four essential prerequisites for successfully integrating prior knowledge with ML generalization in target scenarios, and we open-source the code and dataset for community reproduction.
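To illustrate the component-based idea in concrete terms, the sketch below shows one possible reading of it: each building component gets its own small data-driven surrogate, and the components are composed according to the known building structure instead of learning the whole building end-to-end. The component names, features, synthetic training data, and the additive composition rule are all illustrative assumptions, not the paper's exact design.

```python
# Hypothetical sketch of component-based ML for building energy estimation.
# Prior knowledge enters through the composition rule (sum of component losses);
# only the per-component input-output mappings are learned from data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)


class ComponentSurrogate:
    """One ML surrogate per building component (e.g., wall, window)."""

    def __init__(self, feature_names):
        self.feature_names = feature_names
        self.model = RandomForestRegressor(n_estimators=50, random_state=0)

    def fit(self, X, y):
        self.model.fit(X, y)
        return self

    def predict(self, X):
        return self.model.predict(X)


# Synthetic illustration data: per-component features -> per-component heat loss.
n = 200
wall_X = rng.uniform([0.1, 10.0], [2.0, 200.0], size=(n, 2))   # [U-value, area]
wall_y = wall_X[:, 0] * wall_X[:, 1] * 20 + rng.normal(0, 5, n)
win_X = rng.uniform([0.8, 1.0], [3.0, 40.0], size=(n, 2))      # [U-value, area]
win_y = win_X[:, 0] * win_X[:, 1] * 20 + rng.normal(0, 2, n)

components = {
    "wall":   ComponentSurrogate(["u_value", "area"]).fit(wall_X, wall_y),
    "window": ComponentSurrogate(["u_value", "area"]).fit(win_X, win_y),
}


def building_demand(component_inputs):
    """Compose component predictions into a building-level estimate.

    The additive composition encodes the domain prior that total transmission
    loss is the sum of the losses of the individual components.
    """
    return sum(components[name].predict(x).sum()
               for name, x in component_inputs.items())


# A new building described as a collection of components (2 walls, 1 window).
new_building = {
    "wall":   np.array([[0.35, 120.0], [0.35, 80.0]]),
    "window": np.array([[1.10, 15.0]]),
}
print(f"Estimated transmission loss: {building_demand(new_building):.1f} W")
```

Because the learned parts are small per-component models rather than one monolithic building model, each surrogate can, in principle, be trained on whatever component-level records are available, which is one way such an approach could tolerate sparse and heterogeneous data.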