Abstract

Traditional non-intrusive load monitoring (NILM) methods rely on large amounts of historical labeled data. However, privacy concerns and the high cost of labeling limit their generality and feasibility, and they perform poorly when identifying devices with multiple states or similar features. To address this issue, this paper proposes the SSCL-LM framework, which is based on a temporal convolutional network (TCN) and integrates contrastive self-supervised learning into NILM. First, through contrastive self-supervised pre-training, the framework learns well-represented temporal load characteristics from abundant unlabeled one-dimensional (1D) power data. A small amount of labeled data is then used to fine-tune the classifier so that it learns the load categories represented by the different temporal characteristics. Finally, test data are fed into the model for load identification. Validation on the REDD dataset shows that, with only 30% of the data labeled for fine-tuning, the proposed method achieves an F1 score 4.04% higher than that of traditional supervised methods. Moreover, using only 1D power features, the method exhibits superior identification performance for devices with multiple states or similar features. Furthermore, the method transfers across households to loads of the same device type with different parameters, verifying its strong generalization and practicality.
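
The abstract outlines a two-stage pipeline: contrastive self-supervised pre-training of a TCN encoder on unlabeled 1D power windows, followed by supervised fine-tuning of a classifier on a small labeled subset. The sketch below illustrates that general pipeline under assumptions the abstract does not specify: the encoder depth, the SimCLR-style NT-Xent contrastive loss, and the jitter/scaling augmentations are illustrative placeholders rather than the paper's exact design.

```python
# Hypothetical sketch of the two-stage SSCL-LM pipeline described in the abstract:
# (1) contrastive self-supervised pre-training of a TCN encoder on unlabeled
#     1D power windows, (2) fine-tuning a small classifier head on limited labels.
# The architecture, augmentations, and loss below are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TCNEncoder(nn.Module):
    """Stack of dilated 1D convolutions over a power sequence."""
    def __init__(self, channels=64, levels=4, kernel_size=3, embed_dim=128):
        super().__init__()
        layers, in_ch = [], 1
        for i in range(levels):
            layers += [
                nn.Conv1d(in_ch, channels, kernel_size,
                          padding="same", dilation=2 ** i),
                nn.ReLU(),
            ]
            in_ch = channels
        self.tcn = nn.Sequential(*layers)
        self.proj = nn.Linear(channels, embed_dim)  # projection head for contrastive loss

    def forward(self, x):                      # x: (batch, 1, seq_len)
        h = self.tcn(x).mean(dim=-1)           # global average pool over time
        return F.normalize(self.proj(h), dim=-1)

def nt_xent(z1, z2, temperature=0.1):
    """SimCLR-style contrastive loss between two augmented views (assumed)."""
    z = torch.cat([z1, z2], dim=0)             # (2N, d), already L2-normalised
    sim = z @ z.t() / temperature
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float("-inf"))      # exclude self-similarity
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

def augment(x):
    """Toy augmentation: random scaling plus jitter of the power window (assumed)."""
    return x * (1 + 0.1 * torch.randn(x.size(0), 1, 1)) + 0.01 * torch.randn_like(x)

# --- Stage 1: contrastive pre-training on unlabeled power windows ------------
encoder = TCNEncoder()
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
unlabeled = torch.randn(32, 1, 256)            # placeholder batch of 1D power windows
loss = nt_xent(encoder(augment(unlabeled)), encoder(augment(unlabeled)))
opt.zero_grad(); loss.backward(); opt.step()

# --- Stage 2: fine-tune a classifier head on a small labeled subset ----------
classifier = nn.Linear(128, 5)                 # 5 appliance classes, for illustration
labeled, labels = torch.randn(8, 1, 256), torch.randint(0, 5, (8,))
ce = F.cross_entropy(classifier(encoder(labeled)), labels)
```

In a setup like this, only the lightweight classification head (and, optionally, the encoder) ever sees labels, which is what makes the low-label fine-tuning regime reported in the abstract (30% labeled data) plausible.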
