Abstract
This paper proposes an interpretable incremental voltage-current (V-I) representation attention convolutional neural network for the non-intrusive load monitoring (NILM) task. The proposed method consists of two parts: (i) a V-I representation attention mechanism designed jointly with the data pre-processing method, which together support the classification function of the network; and (ii) an adaptive distillation incremental learning method that introduces incremental learning into the NILM field. The public Plug-Load Appliance Identification Dataset (PLAID) is used to validate the proposed V-I representation attention mechanism and adaptive distillation incremental learning method, and the evaluation is further complemented with experiments on a private dataset. Experimental results show that the proposed method outperforms the comparison methods.
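As a rough illustration of the kind of V-I representation such a network could consume, the following minimal sketch (not the authors' released code; the function name, grid size, and example waveforms are assumptions) maps one steady-state voltage/current cycle onto a binary 2-D trajectory image:

```python
# Hypothetical sketch of V-I trajectory image construction for NILM.
import numpy as np

def vi_trajectory_image(voltage, current, size=32):
    """Map a voltage/current cycle onto a size x size binary grid."""
    v = np.asarray(voltage, dtype=float)
    i = np.asarray(current, dtype=float)
    # Normalize each waveform to [0, 1] so the trajectory spans the grid.
    v = (v - v.min()) / (v.max() - v.min() + 1e-12)
    i = (i - i.min()) / (i.max() - i.min() + 1e-12)
    # Quantize to pixel indices and mark the cells the trajectory visits.
    rows = np.clip((i * (size - 1)).astype(int), 0, size - 1)
    cols = np.clip((v * (size - 1)).astype(int), 0, size - 1)
    img = np.zeros((size, size), dtype=np.float32)
    img[rows, cols] = 1.0
    return img

# Example: a purely resistive load yields a near-linear V-I trajectory.
t = np.linspace(0, 2 * np.pi, 500, endpoint=False)
v_cycle = 325 * np.sin(t)        # ~230 V RMS mains voltage
i_cycle = 5 * np.sin(t)          # in-phase current (resistive load)
image = vi_trajectory_image(v_cycle, i_cycle)
print(image.shape, image.sum())  # (32, 32) and the count of active pixels
```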