Abstract
Deep learning models trained on one building can be inaccurate when applied to other buildings because they lack generalizability and scalability. This limitation poses a major challenge when scaling deep learning algorithms to a larger set of systems. This paper presents a cluster-based deep transfer learning (CBDTL) model with an attention mechanism as a potential solution to this problem. By taking advantage of already trained models, the CBDTL model allows knowledge learned on one system to be reused to build a model for a different but related system. First, CBDTL groups power meters in residential buildings based on their consumption patterns. Next, a sequence-to-sequence (seq2seq) model with an attention mechanism is used as the base forecasting model for zone temperature and power consumption predictions. The first model within each cluster is trained using Bayesian optimization, and all other models transfer knowledge from this first model. Compared with the seq2seq model alone, the seq2seq model with attention better captures the dynamics of the complicated physical processes of air conditioners under various conditions. The CBDTL model can be retrained with only three weeks of the three-month dataset from a target air conditioner by freezing the encoder layers of a seq2seq model pre-trained on data from the source residential buildings. Results indicate that the CBDTL model with attention accurately predicts both zone temperature and power consumption, achieving a prediction accuracy nearly an order of magnitude higher than that of a comparison model trained on three months of data from the target air conditioner. Overall, the results demonstrate that the proposed CBDTL model improves prediction accuracy and reduces the training time of forecasting models.
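The core transfer step described above is freezing the encoder layers of a seq2seq-with-attention model pre-trained on source buildings and fine-tuning the remaining layers on a few weeks of target data. The following PyTorch sketch illustrates that idea only; the layer sizes, variable names, data shapes, and training loop are assumptions for illustration and do not reproduce the paper's implementation.

```python
# Minimal sketch (not the authors' code): seq2seq forecasting with dot-product
# attention, then encoder freezing for transfer to a target air conditioner.
import torch
import torch.nn as nn


class Encoder(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden, batch_first=True)

    def forward(self, x):                        # x: (batch, src_len, n_features)
        outputs, state = self.rnn(x)             # outputs: (batch, src_len, hidden)
        return outputs, state


class AttentionDecoder(nn.Module):
    def __init__(self, n_targets: int, hidden: int = 64):
        super().__init__()
        self.rnn = nn.GRU(n_targets, hidden, batch_first=True)
        self.out = nn.Linear(2 * hidden, n_targets)          # [decoder state; context]

    def forward(self, y_prev, state, enc_outputs):
        dec_out, state = self.rnn(y_prev, state)              # (batch, 1, hidden)
        # Dot-product attention over encoder time steps
        scores = torch.bmm(dec_out, enc_outputs.transpose(1, 2))   # (batch, 1, src_len)
        weights = torch.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_outputs)             # (batch, 1, hidden)
        pred = self.out(torch.cat([dec_out, context], dim=-1))     # (batch, 1, n_targets)
        return pred, state


class Seq2Seq(nn.Module):
    def __init__(self, n_features: int, n_targets: int, hidden: int = 64):
        super().__init__()
        self.n_targets = n_targets
        self.encoder = Encoder(n_features, hidden)
        self.decoder = AttentionDecoder(n_targets, hidden)

    def forward(self, x, horizon: int):
        enc_outputs, state = self.encoder(x)
        y = torch.zeros(x.size(0), 1, self.n_targets, device=x.device)  # start token
        preds = []
        for _ in range(horizon):                 # autoregressive decoding
            y, state = self.decoder(y, state, enc_outputs)
            preds.append(y)
        return torch.cat(preds, dim=1)           # (batch, horizon, n_targets)


# --- Transfer step: reuse a source-trained cluster model for a target unit ---
model = Seq2Seq(n_features=6, n_targets=2)       # 2 targets: zone temperature, power
# model.load_state_dict(torch.load("source_cluster_model.pt"))  # hypothetical checkpoint

for p in model.encoder.parameters():             # freeze the pre-trained encoder layers
    p.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3)
loss_fn = nn.MSELoss()

# Fine-tune on a small target dataset (random tensors stand in for real data)
x_target = torch.randn(32, 48, 6)                # 48 past steps, 6 input features
y_target = torch.randn(32, 24, 2)                # 24-step forecast horizon
for epoch in range(5):
    optimizer.zero_grad()
    y_hat = model(x_target, horizon=24)
    loss = loss_fn(y_hat, y_target)
    loss.backward()
    optimizer.step()
```

In this sketch only the decoder and output layers are updated during fine-tuning, which is one common way to realize the "freeze the encoder, retrain on limited target data" strategy the abstract describes.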