Abstract

Neural Architecture Search (NAS) has produced highly competitive neural architectures for many deep learning applications, some achieving state-of-the-art performance. Although many Recurrent Neural Network (RNN) variants, such as Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), are available for electric load demand forecasting, finding an optimal internal RNN structure remains of much interest. This work uses one of the NAS algorithms, Differentiable Architecture Search (DARTS), to generate a new RNN cell optimized for electric load demand forecasting. The generated cell is used to construct models of varying complexity, from a single-cell model to multi-layer models obtained by stacking these cells. These models are compared with other popular RNN models, and the results establish the advantage of customizing the internal RNN structure over the general RNN variants.
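The key idea behind the DARTS search mentioned above is to relax the discrete choice among candidate operations inside the cell into a softmax-weighted mixture, which makes the architecture search differentiable. The following is a minimal sketch of that mixed operation in NumPy; the candidate set of four activation functions and all names here are illustrative assumptions, not the paper's actual search space.

```python
import numpy as np

# Assumed candidate operations for one edge of a DARTS-style RNN cell.
# The real candidate set used in the paper is not given in this abstract.
CANDIDATES = {
    "tanh": np.tanh,
    "relu": lambda x: np.maximum(x, 0.0),
    "sigmoid": lambda x: 1.0 / (1.0 + np.exp(-x)),
    "identity": lambda x: x,
}

def softmax(alpha):
    # Numerically stable softmax over the architecture parameters.
    e = np.exp(alpha - np.max(alpha))
    return e / e.sum()

def mixed_op(x, alpha):
    """Continuous relaxation: the edge output is the softmax-weighted
    sum of ALL candidate operations applied to x, so gradients flow
    into the architecture parameters alpha during the search."""
    weights = softmax(alpha)
    return sum(w * op(x) for w, op in zip(weights, CANDIDATES.values()))

def discretize(alpha):
    """After the search, each edge keeps only the candidate with the
    largest architecture weight, yielding a discrete RNN cell."""
    names = list(CANDIDATES)
    return names[int(np.argmax(alpha))]
```

With uniform architecture parameters the mixed output is simply the mean of all candidate activations; as training pushes one weight toward 1, the edge converges to a single operation, which `discretize` then selects to form the final cell.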
