Abstract

Differential protection, as the key protection element for power transformers, has always been prone to issuing false trips when subjected to external transient disturbances. As a result, differential protection needs an additional block to distinguish internal faults from external transient disturbances. The protection system should, first, be able to operate on raw data; second, be able to learn fully temporal features and sudden changes in the transient signals; and, third, impose no assumption on noise. To address these challenges, a fast RNN, namely the fast gated recurrent neural network (FGRNN), is proposed. By removing the reset gate of the gated recurrent unit (GRU), the proposed network is capable of learning abrupt changes while significantly reducing the computational time. Furthermore, a loss function based on an information-theoretic concept is formulated in this article to enhance the learning ability as well as robustness against non-Gaussian/Gaussian noise. A generalized form of mutual information is adopted to form a noise model-free loss function, which is then incorporated into the designed deep network. Simulated and experimental examinations involving various external factors, together with comparisons between the proposed FGRNN, the GRU, and seven firmly established methods, indicate the faster and more reliable performance of the proposed algorithm.
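The abstract characterizes the FGRNN as a GRU-style cell with the reset gate removed, trained with a noise model-free, information-theoretic loss. The sketch below is only an illustrative reading of that description: a minimal PyTorch reset-gate-free recurrent cell plus a correntropy-style robust loss. The class name FGRNNCell, the kernel width sigma, the shape exponent alpha, and the exact loss form are assumptions for illustration and are not taken from the paper.

```python
import torch
import torch.nn as nn


class FGRNNCell(nn.Module):
    """Hypothetical GRU-style cell with the reset gate removed (illustrative only).

    Standard GRU:
        z_t  = sigmoid(W_z x_t + U_z h_{t-1} + b_z)          # update gate
        r_t  = sigmoid(W_r x_t + U_r h_{t-1} + b_r)          # reset gate (dropped here)
        h~_t = tanh(W_h x_t + U_h (r_t * h_{t-1}) + b_h)
        h_t  = (1 - z_t) * h_{t-1} + z_t * h~_t

    Dropping r_t removes one gate's worth of parameters and computation per step.
    """

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.update = nn.Linear(input_size + hidden_size, hidden_size)
        self.candidate = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        xh = torch.cat([x, h], dim=-1)
        z = torch.sigmoid(self.update(xh))          # update gate
        h_tilde = torch.tanh(self.candidate(xh))    # candidate state, no reset gate
        return (1.0 - z) * h + z * h_tilde          # convex blend of old and new state


def correntropy_style_loss(pred: torch.Tensor, target: torch.Tensor,
                           sigma: float = 1.0, alpha: float = 2.0) -> torch.Tensor:
    """Robust, noise model-free surrogate loss (assumed form, not the paper's).

    The kernel bounds the penalty for large errors, so heavy-tailed (non-Gaussian)
    outliers contribute far less than they would under mean squared error.
    """
    err = torch.abs(pred - target)
    return torch.mean(1.0 - torch.exp(-(err / sigma) ** alpha))


# Tiny usage example on a batch of raw three-phase samples (random data here).
if __name__ == "__main__":
    cell = FGRNNCell(input_size=3, hidden_size=16)
    head = nn.Linear(16, 1)
    x_seq = torch.randn(8, 40, 3)                   # (batch, time, features)
    h = torch.zeros(8, 16)
    for t in range(x_seq.size(1)):
        h = cell(x_seq[:, t, :], h)
    loss = correntropy_style_loss(torch.sigmoid(head(h)), torch.rand(8, 1))
    print(float(loss))
```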
