Abstract

Transformers play a crucial role in ensuring the safe operation of power grids, and diagnosing their faults from the large volume of operational data that grids generate is of great value. If a transformer's operating condition is monitored in a timely manner, latent internal faults can be detected before they develop into failures. To perform online fault diagnosis of grid current transformers, we combine a Transformer network with a BiGRU. Because the fault input sample sequences have a temporal structure, the Transformer's multi-head attention mechanism is used to extract deep features from them, fully exploiting the temporal associations among the latent variables. The extracted features are then passed to a BiGRU, which outputs the fault category code. Experimental results show that the proposed algorithm achieves better results than either model used alone, which is useful for the study and application of current-transformer fault diagnosis in power grids.
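The Transformer-plus-BiGRU pipeline described above can be sketched as follows. This is a minimal illustrative PyTorch implementation, not the authors' actual model: all layer sizes, the number of attention heads, and the use of the final time step for classification are assumptions for demonstration.

```python
import torch
import torch.nn as nn

class TransformerBiGRUClassifier(nn.Module):
    """Sketch of the described pipeline: a Transformer encoder applies
    multi-head self-attention to extract deep temporal features from
    fault sample sequences, and a BiGRU maps those features to a fault
    category code. All dimensions here are illustrative assumptions."""

    def __init__(self, n_features=8, d_model=32, n_heads=4,
                 n_layers=2, gru_hidden=16, n_classes=5):
        super().__init__()
        # Project raw fault measurements into the attention dimension.
        self.proj = nn.Linear(n_features, d_model)
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        # Bidirectional GRU reads the feature sequence in both directions.
        self.bigru = nn.GRU(d_model, gru_hidden,
                            batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * gru_hidden, n_classes)

    def forward(self, x):                   # x: (batch, seq_len, n_features)
        h = self.encoder(self.proj(x))      # multi-head attention features
        out, _ = self.bigru(h)              # (batch, seq_len, 2 * gru_hidden)
        return self.head(out[:, -1, :])     # logits over fault categories

model = TransformerBiGRUClassifier()
logits = model(torch.randn(4, 20, 8))  # 4 sequences, 20 time steps each
print(logits.shape)                    # (4, n_classes)
```

In practice the classifier head would be trained with cross-entropy against labeled fault categories; the sketch only shows how the two components are composed.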
