Abstract
Real-time smart grid monitoring is critical to enhancing the resiliency and operational efficiency of power equipment. Cloud-based and edge-based fault detection systems integrating deep learning have recently been proposed to monitor the grid in real time. However, state-of-the-art cloud-based detection may require uploading large amounts of data and suffer from long network delays, while edge-based schemes do not adequately consider detection requirements and thus cannot provide flexible, optimal performance. To solve these problems, we study a cloud-edge hybrid smart grid fault detection system. Embedded devices hosting several lightweight neural networks for fault detection are placed at the edge, close to the monitored equipment. Considering the limited communication resources, the relatively low computation capabilities of edge devices, and the different detection accuracies these neural networks support, we design an optimal communication and computational resource allocation method for this cloud-edge smart grid fault detection system. Our method maximizes the processing throughput of the system and improves resource utilization while satisfying the data transmission and processing latency requirements. Extensive simulations show the superiority of the proposed scheme over baseline schemes. We have also prototyped the system and verified its feasibility and performance in real-world scenarios.
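The trade-off the abstract describes, i.e., upload latency versus edge compute capability under a per-task latency budget, can be pictured with a toy placement heuristic. This is only an illustrative sketch under assumed parameters (data sizes, bandwidths, model latencies are all made up here); it is not the authors' actual optimization method.

```python
from dataclasses import dataclass

@dataclass
class Task:
    data_mb: float    # data volume to upload if processed in the cloud (MB)
    edge_ms: float    # inference latency of the lightweight edge model (ms)
    cloud_ms: float   # inference latency of the full cloud model (ms)

def assign(tasks, bandwidth_mbps, latency_budget_ms):
    """Greedy placement sketch: prefer the (more accurate) cloud model
    when upload + cloud compute fits the latency budget; otherwise fall
    back to the lightweight edge model; reject tasks that fit neither."""
    plan = []
    for t in tasks:
        # time to push the task's data over the shared uplink
        upload_ms = t.data_mb * 8.0 / bandwidth_mbps * 1000.0
        if upload_ms + t.cloud_ms <= latency_budget_ms:
            plan.append("cloud")
        elif t.edge_ms <= latency_budget_ms:
            plan.append("edge")
        else:
            plan.append("reject")
    return plan

# Example: 10 Mbps uplink, 100 ms end-to-end latency budget.
plan = assign(
    [Task(0.05, 30, 20),   # small sample: upload (40 ms) + cloud (20 ms) fits
     Task(2.0, 30, 20),    # large sample: upload too slow, edge model fits
     Task(5.0, 200, 20)],  # large sample and slow edge model: rejected
    bandwidth_mbps=10.0,
    latency_budget_ms=100.0,
)
print(plan)  # → ['cloud', 'edge', 'reject']
```

The paper's actual method jointly allocates bandwidth and compute across tasks to maximize throughput; this per-task greedy rule only conveys why both the network and the edge devices constrain where each detection workload can run.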