Abstract

Intrusion detection is a crucial task in power grid surveillance systems, providing early warning for power grid security. Construction machinery and engineering vehicles, the most common intrusion objects, are a major concern for preventing external damage during power grid maintenance. In this paper, considering the diversity of intrusion-object scales and the complexity of application scenarios in power grid surveillance, we compiled a dataset of 8177 images captured by 653 different power grid surveillance cameras. Based on this dataset, we propose an improved context-aware mask region-based convolutional neural network (Mask R-CNN) model, named ID-Net, for intrusion object detection. A modulated deformable convolution operation is integrated into the backbone network to learn feature representations that are robust to the geometric variations of engineering vehicles. To exploit the correlation between objects and their context, a self-attention-based module is used to model long-range contextual relations. For small-object detection, a feature integration module fuses multi-scale features under a pyramid hierarchy. Finally, a cascaded coarse-to-fine region proposal network progressively refines the bounding-box regression. Experimental results demonstrate that our model achieves competitive performance compared with state-of-the-art object detection methods.
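
For concreteness, the sketch below (not the paper's released code) illustrates two of the components named above in PyTorch: a modulated deformable convolution layer built on torchvision.ops.DeformConv2d, which accepts a modulation mask in recent torchvision versions, and a simplified self-attention (non-local) context block. Module names, channel counts, and the reduction factor are illustrative assumptions, not details taken from the paper.

# Minimal sketch of a modulated deformable convolution block and a
# self-attention context block, assuming PyTorch and torchvision >= 0.9.
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d


class ModulatedDeformBlock(nn.Module):
    """3x3 modulated deformable convolution: offsets and modulation masks
    are predicted from the input, so sampling adapts to object geometry."""

    def __init__(self, channels, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2
        self.n_pts = kernel_size * kernel_size
        # Predict 2 offsets (x, y) and 1 modulation scalar per sampling point.
        self.offset_mask = nn.Conv2d(channels, 3 * self.n_pts, kernel_size, padding=pad)
        self.deform_conv = DeformConv2d(channels, channels, kernel_size, padding=pad)

    def forward(self, x):
        om = self.offset_mask(x)
        offset, mask = om[:, : 2 * self.n_pts], om[:, 2 * self.n_pts :]
        return self.deform_conv(x, offset, torch.sigmoid(mask))


class ContextAttention(nn.Module):
    """Simplified non-local block: every position attends to every other
    position, adding long-range context on top of local convolutions."""

    def __init__(self, channels, reduction=2):
        super().__init__()
        inner = channels // reduction
        self.query = nn.Conv2d(channels, inner, 1)
        self.key = nn.Conv2d(channels, inner, 1)
        self.value = nn.Conv2d(channels, inner, 1)
        self.proj = nn.Conv2d(inner, channels, 1)

    def forward(self, x):
        n, _, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)          # (N, HW, C')
        k = self.key(x).flatten(2)                             # (N, C', HW)
        v = self.value(x).flatten(2).transpose(1, 2)           # (N, HW, C')
        attn = torch.softmax(q @ k / (k.shape[1] ** 0.5), -1)  # (N, HW, HW)
        ctx = (attn @ v).transpose(1, 2).reshape(n, -1, h, w)
        return x + self.proj(ctx)                              # residual connection


if __name__ == "__main__":
    feat = torch.randn(1, 64, 32, 32)      # dummy backbone feature map
    feat = ModulatedDeformBlock(64)(feat)
    feat = ContextAttention(64)(feat)
    print(feat.shape)                      # torch.Size([1, 64, 32, 32])

In a Mask R-CNN-style detector these blocks would typically be applied to backbone feature maps before the region proposal stage; the multi-scale feature integration and cascaded coarse-to-fine proposal refinement described in the abstract are not shown here.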
