Abstract

The detection of power lines is critical for flight safety, particularly for drones and low-flying aircraft. Power line detection models help prevent collisions, reducing damage, preserving lives, and safeguarding critical infrastructure, which has motivated significant research into accurate detection models. In this study, we use paired infrared–visible power line datasets to train three distinct deep learning models. The first two are sequential deep learning models based on the VGG16 and AlexNet networks; they are tailored for detection in visible images and then re-optimized for infrared images. For the third model, we introduce a deep learning architecture built with the Functional Application Programming Interface (API), which affords the flexibility to construct a multi-input model with shared layers. The proposed model accepts paired visible and infrared images as inputs and applies feature-level fusion to merge the features extracted from both, generating an enriched feature map. This approach combines the advantages of visible images, which offer high resolution and rich texture, with those of infrared images, which provide high contrast and clear vision under adverse environmental conditions. Comparing the outcomes of the three models, the proposed model emerges as the front runner, achieving an accuracy of 99.37%. Moreover, real-time processing was achieved by conducting ablation experiments that optimize the model and reduce the number of trainable parameters, yielding an inference speed of 2.7 milliseconds per frame.
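The multi-input, shared-layer design described above can be sketched with the Keras Functional API. This is a minimal illustrative example, not the paper's actual architecture: the layer counts, filter sizes, and input shape are assumptions chosen for brevity. The key ideas it demonstrates are (1) the same layer objects processing both the visible and infrared branches, so their weights are shared, and (2) feature-level fusion by channel-wise concatenation of the two feature maps.

```python
# Minimal sketch of a two-input model with shared layers and
# feature-level fusion (illustrative sizes, not the paper's network).
import tensorflow as tf
from tensorflow.keras import layers, Model


def build_fusion_model(input_shape=(128, 128, 3)):
    visible_in = layers.Input(shape=input_shape, name="visible")
    infrared_in = layers.Input(shape=input_shape, name="infrared")

    # Shared feature extractor: the SAME layer objects are applied to
    # both inputs, so the two branches share one set of weights.
    shared_conv = layers.Conv2D(32, 3, activation="relu", padding="same")
    shared_pool = layers.MaxPooling2D()

    vis_feat = shared_pool(shared_conv(visible_in))
    ir_feat = shared_pool(shared_conv(infrared_in))

    # Feature-level fusion: concatenate the two feature maps along the
    # channel axis to form an enriched feature map.
    fused = layers.Concatenate(axis=-1)([vis_feat, ir_feat])

    x = layers.Conv2D(64, 3, activation="relu", padding="same")(fused)
    x = layers.GlobalAveragePooling2D()(x)
    out = layers.Dense(1, activation="sigmoid", name="powerline")(x)
    return Model(inputs=[visible_in, infrared_in], outputs=out)
```

Because `shared_conv` and `shared_pool` are reused rather than duplicated, the branch weights stay identical during training, which also keeps the trainable-parameter count down, consistent with the lightweight, real-time goal mentioned in the abstract.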
