Abstract

Due to extremely high temperatures, friction, and vibration, aircraft engines are prone to various types of internal damage. To guarantee the safety of the aircraft, engine maintenance is regularly performed by manual borescope inspection, which is time-consuming and error-prone. Existing studies have adopted deep learning-based approaches to detect potential damage in borescope images and thereby accelerate engine maintenance. However, these approaches are designed for damage detection in static images and cannot be directly used to track damage in borescope videos because of their extremely high computational cost. To detect and track damage in borescope videos in real time, we propose the deep fusion network (DFNET), which operates along two parallel functional paths: a segmentation path and a spatial warping path. The segmentation path runs only on selected key frames to extract semantic features, which are then propagated to the remaining frames through optical flow in the spatial warping path. The performance and efficiency of DFNET are validated through extensive experiments on real borescope videos from a local air carrier.
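The core idea of the spatial warping path is that a dense feature map computed on a key frame can be reused on a nearby frame by resampling it along the optical flow. The following is a minimal NumPy sketch of such flow-based feature warping with bilinear sampling; the function name, array layout, and flow convention (per-pixel displacement from the current frame back to the key frame) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def warp_features(feat, flow):
    """Propagate a key-frame feature map to another frame by warping it
    along an optical-flow field, using bilinear sampling.

    feat: (H, W, C) semantic features from the segmentation path (key frame).
    flow: (H, W, 2) per-pixel displacement (dx, dy) mapping each pixel of
          the current frame to its source location in the key frame.
    Returns an (H, W, C) warped feature map for the current frame.
    """
    H, W, _ = feat.shape
    ys, xs = np.mgrid[0:H, 0:W].astype(np.float64)
    # Source sampling coordinates in the key frame, clamped to the image.
    sx = np.clip(xs + flow[..., 0], 0, W - 1)
    sy = np.clip(ys + flow[..., 1], 0, H - 1)
    # Integer corners surrounding each sampling location.
    x0, y0 = np.floor(sx).astype(int), np.floor(sy).astype(int)
    x1, y1 = np.minimum(x0 + 1, W - 1), np.minimum(y0 + 1, H - 1)
    # Fractional offsets, broadcast over the channel dimension.
    wx, wy = (sx - x0)[..., None], (sy - y0)[..., None]
    # Bilinear interpolation of the four neighbouring feature vectors.
    return ((1 - wx) * (1 - wy) * feat[y0, x0]
            + wx * (1 - wy) * feat[y0, x1]
            + (1 - wx) * wy * feat[y1, x0]
            + wx * wy * feat[y1, x1])
```

Because only the (cheap) flow-based resampling runs on non-key frames, the expensive segmentation network is invoked once per key frame rather than once per frame, which is what makes real-time tracking feasible.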
