Flight regime recognition is critical for guaranteeing flight safety, making informed maintenance decisions for key components, and evaluating flight quality. However, previous methods take flight data points/segments as inputs to make point-wise/segment-wise predictions, which results in imprecise regime boundary localization, poor recognition accuracy, or low efficiency. To this end, an intelligent temporal detection network is proposed from a brand-new perspective of regime detection, which directly handles long flight sequences and concurrently generates multiple regime boxes with corresponding categories. Moreover, specific model structures are designed for flight data, including an adaptive graph embedding to explore spatial relations among multi-modal flight parameters, a multi-scale Transformer encoder to recognize regimes of different durations, and a balanced joint loss to mitigate the negative impact of imbalanced flight regimes. To validate the superiority of the proposed method, various parameters of real-world flight sorties are collected, and a flight regime dataset is constructed via manual annotation. Extensive experiments and ablation studies show that our method achieves accurate and boundary-sensitive regime recognition simultaneously.
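To make the detection formulation concrete, the sketch below illustrates two ideas named in the abstract in simplified form: a "regime box" as a 1-D time interval with a category, scored by temporal intersection-over-union, and inverse-frequency class weights as one common way to balance rare regimes. All function names, regime categories, and numbers are hypothetical illustrations, not the paper's implementation.

```python
# Hedged sketch: a regime box is a (start, end) time interval with a category
# label; boundary quality can be measured by temporal IoU. Illustrative only.

def temporal_iou(box_a, box_b):
    """Intersection-over-union of two 1-D time intervals (start, end)."""
    inter = max(0.0, min(box_a[1], box_b[1]) - max(box_a[0], box_b[0]))
    union = (box_a[1] - box_a[0]) + (box_b[1] - box_b[0]) - inter
    return inter / union if union > 0 else 0.0

def class_weights(counts):
    """Inverse-frequency weights: one simple way to re-balance rare regimes
    in a classification loss (hypothetical, not the paper's balanced loss)."""
    total = sum(counts.values())
    return {c: total / (len(counts) * n) for c, n in counts.items()}

# A predicted regime box vs. its manually annotated ground truth (seconds):
pred = (120.0, 310.0)
gt = (100.0, 300.0)
iou = temporal_iou(pred, gt)  # overlap 180 s over a 210 s union

# Hypothetical regime counts; rarer regimes receive larger weights:
weights = class_weights({"climb": 500, "cruise": 2000, "hover": 100})
```

A detection-style method would emit many such (start, end, category) boxes per flight sortie and match them to annotations by temporal IoU, which is why boundary localization enters the evaluation directly.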