Abstract

Few-shot classification algorithms have emerged rapidly in recent years, with notable breakthroughs in transfer learning, metric learning, and data augmentation. However, few-shot classification based on Graph Neural Networks is still being explored. In this paper, an edge-weight single-step memory-constraint network is proposed, built on mining hidden features and optimizing the attention mechanism. According to the hidden distribution characteristics of the edge-weight data, a new graph structure is designed in which node features are fused and updated, enriching the features and making full use of the limited sample data. In addition, building on the Convolutional Block Attention Module (CBAM), different schemes for integrating channel attention and spatial attention are proposed to help the model extract more meaningful features from the samples. Ablation experiments and comparative analyses of each training mode are carried out on standard datasets, and the results demonstrate the soundness and novelty of the proposed method.
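To make the channel/spatial attention integration concrete, the following is a minimal sketch of a CBAM-style block in PyTorch. It is not the authors' implementation: the module names, the reduction ratio, the kernel size, and the sequential (channel-then-spatial) integration order are all assumptions for illustration; the paper compares alternative integration schemes whose details are not given in the abstract.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    # Channel attention: squeeze the spatial dimensions with average- and
    # max-pooling, pass both through a shared MLP, and gate the channels.
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pooling
        return torch.sigmoid(avg + mx).view(b, c, 1, 1) * x


class SpatialAttention(nn.Module):
    # Spatial attention: pool over the channel dimension, convolve the
    # two resulting maps, and gate each spatial location.
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx = x.amax(dim=1, keepdim=True)
        return torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1))) * x


class SequentialCBAM(nn.Module):
    # One possible integration order: channel attention followed by
    # spatial attention (hypothetical; other orderings are possible).
    def __init__(self, channels):
        super().__init__()
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x):
        return self.sa(self.ca(x))
```

A parallel integration (applying both attentions to the same input and combining their outputs) would be another variant in the same spirit; which variants the paper actually evaluates is described in the full text, not the abstract.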
