Abstract

State-of-the-art object detection models rely on large-scale datasets to achieve good precision; without sufficient training samples, they suffer from severe overfitting. Current work on few-shot object detection falls mainly into meta-learning-based methods and fine-tuning-based methods. However, existing models do not consider how feature maps should be processed to yield more accurate regions of interest (RoIs), producing many non-supporting RoIs that increase the burden on subsequent classification and can even cause misclassification. In addition, catastrophic forgetting is unavoidable in both types of few-shot object detection models. Many models also classify directly in low-dimensional spaces because of limited resources, but this transformation of the data space can confuse some categories and lead to misclassification. To address these problems, the Feature Reconstruction Detector (FRDet) is proposed: a simple yet effective fine-tuning-based approach for few-shot object detection. FRDet comprises a region proposal network (RPN) built on channel and spatial attention, called Multi-Attention RPN (MARPN), and a head based on feature reconstruction, called Feature Reconstruction Head (FRHead). MARPN extends Attention RPN, using channel attention to suppress non-supporting classes and spatial attention to enhance supporting classes, yielding fewer but more accurate RoIs. Meanwhile, FRHead uses support features to reconstruct query RoI features through a closed-form solution, enabling a comprehensive and fine-grained comparison. The model was validated on the PASCAL VOC, MS COCO, FSOD, and CUB200 datasets and achieved improved results.
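The abstract states only that FRHead reconstructs query RoI features from support features "through a closed-form solution." A standard closed-form choice for this kind of reconstruction is ridge regression, which admits the solution W = Q Sᵀ (S Sᵀ + λI)⁻¹. The sketch below illustrates that assumption; the function name, feature dimensions, and scoring-by-reconstruction-error step are hypothetical, not taken from the paper.

```python
import torch

def reconstruct_query_features(support, query, lam=0.1):
    """Reconstruct query RoI features from support features via a
    closed-form ridge-regression solution (illustrative sketch only).

    support: (n_s, d) pooled support features for one class
    query:   (n_q, d) query RoI features
    Returns the reconstruction of `query`, shape (n_q, d).
    """
    # Solve W = argmin_W ||Q - W S||^2 + lam * ||W||^2 in closed form:
    # W = Q S^T (S S^T + lam I)^{-1}
    gram = support @ support.T                        # (n_s, n_s)
    reg = lam * torch.eye(gram.shape[0])
    W = query @ support.T @ torch.linalg.inv(gram + reg)
    return W @ support                                # (n_q, d)

# Usage: score each query RoI against a class by reconstruction error;
# a lower error suggests the RoI is better explained by that class's
# support features.
support_feats = torch.randn(5, 256)   # 5 support shots, 256-d features
query_feats = torch.randn(10, 256)    # 10 query RoIs
recon = reconstruct_query_features(support_feats, query_feats)
error = ((query_feats - recon) ** 2).mean(dim=1)
```

Because the reconstruction weights have a closed-form expression, no iterative optimization is needed at inference time, which is consistent with the abstract's claim that the comparison is both fine-grained and efficient.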
