Subgroups For Detection Transformer

Abstract

Recent advances in DETR-based object detection have spurred innovative techniques for improving efficiency. A central challenge is the effective use of object queries, which capture both content and positional information. Several methods increase the number of queries from 300 to 1800, but this introduces redundancy, and only one group is used during inference. Interestingly, we observe that detections with lower confidence scores often yield more accurate bounding-box predictions than their higher-confidence counterparts. To exploit this untapped potential, we introduce SG-DETR (Sub-Group Detection Transformer), a plug-and-play method that maximizes the utilization of object queries. SG-DETR partitions the object queries into subgroups of equal size, reducing the impact of negative queries. In addition, we suppress redundant boxes by clustering those with similar size and position. Our method not only mitigates the issue of negative queries but also improves various DETR-like models, including the Deformable, Conditional, and DAB-DETR architectures, as demonstrated by experiments on the large-scale COCO 2017 dataset.
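The abstract describes two mechanisms: partitioning object queries into equal-sized subgroups, and suppressing redundant boxes by clustering detections of similar size and position. A minimal sketch of both ideas is below; the function names, the greedy IoU-based clustering, and the threshold value are illustrative assumptions, not the paper's actual implementation.

```python
def partition_queries(queries, num_subgroups):
    """Split object queries into subgroups with an equal number per subgroup.
    (Illustrative; SG-DETR's exact grouping scheme is described in the paper.)"""
    assert len(queries) % num_subgroups == 0, "queries must divide evenly"
    size = len(queries) // num_subgroups
    return [queries[i * size:(i + 1) * size] for i in range(num_subgroups)]

def iou(a, b):
    """Intersection-over-union of two boxes in (x1, y1, x2, y2) format."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def cluster_boxes(boxes, scores, iou_thresh=0.7):
    """Suppress redundant boxes: greedily keep the highest-scoring box of
    each cluster of boxes with similar size and position (high mutual IoU).
    The threshold 0.7 is an assumed value for illustration."""
    order = sorted(range(len(boxes)), key=lambda i: -scores[i])
    kept = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < iou_thresh for j in kept):
            kept.append(i)
    return [boxes[i] for i in kept]
```

For example, 1800 queries split into 6 subgroups yields 300 queries per subgroup, matching the standard DETR query count; two boxes of near-identical size and position collapse into one cluster, while a distant box survives.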
