Abstract

How can interest point detectors benefit from contextual cues? In this article, we introduce a context-aware semi-local detector (CASL) framework to give a systematic answer with three contributions: (1) We integrate the context of interest points to recurrently refine their detection. (2) This integration boosts interest point detectors from the traditionally local scale to a semi-local scale to discover more discriminative salient regions. (3) Such a context-aware structure further enables us to bring forward category learning (usually performed in the subsequent recognition phase) into interest point detection to locate category-aware, meaningful salient regions. Our CASL detector consists of two phases. The first phase accumulates multiscale spatial correlations of local features into a difference of contextual Gaussians (DoCG) field. The DoCG field quantifies detector context to highlight contextually salient regions at a semi-local scale, which also reveals visual attention to a certain extent. The second phase locates contextual peaks by mean shift search over the DoCG field and subsequently integrates contextual cues into feature description. This phase also enables us to integrate category learning into the mean shift search kernels. This learning-based CASL mechanism produces more category-aware features, which substantially benefits the subsequent visual categorization process. We conducted experiments in image search, object characterization, and feature detector repeatability evaluations, which show superior discriminability and comparable repeatability with respect to state-of-the-art works.
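The sketch below is a minimal, illustrative rendering of the two phases summarized above, not the authors' implementation: phase one accumulates interest point responses into a difference-of-contextual-Gaussians (DoCG) field by smoothing the point density at two contextual scales and subtracting, and phase two climbs that field with a simple mean shift search to locate contextual peaks. All function names, scales, and bandwidths are illustrative assumptions.

```python
# Hypothetical sketch of the CASL pipeline described in the abstract.
import numpy as np
from scipy.ndimage import gaussian_filter


def docg_field(points, shape, sigma_fine=8.0, sigma_coarse=24.0):
    """Phase 1: accumulate interest points into a DoCG saliency field."""
    density = np.zeros(shape, dtype=np.float64)
    for x, y in points:  # splat each local interest point into the grid
        yi = int(np.clip(y, 0, shape[0] - 1))
        xi = int(np.clip(x, 0, shape[1] - 1))
        density[yi, xi] += 1.0
    fine = gaussian_filter(density, sigma_fine)      # semi-local context
    coarse = gaussian_filter(density, sigma_coarse)  # wider context
    return fine - coarse                             # DoCG response


def mean_shift_peak(field, start, bandwidth=20, iters=50):
    """Phase 2: shift a window toward the weighted centroid of the field."""
    y, x = start
    h, w = field.shape
    for _ in range(iters):
        y0, y1 = max(0, int(y) - bandwidth), min(h, int(y) + bandwidth + 1)
        x0, x1 = max(0, int(x) - bandwidth), min(w, int(x) + bandwidth + 1)
        window = np.clip(field[y0:y1, x0:x1], 0, None)  # positive mass only
        if window.sum() == 0:
            break
        ys, xs = np.mgrid[y0:y1, x0:x1]
        ny = (ys * window).sum() / window.sum()
        nx = (xs * window).sum() / window.sum()
        if abs(ny - y) < 0.5 and abs(nx - x) < 0.5:     # converged
            break
        y, x = ny, nx
    return int(round(y)), int(round(x))


# Usage: uniform clutter plus a dense cluster near (150, 150); the search
# started nearby converges to the contextually salient cluster.
rng = np.random.default_rng(0)
pts = np.vstack([rng.uniform(0, 200, (200, 2)),
                 rng.normal(150, 5, (80, 2))])
field = docg_field(pts, (200, 200))
print(mean_shift_peak(field, start=(140, 140)))
```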
