Abstract

Mobile visual search is a new class of applications that use images taken with camera phones to initiate search queries. It is a challenging task, mainly because of affine image transformations caused by viewpoint changes and motion blur caused by hand tremor. These problems are unavoidable in mobile visual search and often result in low recall. Query expansion is an effective strategy for improving recall, but existing methods are highly memory- and time-consuming and often involve many redundant features. By integrating robust local patch mining and geometric parameter coding, this paper proposes an accurate offline query expansion method for large-scale mobile visual search. Concretely, a novel criterion is presented for evaluating and mining robust patches. Multiple representative features are then extracted from the selected local patches to deal with viewpoint changes. Moreover, the geometric parameters of each representative viewpoint are recorded to support fast and accurate feature matching. Experimental results on several well-known datasets and a large image set (1M images) demonstrate the effectiveness and efficiency of our method, especially its high robustness to viewpoint changes. The proposed approach also generalizes well to other multimedia content analysis tasks.
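
The abstract describes the pipeline only at a high level. The sketch below is a minimal illustration of what an offline expansion step of that shape could look like: each mined patch is warped to a few simulated viewpoints, one representative descriptor is extracted per view, and the simulation parameters (tilt, rotation) are stored alongside the descriptor so the online stage can verify geometry quickly. The use of SIFT, the simple affine simulation, the parameter grid, and all function names are illustrative assumptions, not the paper's actual implementation.

```python
import cv2
import numpy as np


def simulate_viewpoint(patch, tilt, angle):
    """Warp a patch to approximate one representative viewpoint.

    An in-plane rotation followed by a horizontal tilt is a common way
    to simulate viewpoint change; the paper may use a different
    parameterization.
    """
    h, w = patch.shape[:2]
    rot = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
    rotated = cv2.warpAffine(patch, rot, (w, h))
    # Tilt: compress along x to mimic an out-of-plane camera rotation.
    return cv2.resize(rotated, (max(1, int(w / tilt)), h))


def expand_patch_offline(patch, detector, tilts=(1.0, 2.0), angles=(0, 45, 90)):
    """Return (descriptor, (tilt, angle)) pairs for one mined patch.

    Storing the simulation parameters next to each descriptor is what
    allows fast geometric checks at match time.
    """
    entries = []
    for t in tilts:
        for a in angles:
            view = simulate_viewpoint(patch, t, a)
            kps, descs = detector.detectAndCompute(view, None)
            if descs is None:
                continue
            # Keep only the strongest keypoint per simulated view as a
            # stand-in for selecting a "representative feature".
            best = int(np.argmax([kp.response for kp in kps]))
            entries.append((descs[best], (t, a)))
    return entries


if __name__ == "__main__":
    detector = cv2.SIFT_create()  # requires opencv-python >= 4.4
    patch = cv2.imread("patch.png", cv2.IMREAD_GRAYSCALE)  # a mined patch
    index_entries = expand_patch_offline(patch, detector)
    print(f"stored {len(index_entries)} expanded descriptors")
```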
