Abstract

Fine-grained image recognition has become an increasingly popular sub-domain of computer vision, and numerous works have been proposed for it. These works mainly study discriminative region localization and fine-grained feature learning independently rather than jointly. However, we believe the relationship between the two deserves more attention. In this paper, we propose an improved recurrent attention convolutional neural network (RA-CNN), based on the view that discriminative region localization and fine-grained feature learning can reinforce each other. RA-CNN includes three scales, each of which consists of a classification sub-network and an attention proposal sub-network (APN). The procedure of RA-CNN can be described in three steps. First, discriminative features are extracted from the input image at the first scale. Second, the APN is trained on the extracted discriminative features to predict the attention region. Third, the attention region is cropped and zoomed in to serve as the input of the next scale. Experiments on the CUB-200-2011 dataset show that our method handles fine-grained image recognition well, achieving 76.86% precision@1 and 94.06% precision@5, and outperforming the original RA-CNN by a large margin. The strength of our method lies in its ability to gradually generate the most discriminative regions from coarse to fine, which leads to higher precision than competing methods.
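The three-scale recurrence summarized above can be illustrated with a minimal sketch, assuming a PyTorch implementation. The class and function names (ScaleNet, crop_and_zoom, RACNNSketch), the tiny placeholder backbone, and the hard crop below are illustrative assumptions, not the exact architecture or training procedure of the paper; the original RA-CNN additionally uses a differentiable attention-cropping mask and inter-scale losses, which are omitted here for brevity.

```python
# A minimal sketch of the coarse-to-fine, three-scale recurrence: each scale
# classifies its input and proposes an attention region that is cropped and
# zoomed to become the next scale's input. All names here are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ScaleNet(nn.Module):
    """One scale: a small classification sub-network plus an APN head
    predicting a square attention region (cx, cy, half-length) in [0, 1]."""

    def __init__(self, num_classes: int = 200):
        super().__init__()
        self.features = nn.Sequential(          # placeholder backbone
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)
        self.apn = nn.Sequential(nn.Linear(64, 3), nn.Sigmoid())

    def forward(self, x):
        feat = self.features(x).flatten(1)
        return self.classifier(feat), self.apn(feat)


def crop_and_zoom(img, box, out_size):
    """Crop the predicted attention region and up-sample it to the resolution
    expected by the next scale (a hard crop; the paper uses a soft mask)."""
    n, _, h, w = img.shape
    cx, cy, hl = box[:, 0] * w, box[:, 1] * h, box[:, 2] * 0.5 * min(h, w)
    crops = []
    for i in range(n):
        x0 = int(torch.clamp(cx[i] - hl[i], 0, w - 2))
        x1 = int(torch.clamp(cx[i] + hl[i], x0 + 1, w))
        y0 = int(torch.clamp(cy[i] - hl[i], 0, h - 2))
        y1 = int(torch.clamp(cy[i] + hl[i], y0 + 1, h))
        patch = img[i : i + 1, :, y0:y1, x0:x1]
        crops.append(F.interpolate(patch, size=out_size, mode="bilinear",
                                   align_corners=False))
    return torch.cat(crops, dim=0)


class RACNNSketch(nn.Module):
    """Three scales chained by crop-and-zoom, from coarse to fine."""

    def __init__(self, num_classes: int = 200, img_size: int = 224):
        super().__init__()
        self.scales = nn.ModuleList(ScaleNet(num_classes) for _ in range(3))
        self.img_size = img_size

    def forward(self, x):
        logits = []
        for scale in self.scales:
            y, box = scale(x)
            logits.append(y)
            x = crop_and_zoom(x, box, (self.img_size, self.img_size))
        return logits  # one prediction per scale


if __name__ == "__main__":
    model = RACNNSketch()
    preds = model(torch.randn(2, 3, 224, 224))
    print([p.shape for p in preds])  # three (2, 200) logit tensors
```

In this sketch each scale produces its own class prediction, so scale-wise predictions can be combined (e.g., averaged) at test time, mirroring the way the attended region becomes progressively finer from one scale to the next.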
