Abstract
In the current era of digital communication, the use of images is growing exponentially, since they are one of the best ways of expressing, sharing, and memorizing knowledge. Indeed, images are used in various real-world applications, such as biology, medical diagnosis, space research, and remote sensing. However, finding the most relevant images that meet users' needs is a challenging task, especially when the search is performed over gigantic collections of images. This has led to the emergence of numerous image retrieval studies over the past two decades. Typically, research in this area has focused on Content-Based Image Retrieval (CBIR). However, extensive research has shown that there is a "semantic gap" between the visual information captured by imaging devices and the image semantics understood by humans. As an alternative, researchers' efforts have been oriented towards Text-Based Image Retrieval (TBIR). Indeed, TBIR is a typical method that helps bridge the "semantic gap" between low-level image features and high-level image semantics. Its policy consists in associating textual descriptions with images, and these descriptions then serve as the targets of search queries. In this paper, we analyze various image annotation methods, namely visual content-based and users' tags-based image annotation methods. In particular, we focus on visual content-based image annotation techniques, since they constitute one of the most dynamic research fields nowadays.