Abstract

The number and variety of digital image collections have grown rapidly in recent years. Searching and retrieving the required images from large collections is a difficult task, and effective retrieval methods have been developed to make the search and retrieval process efficient. Most content-based image retrieval (CBIR) methods use various visual features to retrieve images from databases. Content-based medical image retrieval (CBMIR), like any CBIR method, is a technique for retrieving medical images on the basis of automatically derived image features, such as colour and texture. Although a number of methods and approaches have been suggested, retrieval performance remains one of the most challenging problems in current CBMIR studies because of the well-known 'semantic gap' between machine-captured low-level image features and human-perceived high-level semantic concepts. Much research has been devoted to bridging this gap. This study proposes two query expansion methods to improve the precision of the retrieval model; both depend on the top-ranked images. The first method reformulates the expanded query image from the mean values of the features of the top-ranked images, while the second is based on selecting only the most important features. The expansion process was applied to eighteen colour features and twelve texture features extracted from two common medical image datasets, Kvasir and PH2. Experimental results show that the proposed methods perform better, achieving a precision of 95.8% and 80.7% on the Kvasir and PH2 datasets, respectively.
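
To make the two expansion strategies concrete, the following sketch assumes each image is already represented by a fixed-length feature vector (for example, the eighteen colour and twelve texture features mentioned above). The ranking metric (Euclidean distance), the value of k, and the feature-selection criterion (keeping the features with the lowest variance across the top-ranked results) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def rank(query, db, k=10):
    """Return indices of the k database feature vectors closest to `query`."""
    dists = np.linalg.norm(db - query, axis=1)
    return np.argsort(dists)[:k]

def expand_by_mean(query, db, k=10):
    """First method: reformulate the query as the mean of the original query
    vector and the feature vectors of its top-k retrieved images."""
    top = rank(query, db, k)
    return np.vstack([query, db[top]]).mean(axis=0)

def expand_by_feature_selection(query, db, k=10, n_keep=15):
    """Second method (sketch): keep only the n_keep most stable features
    (lowest variance across the top-k results); this stability criterion is
    an assumption, not taken from the paper."""
    top = rank(query, db, k)
    selected = np.argsort(db[top].var(axis=0))[:n_keep]
    expanded = np.vstack([query, db[top]])[:, selected].mean(axis=0)
    return expanded, selected
```

A second retrieval pass would then call rank(expanded, db, k) for the mean-expanded query, or rank(expanded, db[:, selected], k) for the feature-selection variant.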

Highlights

  • Imaging is a fundamental component of clinical medicine, and is commonly used for diagnosis [1], care and treatment planning [2], and patient response assessment [3]

  • Experiments were performed and evaluated: colour retrieval based on colour features only (CLR) was compared with our two proposed methods, colour expansion based on mean retrieval (CLEBMR) and colour expansion based on feature selection retrieval (CLEBFSR)

  • For texture features there are likewise three experiments: texture retrieval based on texture features only (TXR), texture expansion based on mean retrieval (TXEBMR), and texture expansion based on feature selection retrieval (TXEBFSR); precision for each variant is computed as sketched after this list
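
As a hedged illustration of how the six retrieval variants above could be scored, the sketch below computes average precision@k against class labels, reusing the rank and expand_by_mean helpers from the earlier sketch; the exact evaluation protocol behind the reported 95.8% and 80.7% figures (value of k, definition of a relevant image) is not specified here and is assumed.

```python
import numpy as np

def precision_at_k(retrieved_labels, query_label, k=10):
    """Fraction of the top-k retrieved images sharing the query's class label."""
    return float(np.mean(np.asarray(retrieved_labels)[:k] == query_label))

def evaluate(queries, query_labels, db, db_labels, expand=None, k=10):
    """Mean precision@k over all queries; `expand` may be e.g. expand_by_mean
    to score CLEBMR/TXEBMR instead of the CLR/TXR baselines."""
    db_labels = np.asarray(db_labels)
    scores = []
    for q, label in zip(queries, query_labels):
        if expand is not None:
            q = expand(q, db, k)          # reformulate the query first
        top = rank(q, db, k)              # final ranking on the (expanded) query
        scores.append(precision_at_k(db_labels[top], label, k))
    return float(np.mean(scores))
```

The feature-selection variants (CLEBFSR/TXEBFSR) would additionally need the selected column indices from expand_by_feature_selection so that both the query and the database are restricted to those features before the final rank call.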

Summary

RESEARCH BACKGROUND

Imaging is a fundamental component of clinical medicine and is commonly used for diagnosis [1], care and treatment planning [2], and patient response assessment [3]. The idea of image similarity has significant medical implications because diagnostic decision-making has historically drawn on both image and non-image information from a patient. Digital images have become increasingly popular across many sectors, including medicine, science, and education. Since CBMIR is a branch of CBIR, it benefits from the advantages and developments of the many methods used in CBIR. Kumara et al. [4] developed a method for the automatic semantic annotation of medical images that leverages content-based image retrieval (CBIR) techniques; they extended CBIR methods to automatically annotate liver CT images. Combining local texture information derived from more than one texture descriptor, as proposed in [5], has been widely used in CBMIR.
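
As a minimal sketch of what combining more than one local texture descriptor can look like (the specific descriptors used in [5] are not reproduced here), the example below concatenates a uniform LBP histogram with a few GLCM statistics using scikit-image (>= 0.19); the descriptor choices and parameters are assumptions for illustration only.

```python
import numpy as np
from skimage.feature import local_binary_pattern, graycomatrix, graycoprops

def texture_descriptor(gray):
    """Concatenate an LBP histogram with GLCM statistics for a 2-D uint8 image.

    The descriptor combination (uniform LBP plus GLCM contrast/homogeneity/
    energy/correlation) is an illustrative choice, not the one from [5].
    """
    # Uniform LBP with 8 neighbours at radius 1, summarised as a 10-bin histogram.
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)

    # Grey-level co-occurrence matrix at distance 1 over four orientations.
    glcm = graycomatrix(gray, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    glcm_feats = [graycoprops(glcm, p).mean()
                  for p in ("contrast", "homogeneity", "energy", "correlation")]

    return np.concatenate([lbp_hist, glcm_feats])
```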
