Multimodal medical image registration aligns images of the same anatomy acquired by different imaging devices, allowing their complementary information to be fused. Beyond its theoretical significance, it has practical applications in disease diagnosis, patient follow-up, surgical navigation, and radiology. Determining the optimal geometric transformation parameters is a major challenge in this field. To address it, we propose the opposition-based Hunger Games Search (OHGS) algorithm, an improved version of the Hunger Games Search (HGS) algorithm. OHGS searches for the translation, scaling, and shearing parameters that maximize the similarity between the images being registered. By incorporating opposition-based learning (OBL) and dynamically adjusting its mutation criteria, OHGS gains stronger exploration capabilities and a better ability to escape local optima. In this study, OHGS was compared with 9 recently developed algorithms on 23 well-known benchmark functions and 10 single-objective optimization functions from IEEE CEC2020. Furthermore, to achieve higher registration accuracy and optimal geometric alignment of multimodal medical images, OHGS and 8 other algorithms were tested on the RIRE and BraTS datasets, optimizing two similarity measures: Normalized Mutual Information (NMI) and the Structural Similarity Index Measure (SSIM). The results show that OHGS significantly improves performance on both the benchmark functions and the multimodal medical image registration problems, thereby extending the application range of the HGS algorithm.
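To make the opposition-based learning (OBL) idea concrete: for a candidate solution x within bounds [lb, ub], its opposite point is lb + ub − x, and a typical OBL step keeps whichever of the pair scores better. The following is a minimal sketch of that generic mechanism only; the function names and the selection rule are illustrative assumptions, not the paper's exact OHGS formulation.

```python
import numpy as np

def opposite_population(pop, lb, ub):
    """Opposition-based learning: for each candidate x in [lb, ub],
    compute its opposite point x_opp = lb + ub - x (element-wise)."""
    return lb + ub - pop

def obl_select(pop, fitness_fn, lb, ub):
    """Illustrative OBL selection step (assumption, not the paper's exact
    rule): evaluate each candidate and its opposite, and keep whichever
    has the lower objective value (minimization)."""
    opp = opposite_population(pop, lb, ub)
    f_pop = np.apply_along_axis(fitness_fn, 1, pop)
    f_opp = np.apply_along_axis(fitness_fn, 1, opp)
    keep_opp = f_opp < f_pop
    return np.where(keep_opp[:, None], opp, pop)

# Toy usage: sphere function on [-5, 5]^2
rng = np.random.default_rng(0)
lb, ub = -5.0, 5.0
pop = rng.uniform(lb, ub, size=(4, 2))
sphere = lambda x: float(np.sum(x**2))
new_pop = obl_select(pop, sphere, lb, ub)
```

In a registration setting, each candidate vector would hold the transformation parameters (translation, scaling, shearing) and the objective would be the negated similarity measure (NMI or SSIM) between the moving and fixed images.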