Abstract. Owing to nonlinear radiation distortion and geometric deformation, multimodal image matching remains a challenging task. To address these issues, this paper proposes radiation-invariant phase correlation (RIPC), a method that simultaneously estimates the rotation, scale, and displacement between multimodal image pairs. First, building on the local structural characteristics of the image itself, we exploit the nonlinear invariance of kernel canonical correlation analysis to construct a multimodal local self-correlation (MLSC) descriptor. This descriptor is robust to nonlinear radiometric differences as well as local rotation and scale variations. Second, we apply a log-polar transformation to capture the global rotation and scale changes of the image, so that these factors can be represented independently as translations in the Cartesian coordinate system. Finally, drawing on the continuity between displacement estimation and rotation-scale estimation, we construct a five-dimensional descriptor tailored for phase correlation. Extensive experiments on five open-source datasets demonstrate that the proposed method surpasses state-of-the-art (SOTA) techniques in matching performance. Furthermore, RIPC achieves matching accuracy within a 2-pixel threshold, underscoring its effectiveness for multimodal remote sensing image matching.
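The core idea referenced in the abstract, that a log-polar transformation turns global rotation and scale into translations that phase correlation can recover, follows the classical Fourier-Mellin pipeline. The sketch below is a minimal illustration of that general idea using OpenCV and NumPy; it is not the authors' RIPC method (which operates on the MLSC descriptor and a five-dimensional representation), and the function name and parameter choices are assumptions for demonstration only.

```python
import numpy as np
import cv2


def estimate_rotation_scale(img1, img2):
    """Illustrative Fourier-Mellin-style estimate (not RIPC): rotation and
    scale between two grayscale images become shifts in log-polar space."""
    # Magnitude spectra are invariant to translation, so they isolate
    # the rotation and scale components of the geometric transform.
    f1 = np.abs(np.fft.fftshift(np.fft.fft2(img1)))
    f2 = np.abs(np.fft.fftshift(np.fft.fft2(img2)))

    h, w = img1.shape
    center = (w / 2.0, h / 2.0)
    max_radius = min(h, w) / 2.0
    flags = cv2.INTER_LINEAR + cv2.WARP_POLAR_LOG

    # Log-polar resampling: rows index angle, columns index log-radius.
    lp1 = cv2.warpPolar(np.float32(f1), (w, h), center, max_radius, flags)
    lp2 = cv2.warpPolar(np.float32(f2), (w, h), center, max_radius, flags)

    # Phase correlation on the log-polar spectra:
    # vertical shift -> rotation angle, horizontal shift -> log(scale).
    (dx, dy), _ = cv2.phaseCorrelate(np.float64(lp1), np.float64(lp2))
    rotation_deg = 360.0 * dy / h
    scale = np.exp(dx * np.log(max_radius) / w)
    return rotation_deg, scale
```

Once rotation and scale are compensated, a second phase correlation between the aligned images yields the residual displacement; RIPC differs in that it performs these steps on radiation-invariant MLSC descriptor maps rather than raw intensities.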