Abstract
In real-world scenarios, challenges such as viewpoint changes, lighting variations, and blur are pervasive in image registration, making it exceedingly difficult to establish correspondences between images rapidly and accurately. Existing encoder-based methods often entail large parameter counts and long execution times. Moreover, reliable keypoint registration on low-texture images is particularly challenging, further increasing registration cost. To address these issues, this study proposes a novel Efficient Feature Registration Network (EFRNet) that integrates low-frequency texture information with high-frequency contour information to achieve efficient and stable feature registration. The proposed EFRNet offers several significant advantages. First, the introduced Free Receptive Field (FreeRF) module dynamically allocates extractors for each feature map and discards irrelevant feature information, facilitating the extraction of features at different frequencies and keeping the model lightweight. Second, the Add Edge Margin Global Feature Registrant (AemGFR), built on a cross-attention architecture, flexibly adjusts the feature interaction process to meet the requirements of accurate registration. Moreover, by introducing Gaussian-distributed edge margin values, AemGFR accelerates model convergence and improves overall efficiency. Throughout the network, the FreeRF module is fully exploited to further enhance feature extraction capability while reducing the parameter count at test time, achieving lightweight feature registration. Extensive experiments demonstrate that EFRNet exhibits significant advantages over state-of-the-art feature registration methods on three challenging datasets.
Importantly, our method achieves faster inference speeds across different sensor devices.
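To make the cross-attention idea behind AemGFR concrete, the sketch below shows a minimal, single-head cross-attention step in which the keypoint descriptors of one image attend to those of the other. This is a hypothetical illustration only: the function name, the absence of learned projections, and the omission of the paper's Gaussian edge margin term are all simplifying assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(feats_a, feats_b):
    """Descriptors of image A (Na, d) attend to descriptors of image B (Nb, d).

    Single head, no learned projections -- a simplified stand-in for the
    cross-attention interaction inside a registrant module like AemGFR.
    """
    d = feats_a.shape[-1]
    scores = feats_a @ feats_b.T / np.sqrt(d)   # (Na, Nb) scaled similarities
    weights = softmax(scores, axis=-1)          # each A keypoint distributes attention over B
    return weights @ feats_b                    # aggregated B context per A keypoint

# Usage: 100 and 120 keypoints with 64-dim descriptors
rng = np.random.default_rng(0)
a = rng.standard_normal((100, 64))
b = rng.standard_normal((120, 64))
out = cross_attention(a, b)
print(out.shape)  # (100, 64)
```

In a full registrant, the attended features would pass through learned query/key/value projections and feed a matching layer that scores candidate correspondences; the paper's edge margin term would additionally bias these attention scores.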