Abstract

In this study, a robot arm positioning method based on projection transformation is proposed for the science and engineering applications of the lunar rover in the Chang'E-3 project; the method consists of three steps. First, the spherical projection model of the fish-eye camera was transformed into a perspective projection model so that the original fish-eye images could be converted into epipolar images. Second, correlation matching and least-squares matching were used to match the two epipolar images at the sub-pixel level. Third, the target-point coordinates were calculated by forward intersection, and the normals to the target surface were obtained by fitting a local surface to the surrounding points. To validate the method, an image-based point-positioning experiment was conducted in a simulated lunar environment. The experimental results show that the method fully meets the detection accuracy required for the Chang'E-3 lunar rover.
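
As a minimal sketch of the third step only, the following Python snippet illustrates one standard way to perform forward intersection (linear triangulation via SVD) of a matched point pair and to estimate a local surface normal by plane fitting. The function names, the use of NumPy, and the DLT formulation are assumptions for illustration and are not the authors' implementation.

```python
import numpy as np

def forward_intersection(P1, P2, x1, x2):
    """Triangulate one matched point pair from two epipolar images.

    P1, P2 : 3x4 camera projection matrices (assumed known from calibration).
    x1, x2 : (u, v) pixel coordinates of the matched point in each image.
    Returns the 3D point in the world frame (Euclidean coordinates).
    """
    # Each image observation contributes two linear constraints on the
    # homogeneous 3D point X: u * P[2] - P[0] and v * P[2] - P[1].
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The least-squares solution is the right singular vector associated
    # with the smallest singular value of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def surface_normal(neighbor_points):
    """Estimate a unit normal by fitting a plane to surrounding 3D points.

    neighbor_points : (N, 3) array of points around the target point.
    """
    centered = neighbor_points - neighbor_points.mean(axis=0)
    # The plane normal is the direction of least variance of the local cloud.
    _, _, Vt = np.linalg.svd(centered)
    return Vt[-1]
```

In practice, the target-point coordinate would be obtained by calling forward_intersection on each sub-pixel match, after which surface_normal would be applied to the triangulated points in a small neighborhood of the target.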
