Abstract

Measuring the pose of non-cooperative targets in space is a critical supporting technology for space-debris removal and on-orbit object recovery. However, most existing methods have been validated only through simulation experiments under good lighting and perform poorly in dark lighting environments. To address this, a target pose measurement method based on binocular vision is proposed that is suitable for dark lighting environments. First, the traditional features from accelerated segment test (FAST) algorithm is improved to reduce the influence of illumination on feature-point extraction across various target poses. Point features and line features are combined so that image features can be extracted more reliably in dark lighting environments while retaining the high accuracy of point-feature-based pose measurement. Second, the normalized cross-correlation (NCC) matching method is combined with the epipolar constraint to narrow the search for matching points from the two-dimensional image plane to the epipolar line, which substantially improves the efficiency and accuracy of the matching algorithm. Finally, feature-matching post-processing is performed to reduce the probability of mismatches. Simulation and physical experiment results show that the method stably extracts features and obtains high-precision target pose information in both well-illuminated and dark lighting environments, making it suitable for high-precision target pose measurement under insufficient illumination.
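The paper provides no code; the following is a minimal Python/OpenCV sketch of the epipolar-constrained NCC matching step the abstract describes, assuming rectified grayscale stereo images (after rectification, the epipolar line of a left-image point is the same row in the right image). The window size, disparity range, and 0.8 acceptance threshold are illustrative assumptions, not values from the paper.

```python
import cv2
import numpy as np

def ncc_match_along_epipolar(left, right, pt, win=11, max_disp=64):
    """Match a feature point from the rectified left image into the right
    image by maximising normalised cross-correlation (NCC) along the
    corresponding epipolar line, i.e. the same image row.

    left, right: rectified grayscale images (uint8 ndarrays).
    pt: (x, y) feature location in the left image.
    Returns the matched (x, y) in the right image, or None on rejection.
    """
    half = win // 2
    x, y = int(pt[0]), int(pt[1])
    # Template window around the feature point in the left image.
    tpl = left[y - half:y + half + 1, x - half:x + half + 1]
    # Search only a strip of the same row in the right image: the
    # epipolar constraint reduces the 2-D search to this 1-D segment.
    x0 = max(half, x - max_disp)
    strip = right[y - half:y + half + 1, x0 - half:x + half + 1]
    if tpl.shape != (win, win) or strip.shape[1] < win:
        return None  # window falls outside the image border
    # TM_CCOEFF_NORMED computes the NCC score at each strip position.
    scores = cv2.matchTemplate(strip, tpl, cv2.TM_CCOEFF_NORMED)
    best = int(np.argmax(scores))
    if scores[0, best] < 0.8:  # assumed acceptance threshold
        return None            # reject low-correlation (likely mismatched) pairs
    return (x0 + best, y)      # window centre of the best match
```

In a pipeline like the one the abstract outlines, each detected feature (e.g., an improved-FAST corner or a line-feature endpoint) from the left image would be passed through such a matcher, and pairs that fail the correlation threshold would be discarded during the mismatch post-processing step.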
