Abstract

Increased mobile autonomy is a vital requisite for future planetary exploration rovers. Stereo vision is a key enabling technology in this regard, as it can passively reconstruct the surroundings of a rover in three dimensions and facilitate the selection of science targets and the planning of safe routes. Nonetheless, accurate dense stereo algorithms are computationally demanding. When executed on the low-performance, radiation-hardened CPUs typically installed on rovers, slow stereo processing severely limits the driving speed and hence the science that can be conducted in situ. Aiming to decrease execution time while increasing the accuracy of stereo vision embedded in future rovers, this article proposes HW/SW co-design and acceleration on resource-constrained, space-grade FPGAs. In a top-down approach, we develop a stereo algorithm based on the space sweep paradigm, design its parallel HW architecture, implement it in VHDL, and demonstrate feasible solutions even on small-sized devices via our multi-FPGA partitioning methodology. To meet all cost, accuracy, and speed requirements set by the European Space Agency for this system, we customize our HW/SW co-processor through design space exploration and testing on a Mars-like dataset. Implemented on Xilinx Virtex or European NG-MEDIUM devices, the FPGA kernel processes a 1,120 × 1,120 stereo pair in 1.7–3.1 s, utilizing only 5.4–9.3 LUT6 and 200–312 RAMB18. The proposed system achieves speedups of up to 32× over desktop CPUs and 2,810× over the space-grade LEON3, with a mean reconstruction error of less than 2 cm for depths up to 4 m. Excluding errors exceeding 2 cm (which account for less than 4% of the total), the mean error is under 8 mm.
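To make the space-sweep idea concrete: a set of candidate depth planes is swept through the scene, the second image is warped onto each plane, a window-based matching cost is aggregated, and every pixel keeps the depth of its lowest-cost plane. The sketch below is a minimal software illustration of that loop, not the paper's VHDL kernel; the rectified-pair setup, the SAD cost with a 7×7 box window, and the depth-to-disparity mapping d = f·B/Z are generic assumptions.

```python
# Illustrative plane/space-sweep stereo sketch -- NOT the paper's FPGA kernel.
# Assumes a rectified grayscale pair; SAD cost, window size, and the
# depth-to-disparity mapping d = f*B/Z are generic choices, not the authors'.
import numpy as np
from scipy.ndimage import uniform_filter

def sweep_stereo(left, right, depths, focal_px, baseline_m, win=7):
    """Per-pixel depth via winner-take-all over a sweep of fronto-parallel planes."""
    left = np.asarray(left, dtype=np.float32)
    right = np.asarray(right, dtype=np.float32)
    best_cost = np.full(left.shape, np.inf, dtype=np.float32)
    best_depth = np.zeros(left.shape, dtype=np.float32)
    for z in depths:
        d = int(round(focal_px * baseline_m / z))   # disparity induced by the plane at depth z
        warped = np.roll(right, d, axis=1)          # warp right view onto that plane (borders wrap)
        cost = uniform_filter(np.abs(left - warped), size=win)  # windowed SAD aggregation
        better = cost < best_cost                   # keep the cheapest plane seen so far
        best_cost[better] = cost[better]
        best_depth[better] = z
    return best_depth
```

A toy call such as `sweep_stereo(left, right, np.linspace(0.5, 4.0, 64), focal_px=1000.0, baseline_m=0.1)` evaluates 64 depth hypotheses between 0.5 m and 4 m; the focal length and baseline values are arbitrary placeholders, not parameters taken from the paper.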
