Abstract

This letter demonstrates the feasibility of near real-time, plenoptic-inertial navigation on a low-cost central processing unit (CPU). To enable real-time operation, a standard plenoptic camera was modeled as a system of stereo cameras and triangulation was used to estimate its pose from a minimal set of subaperture images. The relationship between distance and disparity for the simplified model was experimentally validated in an aquatic environment, using a first-generation Lytro camera, and a mean error of 2% of the target distance was obtained. This letter culminates with testing the proposed navigation system on an in-house developed, novel, biologically inspired, autonomous underwater vehicle (AUV), CephaloBot. The test consisted of the AUV rotating around a static object while maintaining a fixed separation distance. The mean position error from the test was 2.5% of the target distance. With the simplified plenoptic model, only 750 ms were required to process the raw plenoptic data and estimate position on an Intel i5 CPU. The processing delay was short enough that the delayed position measurements bounded the effects of sensor drift when fused with an inertial measurement unit using a delayed extended Kalman filter. This result demonstrates the feasibility of plenoptic-inertial navigation on a low-cost CPU.
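The abstract's simplified model treats the plenoptic camera's subaperture images as a rectified stereo pair, so depth follows the standard triangulation relation Z = f·B/d. The sketch below illustrates only that generic relation; the focal length, baseline, and disparity values are illustrative assumptions, not parameters reported in the letter.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulated depth Z = f * B / d for a rectified stereo pair.

    focal_px    -- focal length in pixels
    baseline_m  -- separation between the two (sub)apertures in meters
    disparity_px -- horizontal pixel disparity of a matched feature
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the camera")
    return focal_px * baseline_m / disparity_px


# Illustrative values only: 800 px focal length, 1 mm subaperture baseline, 2 px disparity
print(depth_from_disparity(800.0, 0.001, 2.0))  # -> 0.4 (meters)
```

Because depth is inversely proportional to disparity, small disparity errors translate into larger depth errors at range, which is consistent with reporting accuracy as a percentage of target distance.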
