Abstract

Monitoring aquatic environments is of great interest for ecosystem health, marine life, and human health. This article presents the design and implementation of Samba—an aquatic surveillance robot that integrates an off-the-shelf Android smartphone with a robotic fish to monitor harmful aquatic processes such as oil spills and harmful algal blooms. Using the smartphone's built-in camera, Samba can detect spatially dispersed aquatic processes in dynamic and complex environments. To reduce the excessive false alarms caused by non-water areas (e.g., trees on the shore), Samba segments the captured images and performs target detection in the identified water area only. However, a major challenge in the design of Samba is the high energy consumption resulting from continuous image segmentation. We propose a novel approach that leverages the power-efficient inertial sensors on smartphones to assist image processing. In particular, based on learned mapping models between inertial and visual features, Samba uses real-time inertial sensor readings to estimate the visual features that guide image segmentation, significantly reducing energy consumption and computation overhead. Samba also features a set of lightweight and robust computer vision algorithms that detect harmful aquatic processes based on their distinctive color features. Finally, Samba employs a feedback-based rotation control algorithm to adapt to the spatiotemporal development of the target aquatic process. We have implemented a Samba prototype and evaluated it through extensive field experiments, lab experiments, and trace-driven simulations. The results show that Samba achieves a 94% detection rate, a 5% false alarm rate, and a lifetime of up to nearly 2 months.
