Abstract
In this article, an autonomous robotic fish is designed for underwater operations such as object detection and tracking along with collision avoidance. The computer-aided design model for the prototype robotic fish is created in SolidWorks® and exported as a stereolithography (STL) file to a MakerBot 3D printer, which manufactures the parts of the robotic fish from polylactic acid thermoplastic polymer. Precise maneuverability of the robotic fish is achieved through propulsion by a caudal fin, whose oscillation is controlled by a servomotor. A combination of visual and ultrasonic sensors is used to track the position and distance of the desired object relative to the fish and to avoid obstacles. The robotic fish can detect an object up to a distance of 90 cm under normal exposure conditions. A computational fluid dynamics analysis is conducted to analyze the hydrodynamics (water flow rate and pressure) around the hull of the robotic fish and the drag force acting on it. A series of experimental results demonstrates the effectiveness of the designed underwater robotic fish.
Highlights
In recent years, bioinspired underwater vehicles have become an increasingly active research topic in the field of ocean engineering
Fish-like robots have attracted the attention of the research community because of their great advantages over conventional propeller-driven underwater robots, such as high efficiency, extreme swiftness, and station-holding ability.[1]
The primary water test of the robotic fish has been conducted in the laboratory by using an experimental water tank
Summary
In recent years, bioinspired underwater vehicles have become an increasingly active research topic in the field of ocean engineering. The maximum pressure recorded is 136.950 Pa at the region where the water stagnates, while the minimum is −274.544 Pa. For autonomous underwater operation, the robotic fish must be capable of promptly detecting hurdles in its path, making quick and satisfactory decisions, and adopting a suitable path to bypass these hurdles for precise navigation.[31,32] The object is detected through a vision-based Pixy CMUcam5 sensor along with the ultrasonic sensor. If the desired object comes in front, the position of the robotic fish is adjusted according to the x-position of the object obtained from the image sensor, which uses a hue- and saturation-based algorithm to recognize the color and size (block) of the dummy object
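The tracking and avoidance behavior described above can be sketched as a simple decision rule: steer toward the x-position of the detected color block unless the ultrasonic range indicates an obstacle that must be bypassed. The following is a minimal illustrative sketch, not the authors' implementation; the frame width, dead band, and avoidance threshold are assumed values (only the 90 cm detection range comes from the article).

```python
FRAME_WIDTH = 320          # assumed horizontal resolution of the Pixy image sensor (pixels)
DETECTION_RANGE_CM = 90    # maximum reliable detection distance (from the article)
AVOID_THRESHOLD_CM = 20    # assumed safety distance, not specified in the source
DEAD_BAND_PX = 20          # assumed tolerance around the frame center

def steering_command(block_x, distance_cm):
    """Return a caudal-fin steering command.

    block_x     -- x-position of the detected color block in pixels, or None
    distance_cm -- range to the nearest object from the ultrasonic sensor
    """
    if distance_cm < AVOID_THRESHOLD_CM:
        return "avoid"        # obstacle too close: turn away to bypass it
    if block_x is None or distance_cm > DETECTION_RANGE_CM:
        return "search"       # nothing tracked: keep scanning
    center = FRAME_WIDTH / 2
    if block_x < center - DEAD_BAND_PX:
        return "turn_left"    # object left of center: bias fin oscillation left
    if block_x > center + DEAD_BAND_PX:
        return "turn_right"   # object right of center: bias fin oscillation right
    return "straight"         # object centered: swim ahead
```

In a real controller this command would set the offset angle of the servomotor driving the caudal fin on each oscillation cycle.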