Abstract

The COVID-19 pandemic necessitated social distancing measures and limited physical contact, prompting the exploration of alternative methods for tasks such as object delivery. Mobile service robots emerged as a potential solution, offering a bridge between humans and such tasks. Existing techniques enable robots to deliver objects in an end-to-end manner, but each comes with limitations: grippers can deliver only one object per round, cabinet robots require manual speed tuning to keep the object in place, and object holders lack generalizability. Inspired by the way humans use a tray to deliver objects, we developed the Visual-Based Adaptive Interaction System (VAIS), a novel learning system that improves service delivery using visual information and a fast neural learning mechanism. Within this system, the robot learns the optimal angular (rotational) and linear (translational) moving speeds to transport objects placed on a tray without an extra holder. The robot validates these learnt movements by successfully completing multiple-object delivery tasks along designated routes. The results show that, after a few attempts of online learning, the robot can determine its proper moving speeds and deliver different objects to a given location.
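The abstract describes the core idea only at a high level: the robot adjusts its linear and angular speed limits from visual feedback about how much the object shifts on the tray during each attempt. The snippet below is a minimal, hypothetical sketch of that kind of online speed adaptation; it is not the authors' VAIS implementation, and all names, thresholds, and the multiplicative update rule are illustrative assumptions.

```python
# Hypothetical sketch (not the paper's VAIS method): adapt a robot's linear and
# angular speed limits from visually measured object slip on a tray.
# All parameter names and values below are assumptions for illustration only.

from dataclasses import dataclass


@dataclass
class SpeedLimits:
    linear: float   # m/s
    angular: float  # rad/s


def adapt_speeds(limits: SpeedLimits,
                 slip_px: float,
                 slip_tolerance_px: float = 5.0,
                 decay: float = 0.8,
                 growth: float = 1.05,
                 min_linear: float = 0.05,
                 min_angular: float = 0.1) -> SpeedLimits:
    """Shrink the speed limits after an attempt in which the object slipped
    farther than the tolerance (slip measured in image pixels by a camera
    watching the tray); otherwise cautiously grow them toward faster delivery."""
    if slip_px > slip_tolerance_px:
        return SpeedLimits(max(limits.linear * decay, min_linear),
                           max(limits.angular * decay, min_angular))
    return SpeedLimits(limits.linear * growth, limits.angular * growth)


if __name__ == "__main__":
    limits = SpeedLimits(linear=0.5, angular=1.0)
    # Simulated per-attempt slip measurements (pixels) over a few delivery trials.
    for slip in [12.0, 8.0, 4.0, 3.0]:
        limits = adapt_speeds(limits, slip)
        print(f"slip={slip:4.1f}px -> linear={limits.linear:.3f} m/s, "
              f"angular={limits.angular:.3f} rad/s")
```

In this toy version, a few high-slip attempts quickly lower the speed limits, mirroring the abstract's claim that a few online-learning attempts suffice to find a proper moving speed.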
