Abstract
Activities of Daily Living (ADLs), especially activities around the head such as drinking, feeding, shaving, and oral care, are of great importance for an independent quality of life. Assistive robots for ADLs are developed to perform tasks for which a disabled person would otherwise frequently need a caregiver. Existing assistive robotic systems are either manually controlled or controlled via a shared control scheme, and are thus not fully autonomous. This paper proposes an autonomous assistive robotic system that performs the drinking task for disabled people. The system employs a UR-10 6-DOF manipulator to convey a cup of drink to the user's mouth and perform the drinking task autonomously in a natural manner. A Kinect sensor is used for online face and mouth detection, head pose tracking, and recognition of the cup's region of interest and the drink level. Robot trajectory planning and control keep the cup upright while conveying it at a constant speed to the user's detected mouth point. The planner uses the 3D data acquired from image processing and is updated online for trajectory replanning when the head and mouth pose change unintentionally. During drinking, the cup is also continuously reoriented to keep the horizontal drink level at the mouth point and to ensure that the cup remains in contact with the user's mouth. Simulation results show that the proposed system can perform the drinking assistance autonomously.
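The constant-speed approach phase described above can be sketched as a simple Cartesian waypoint generator: at each control cycle the cup is stepped toward the latest detected mouth point while its orientation is held upright. This is a minimal illustrative sketch, not the paper's implementation; the function name `next_waypoint` and the per-cycle call pattern are assumptions, and the mouth point is assumed to come from the Kinect detection pipeline.

```python
import math

def next_waypoint(cup_pos, mouth_pos, speed, dt):
    """Step the cup position toward the mouth point at constant speed.

    Only the Cartesian position is interpolated; the cup orientation is
    assumed to be held upright by the controller during the approach.
    cup_pos, mouth_pos: [x, y, z] in metres; speed in m/s; dt in seconds.
    """
    delta = [m - c for c, m in zip(cup_pos, mouth_pos)]
    dist = math.sqrt(sum(d * d for d in delta))
    step = speed * dt
    if dist <= step:  # within one step of the target: snap to it
        return list(mouth_pos)
    scale = step / dist  # unit direction scaled by the step length
    return [c + d * scale for c, d in zip(cup_pos, delta)]
```

Because the head may move unintentionally, calling such a generator every cycle with the freshly detected mouth point effects the online trajectory replanning the abstract describes: the remaining path is always a straight, constant-speed segment toward the current target.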