Abstract

In health care, a telepresence robot could be used to have a clinician or a caregiver assist seniors in their homes, without having to travel to these locations. However, the usability of these platforms for such applications requires that they can navigate and interact with a certain level of autonomy. For instance, robots should be able to return to their charging station in case of a low energy level or a telecommunication failure (a fallback behavior sketched below). The remote operator could also be assisted by the robot's ability to navigate safely in the home and to follow and track the people with whom to interact. This requires integrating autonomous decision-making capabilities on a platform equipped with appropriate sensing and action modalities, validated both in the laboratory and in real homes. To document and study these translational issues, this article presents such an integration on a Beam telepresence platform using three open-source libraries, for the integrated robot control architecture, autonomous navigation and sound processing, developed to meet real-time, limited-processing and robustness requirements so that they can work in real-life settings. Validation of the resulting platform, named SAM, is presented based on trials carried out in 10 homes. The observations made provide guidance on what to improve and will help identify interaction scenarios for upcoming usability studies with seniors, clinicians and caregivers.
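As a purely illustrative aside, the fallback behavior mentioned above (returning to the dock on low battery or loss of the teleoperation link) amounts to a simple decision rule. The sketch below is not taken from the SAM software; the function name, thresholds and battery representation are all hypothetical.

```python
import time

# Minimal sketch of the fallback behavior described above: return to the
# charging station when the battery is low or the teleoperation link is lost.
# All names and thresholds are hypothetical, not from the SAM software.

LOW_BATTERY = 0.15      # assumed fraction of full charge
LINK_TIMEOUT_S = 10.0   # assumed seconds without operator traffic


def should_return_to_dock(battery_level, last_link_time, now=None):
    """Decide whether autonomous docking should override teleoperation."""
    now = time.monotonic() if now is None else now
    link_lost = (now - last_link_time) > LINK_TIMEOUT_S
    return battery_level < LOW_BATTERY or link_lost


# Example: 8% battery, operator heard from 2 s ago -> dock anyway.
print(should_return_to_dock(0.08, time.monotonic() - 2.0))  # True
```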

Highlights

  • In health care, a telepresence robot could be used to have a clinician or a caregiver assist seniors in their homes, without having to travel to these locations

  • To facilitate the use of ODAS on various robotic platforms, we provide as open hardware two sound cards [57]: 8SoundsUSB and 16SoundsUSB, for arrays of 8 and 16 microphones, respectively (an illustrative client for ODAS output is sketched after this list)

  • The sketches provided in Appendix A are approximate representations of the real homes
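ODAS, the sound processing library named in the highlights, can stream its sound source localization and tracking results as JSON over TCP sockets. The sketch below is illustrative only and not taken from the paper: the port number, the activity threshold and the field names ("src", "id", "x", "y", "z", "activity") are assumptions based on typical ODAS configurations and may differ from the SAM setup.

```python
import json
import socket

HOST, PORT = "127.0.0.1", 9000  # hypothetical address of an ODAS socket sink


def stream_tracking_frames(host=HOST, port=PORT):
    """Yield each tracking frame (a dict) decoded from the ODAS JSON stream."""
    decoder = json.JSONDecoder()
    buffer = ""
    with socket.create_connection((host, port)) as sock:
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                return
            # Drop any null bytes used as message terminators, if present.
            buffer += chunk.replace(b"\x00", b"").decode("utf-8", errors="ignore")
            while buffer.strip():
                try:
                    frame, end = decoder.raw_decode(buffer.strip())
                except ValueError:
                    break  # incomplete JSON object, wait for more data
                buffer = buffer.strip()[end:]
                yield frame


if __name__ == "__main__":
    for frame in stream_tracking_frames():
        for src in frame.get("src", []):
            if src.get("activity", 0.0) > 0.5:  # assumed activity threshold
                print("active source", src.get("id"),
                      "at", (src.get("x"), src.get("y"), src.get("z")))
```

Such a client could, for example, feed tracked source directions to the behavior that makes the robot face and follow the person it is interacting with, as described in the abstract.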
