Abstract

Within the next few years, personal robots are expected to enter our homes, offices, schools, hospitals, construction sites, and workshops. For these robots to play a successful role in people's professional and personal lives, they need to display the kind of efficient and satisfying interaction that humans are accustomed to with one another. Designing this human-robot interaction (HRI) is a multifaceted challenge, balancing requirements of the robot's appearance and behavior. A robot's appearance evokes interaction affordances and triggers emotional responses; its behavior communicates internal states and can support action coordination and joint planning. Good HRI design should enlist both facets to enable untrained humans to work fluently and intuitively with the robot. In this talk I will present the approach we have been using over the past decade to develop several non-anthropomorphic robotic systems. The underlying principles of both appearance and behavioral design are movement, timing, and embodiment, acknowledging that human perception is highly sensitive to spatial cues, physical movement, and visual affordances. We design our robots' appearance using techniques from 3D animation, sculpture, and industrial and interaction design. Gestures and behaviors drive decisions on the robot's appearance and mechanical design. Starting from freehand sketches, we build the robot's personality as a computer-animated character, which sets the parameters and limits of the robot's degrees of freedom. Then, material and form studies are combined with functional requirements to settle on the final system design. I will exemplify this process with the design of several robots. On the behavioral side, we design around the notion of human-robot fluency: the ability to accurately mesh the robot's activity with that of a human partner. I will present computational architectures rooted in timing, joint action, and embodied cognition. Specifically, I will discuss anticipatory action for collaboration and a model of priming through perceptual simulation. Both systems have been shown to have significant effects on the fluency of a human-robot team and on humans' perception of the robot's intelligence, commitment, and even gender. I will then describe an interactive robotic improvisation system that uses embodied gestures for simultaneous, yet responsive, joint musicianship.
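To make the anticipatory-action idea concrete, the following is a minimal Python sketch of a robot that begins acting on a prediction of its partner's next step rather than waiting to observe it. The task model, transition probabilities, and function names are illustrative assumptions for this sketch only, not the architectures presented in the talk.

    # Toy illustration of anticipatory action for human-robot fluency.
    # All names and probabilities here are hypothetical assumptions.

    # A toy shared task: the human performs steps in a rough order, and
    # the robot fetches the matching tool. A reactive robot waits until
    # the human's step is observed; an anticipatory robot acts on its
    # prediction, overlapping its motion with the human's ongoing action.
    TRANSITIONS = {
        "start": {"sand": 0.7, "drill": 0.3},
        "sand":  {"paint": 0.8, "drill": 0.2},
        "drill": {"paint": 0.6, "sand": 0.4},
        "paint": {"done": 1.0},
    }

    def predict_next(prev_step):
        """Return the most likely next human step under the toy model."""
        dist = TRANSITIONS.get(prev_step, {})
        return max(dist, key=dist.get) if dist else None

    def anticipatory_robot(prev_step):
        """Start fetching the tool for the *predicted* next step."""
        guess = predict_next(prev_step)
        if guess and guess != "done":
            print(f"robot: pre-fetching tool for anticipated step '{guess}'")
        return guess

    def simulate(steps=("sand", "paint")):
        prev = "start"
        for actual in steps:
            guess = anticipatory_robot(prev)
            if guess == actual:
                print(f"human does '{actual}': tool already in hand (fluent)")
            else:
                print(f"human does '{actual}': robot recovers reactively")
            prev = actual

    if __name__ == "__main__":
        simulate()

The point of the sketch is the overlap: by committing to a preparatory action before the human's step is observed, the robot reduces idle time between turns, which is consistent with the abstract's claim that anticipatory action improves a team's measured and perceived fluency.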
