Abstract

Even without visual feedback, humans can accurately determine the shape of objects from haptic feedback alone. This feat is achieved despite considerable sensory and motor uncertainty in estimating hand pose and object location. In contrast, most neuroprosthetic hands remain unaware of the shape of the object they are manipulating and can therefore provide only limited intelligence for natural control of the hand. We present a computational model for haptic exploration and shape reconstruction derived from mobile robotics: simultaneous localisation and mapping (SLAM). This approach relies solely on knowledge of object contacts at the end-points, noisy sensory readings and motor control signals. We present a proof-of-principle accurate reconstruction of object shape (e.g. a Rubik's cube) from single-finger exploration and propose a straightforward extension to a full hand model with realistic mechanical properties. The proposed framework allows for a principled study of natural human haptic exploration and for context-aware prosthetics. In conjunction with tactile-enabled prostheses, this could enable online object recognition and pose adaptation for more natural prosthetic control.
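
To illustrate the idea, the following is a minimal sketch, not the paper's implementation, of contact-based SLAM in two dimensions: a point fingertip moves under noisy motor commands and, on contact, observes the position of a contact point on the object surface relative to itself; a linear Kalman filter then jointly estimates the fingertip pose and the map of contact points. The 2D setting, the choice of a plain Kalman filter, and all noise parameters are assumptions made for brevity.

```python
# Illustrative sketch only: a 2D point "fingertip" explores an object
# by touch. State = [finger_x, finger_y, l1_x, l1_y, ..., ln_x, ln_y].
# Motion model: noisy motor commands (odometry). Measurement model:
# position of the touched contact point relative to the fingertip.
# All dimensions and noise levels below are assumed, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

Q = 0.01 * np.eye(2)   # motor (process) noise covariance, assumed
R = 0.02 * np.eye(2)   # tactile (measurement) noise covariance, assumed

# True object: contact points on the corners of a unit square (one face)
true_landmarks = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
n = len(true_landmarks)

dim = 2 + 2 * n
mu = np.zeros(dim)                 # initial estimate: everything at origin
Sigma = 10.0 * np.eye(dim)         # large initial uncertainty over the map
Sigma[:2, :2] = 1e-6 * np.eye(2)   # fingertip starts at a known pose

true_pose = np.zeros(2)
for step in range(200):
    k = step % n                              # revisit contacts in a loop
    u = true_landmarks[k] - true_pose         # commanded motion to contact
    true_pose = true_pose + u + rng.multivariate_normal(np.zeros(2), Q)

    # Kalman prediction: only the fingertip moves, the map is static
    mu[:2] += u
    Sigma[:2, :2] += Q

    # Contact measurement: landmark position relative to the fingertip
    z = true_landmarks[k] - true_pose + rng.multivariate_normal(np.zeros(2), R)
    H = np.zeros((2, dim))
    H[:, :2] = -np.eye(2)                     # sensitivity to finger pose
    H[:, 2 + 2 * k: 4 + 2 * k] = np.eye(2)    # sensitivity to landmark k

    # Kalman update over the joint fingertip + map state
    S = H @ Sigma @ H.T + R
    K = Sigma @ H.T @ np.linalg.inv(S)
    mu = mu + K @ (z - H @ mu)
    Sigma = (np.eye(dim) - K @ H) @ Sigma

est = mu[2:].reshape(n, 2)
print("estimated contact points:\n", np.round(est, 3))
```

With repeated contacts the landmark estimates converge towards the true corner positions, recovering the object outline in the same spirit as the single-finger cube reconstruction described above; richer variants (non-linear hand kinematics, a full hand model) would replace the linear filter with an EKF or particle filter.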
