Abstract
This paper presents a contribution to Bayesian sensor fusion in the context of human demonstration for compliant motion task specification, where sensor-controlled robot systems physically interact with the environment. The goal is to learn the geometric parameters of a task and to segment the total motion executed during the human demonstration into subtasks for the robot. The motion of the human demonstration tool is sensed by measuring the positions of multiple LED markers with a 3D camera, and the interaction with the environment is sensed with a force/torque sensor inside the demonstration tool. All measurements are uncertain and do not give direct information about the geometric parameters of the contacting surfaces, or about the contact formations encountered during the human demonstration. The paper uses a Bayesian sequential Monte Carlo method (also known as a particle filter) to simultaneously estimate the contact formation (discrete information) and the geometric parameters (continuous information), where different measurement models link the information from heterogeneous sensors to the hybrid unknown parameters. The simultaneous contact formation segmentation and geometric parameter estimation are aided by the availability of a contact state graph of all possible contact formations. The presented approach applies to all compliant motion tasks involving polyhedral objects with known geometry, where the uncertain geometric parameters are the poses of the objects. The approach has been verified in real-world experiments, in which it discriminates in real time between some 250 different contact formations in the graph.
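To make the hybrid estimation concrete, the sketch below shows a minimal particle filter in which each particle carries both a discrete contact-formation label and continuous geometric parameters, with discrete transitions restricted to neighbors in a contact-state graph and weights computed from a measurement likelihood. This is an illustrative sketch under stated assumptions, not the authors' implementation: the toy graph, the six-parameter pose vector, and the Gaussian stand-in likelihood are all hypothetical placeholders.

```python
# Minimal hybrid (discrete + continuous) particle filter sketch.
# All models here are hypothetical stand-ins, not the paper's actual
# measurement models or contact-state graph.
import numpy as np

rng = np.random.default_rng(0)

N = 1000                                           # number of particles
CF_GRAPH = {0: [0, 1], 1: [0, 1, 2], 2: [1, 2]}    # toy contact-state graph

# Each particle holds a discrete contact-formation label and continuous
# geometric parameters (e.g. a 6-DOF object pose).
cf = rng.integers(0, 3, size=N)                    # contact-formation labels
theta = rng.normal(0.0, 0.05, size=(N, 6))         # pose parameters
w = np.full(N, 1.0 / N)                            # importance weights

def likelihood(z_pose, z_wrench, cf_i, theta_i):
    """Hypothetical measurement model: each contact formation implies
    different pose/force constraints. Here a Gaussian stand-in on the
    pose residual; a real model would also score the measured wrench."""
    pred = theta_i[:3] + 0.01 * cf_i               # toy per-formation prediction
    r = z_pose - pred
    return np.exp(-0.5 * (r @ r) / 0.01**2) + 1e-300

def step(z_pose, z_wrench):
    global cf, theta, w
    # 1. Propagate: the contact formation may jump to a graph neighbor;
    #    the continuous parameters follow a small random walk.
    cf = np.array([rng.choice(CF_GRAPH[int(c)]) for c in cf])
    theta = theta + rng.normal(0.0, 0.005, size=theta.shape)
    # 2. Weight each particle by the likelihood of the pose and wrench data.
    w = np.array([likelihood(z_pose, z_wrench, c, t)
                  for c, t in zip(cf, theta)])
    w /= w.sum()
    # 3. Resample to avoid weight degeneracy.
    idx = rng.choice(N, size=N, p=w)
    cf, theta, w = cf[idx], theta[idx], np.full(N, 1.0 / N)

# One filter step on synthetic measurements:
step(np.zeros(3), np.zeros(6))
print("Most likely contact formation:", np.bincount(cf).argmax())
```

Resampling after every update is the simplest scheme; a practical filter would typically resample only when the effective sample size drops below a threshold.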