Abstract
Future generations of space robots will replace astronauts in deep-space missions and routine operations. They will use tools and perform assembly, disassembly, and handling tasks for maintenance purposes. A key feature of autonomous, task-level commandable robots is a valid and complete representation of the application's task space, used to plan and optimize a rough action sequence in response to a sensorially classified situation. This paper presents such a representation for a robot-based automated material-science experiment setup and proposes an analysis method by which a valid and complete task space model can be obtained. Results of practical experiments with a terrestrial laboratory mock-up using the novel representation scheme are presented.