Abstract

Robot assembly programming has traditionally been carried out in two environments: on-line and off-line. On-line robot programming uses the actual robot to experiment with a given task; off-line robot programming develops a robot program either in an autonomous system with a high-level task planner and simulation, or through a 2D graphical user interface linked to other system components. This paper presents a whole-hand interface for performing robotic assembly tasks more easily in a virtual environment. The interface comprises both static hand shapes (states) and continuous hand motions (modes). Hand shapes are recognized as discrete states that trigger control signals and commands, while hand motions are mapped to the movements of a selected part instance during real-time assembly. Hand postures are also used to specify alignment constraints and the axis mapping between hand and part coordinates. The basic virtual-hand functions are constructed from these states and modes to develop the robotic assembly program. The assembly motion of an object is guided by the immersed user along a path such that no collisions occur, while the fine motion controlling contact and the final position/orientation is handled automatically by the system using prior knowledge of the parts and assembly reasoning. An assembly programming case using this interface is described in detail in the paper.
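The following is a minimal sketch, not taken from the paper, of the state/mode idea the abstract describes: static hand shapes act as discrete states that trigger commands, while continuous hand motion is mapped onto the selected part's movement. All names (HandShape, Command, map_motion, the shape-to-command table) are illustrative assumptions.

```python
# Illustrative sketch of a states-and-modes hand interface (hypothetical names).
from dataclasses import dataclass
from enum import Enum, auto


class HandShape(Enum):
    """Recognized static hand shapes (discrete states)."""
    POINT = auto()      # e.g. select a part instance
    FIST = auto()       # e.g. grasp the selected part
    OPEN_PALM = auto()  # e.g. release / deselect


class Command(Enum):
    SELECT = auto()
    GRASP = auto()
    RELEASE = auto()


# Discrete states: hand shapes trigger control signals and commands.
SHAPE_TO_COMMAND = {
    HandShape.POINT: Command.SELECT,
    HandShape.FIST: Command.GRASP,
    HandShape.OPEN_PALM: Command.RELEASE,
}


@dataclass
class Pose:
    """Position (x, y, z) plus a yaw angle; a stand-in for a full 6-DOF pose."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    yaw: float = 0.0


def map_motion(part: Pose, hand_delta: Pose, scale: float = 1.0) -> Pose:
    """Continuous mode: apply the hand's incremental motion to the selected part."""
    return Pose(
        part.x + scale * hand_delta.x,
        part.y + scale * hand_delta.y,
        part.z + scale * hand_delta.z,
        part.yaw + scale * hand_delta.yaw,
    )


if __name__ == "__main__":
    part = Pose()
    # A 'fist' shape issues the grasp command for the selected part...
    print(SHAPE_TO_COMMAND[HandShape.FIST])   # Command.GRASP
    # ...then continuous hand motion drags the grasped part in real time.
    part = map_motion(part, Pose(x=0.05, yaw=0.1))
    print(part)
```

In the paper's interface, the analogous fine motion for contact and final alignment is handled automatically from part knowledge and assembly reasoning rather than by the raw hand motion alone; the sketch above only covers the coarse, user-guided mapping.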
