Abstract

The central contribution of this work is the development of modules and routines within an existing server-based control station to handle the approach and docking maneuvers of mobile systems. Efficient material flow in a digitized factory environment requires that goods be loaded onto and unloaded from transport robots automatically. This demands accurate positioning of the robot at a loading station, or at a charging station to automatically recharge the robot's energy storage. Beyond these applications, robots are increasingly used for assembly work in production facilities, and this work investigates the possibility of using a transport robot as a mobile assembly platform at stations equipped with manipulators. Crucial for this is the ability to repeatedly reach a position with sub-centimeter accuracy relative to the station and to transmit the precise final position to it. The critical component is the estimation of the mobile system's pose, which is computed from camera images of artificial markers attached to the stations. Since the precision of visual localization depends strongly on how accurately the markers are detected, an optimization problem is formulated that approximates the marker edges with an edge model function and thus locates their corners more precisely. Finally, a planar continuous trajectory planner is implemented to determine the segments of a feasible path between any two poses.
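The abstract does not specify the concrete edge model function used in the optimization. A minimal sketch of the general idea, assuming a Gaussian-blurred intensity step (error function) fitted to a 1-D profile sampled perpendicular to a coarsely detected marker edge; the names `edge_model` and `fit_edge_position` are illustrative, not from the source:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

def edge_model(x, a, b, x0, sigma):
    """Smoothed intensity step: base level a, step height b,
    subpixel edge position x0, blur width sigma."""
    return a + b * 0.5 * (1.0 + erf((x - x0) / (np.sqrt(2.0) * sigma)))

def fit_edge_position(profile):
    """Fit the edge model to a 1-D intensity profile and return
    the subpixel edge position x0 within the profile."""
    x = np.arange(len(profile), dtype=float)
    p0 = [profile.min(), profile.max() - profile.min(),
          len(profile) / 2.0, 1.0]           # rough initial guess
    popt, _ = curve_fit(edge_model, x, profile, p0=p0)
    return popt[2]  # x0

# Synthetic demo: an ideal edge at x = 7.3 px, blurred and noisy.
rng = np.random.default_rng(0)
x = np.arange(16, dtype=float)
clean = edge_model(x, 20.0, 200.0, 7.3, 1.2)
noisy = clean + rng.normal(0.0, 2.0, x.size)
print(f"recovered edge position: {fit_edge_position(noisy):.3f} px")
```

Repeating such a fit along many scanlines yields subpixel points on each marker edge; straight lines fitted through these points intersect at the refined corner coordinates, which can then feed a standard marker-based pose estimation.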
