Abstract
In semi-automated manufacturing, an increasing number of intelligent mobile robots operate in close proximity to human workers. Considering the future positions of humans makes it possible to further improve the throughput of Autonomous Mobile Robots (AMRs) and Automated Guided Vehicles (AGVs). The longer the prediction horizon, i.e. the further into the near and distant future human positions can be predicted, the better a robot can adjust its route and optimize the process. This paper discusses the challenges of human motion trajectory prediction in manufacturing and presents a schedule-based approach that uses real-time schedule data obtained from Manufacturing Execution Systems (MES). Schedule-awareness in human motion trajectory prediction extends semantic mapping approaches and effectively reduces the number of probable destinations by considering which process steps come next for the goods currently in production. With a reduced set of destinations, the performance of forward-planning trajectory prediction can be improved. For evaluation, a commercial MES is used together with an Ultra-wideband-based Real-Time Locating System (RTLS) to obtain human position data. On this basis, a naive Bayes classifier utilizes MES schedule data and real-time position data to predict human motion intentions. Abstract activity modeling ensures that only a few training data sets are required for deployment, making this approach suitable for rapidly changing environments such as flexible manufacturing.
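The core idea of combining a schedule-derived prior with a naive Bayes classifier can be illustrated with a minimal sketch. All station names, the heading-angle feature, and the Gaussian parameters below are illustrative assumptions, not values or an implementation from the paper:

```python
# Hedged sketch: schedule-aware naive Bayes destination prediction.
# Station names, features, and model parameters are illustrative only.
import math

# Per-station (mean, std) of an observed feature, e.g. the worker's
# heading angle in degrees -- assumed to come from a small training set.
heading_model = {
    "assembly": (10.0, 15.0),
    "inspection": (90.0, 15.0),
    "packaging": (180.0, 15.0),
}

def gaussian_pdf(x, mean, std):
    """Likelihood of observation x under a 1-D Gaussian."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def predict_destination(heading, schedule_next_steps):
    """Posterior over destinations, restricted by the MES schedule.

    schedule_next_steps: stations that are valid next process steps for
    the goods currently in production (as reported by the MES).
    """
    # Schedule-awareness: only schedule-permitted stations get prior mass,
    # which shrinks the destination set before classification.
    candidates = {s: m for s, m in heading_model.items()
                  if s in schedule_next_steps}
    prior = 1.0 / len(candidates)  # uniform prior over permitted stations
    scores = {s: prior * gaussian_pdf(heading, *m)
              for s, m in candidates.items()}
    total = sum(scores.values())
    return {s: v / total for s, v in scores.items()}

# Worker heading roughly toward 85 degrees; the schedule rules out
# "packaging" as a next step, so it is never considered.
posterior = predict_destination(85.0, {"assembly", "inspection"})
print(max(posterior, key=posterior.get))  # prints "inspection"
```

In a real deployment the observation would be an RTLS position track rather than a single heading value, but the structure is the same: the MES schedule prunes the hypothesis set, and the classifier ranks only the remaining destinations.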