Abstract
Nowadays, smart warehouses mostly use Automated Guided Vehicles (AGVs) guided by magnetic strips or painted paths. This approach is suitable for “static slotting” warehouses and for facilities where humans do not cross paths with mobile robots. Consequently, fixed-path AGVs are not an optimal solution for dynamic slotting “chaotic storage” warehouses, where picking and delivery paths change frequently. It is therefore important to create an environment in which AGVs plan their own paths, storekeepers can see those paths and, if needed, mark restricted areas by virtual means, so that mobile robots and humans can move and stand safely around a smart warehouse. In this paper, we propose an Augmented Reality (AR) environment in which storekeepers can see an AGV's planned path and add virtual obstacles and walls to the mobile robots' cyber-physical navigation view. These virtual obstacles and walls can be used to define restricted areas for mobile robots, which can serve, for example, as safe areas for humans' and/or robots' stationary work. Finally, we introduce the system architecture supporting the proposed AR environment for safe and productive human-mobile robot interaction.
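The idea of virtual obstacles defining restricted areas can be made concrete with a minimal sketch. The class and function names below (`VirtualObstacle`, `path_is_safe`) are illustrative assumptions, not part of the paper's implementation; restricted areas are modeled here as axis-aligned rectangles on the warehouse floor, and an AGV path as a list of waypoints:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VirtualObstacle:
    """Axis-aligned restricted rectangle in warehouse floor coordinates (metres)."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def path_is_safe(path, obstacles):
    """True if no waypoint of the planned AGV path enters a restricted area."""
    return not any(ob.contains(x, y) for x, y in path for ob in obstacles)

# A storekeeper marks a zone reserved for human stationary work:
safe_zone = VirtualObstacle(2.0, 2.0, 4.0, 4.0)
path = [(0.0, 0.0), (1.0, 1.0), (3.0, 3.0)]   # planned AGV waypoints
print(path_is_safe(path, [safe_zone]))        # False: the third waypoint enters the zone
```

A real planner would also check the path segments between waypoints, not only the waypoints themselves; the sketch only shows how a virtually marked area can constrain AGV navigation.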
Highlights
According to [1], in a “traditional warehouse”, the operations of pickup, delivery, and bookkeeping are accomplished by storekeepers
We introduce an Augmented Reality (AR) solution for safe and productive Humans-Robots Interaction (HRI), where the robots are Automated Guided Vehicles (AGVs) and drones, in dynamic slotting “chaotic storage” smart warehouses; humans and robots communicate bidirectionally by means of AR mobile/wearable devices on the human side and computer vision on the AGV and drone side
The AR-Humans-Robots Interaction (HRI) system architecture design consists of three separate but interrelated applications: (i) a Core app, installed on a desktop computer, responsible for creating the “AR smart warehouse view” for both the humans and the mobile robots; (ii) an AR app, installed on a smartphone or smart-glasses, allowing the humans (the storekeepers) to see any virtual object added to the smart warehouse cyber-physical space within their perception/view of the real world; and (iii) a Robot app, installed on an AGV or drone, which uses computer vision to give the mobile robot a cyber-physical view of the smart warehouse as it moves around
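The three-application design above can be sketched as a simple publish/subscribe flow: the Core app holds the shared warehouse view and broadcasts virtual-object updates to the connected AR and Robot apps. All class names and the dictionary-based message format below are hypothetical, chosen only to illustrate the separation of responsibilities described in the paper:

```python
from typing import Callable

class CoreApp:
    """Hypothetical Core app: maintains the shared 'AR smart warehouse view'
    and pushes updates to subscribed AR and Robot apps."""
    def __init__(self):
        self.virtual_objects = []              # obstacles/walls in the cyber-physical space
        self.subscribers: list[Callable] = []

    def subscribe(self, callback: Callable) -> None:
        self.subscribers.append(callback)

    def add_virtual_object(self, obj: dict) -> None:
        self.virtual_objects.append(obj)
        for notify in self.subscribers:        # broadcast to every connected app
            notify(obj)

class ARApp:
    """Hypothetical AR app on a phone or smart-glasses: renders virtual objects
    in the storekeeper's view of the real world."""
    def __init__(self):
        self.rendered = []

    def on_update(self, obj: dict) -> None:
        self.rendered.append(obj)              # would draw the object in the AR overlay

class RobotApp:
    """Hypothetical Robot app on an AGV or drone: treats virtual obstacles and
    walls as no-go regions in its cyber-physical navigation view."""
    def __init__(self):
        self.no_go = []

    def on_update(self, obj: dict) -> None:
        if obj.get("kind") in ("obstacle", "wall"):
            self.no_go.append(obj)             # path replanning would avoid these regions

core, ar, robot = CoreApp(), ARApp(), RobotApp()
core.subscribe(ar.on_update)
core.subscribe(robot.on_update)
core.add_virtual_object({"kind": "wall", "x": 5.0, "y": 2.0, "length": 3.0})
```

In a deployment, the in-process callbacks would be replaced by network messaging between the desktop, the AR device, and the robot, but the one-to-many update flow from the Core app remains the same.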
Summary
According to [1], in a “traditional warehouse”, the operations of pickup, delivery, and bookkeeping are accomplished by (human) storekeepers. The research work and technological development described in this paper focus on the possibilities of using AR mobile/wearable devices for humans (e.g. the Augmented & Collaborative Operator 4.0 [11]) and computer vision for mobile robots such as AGVs and drones (e.g. Human-Machine Interfaces (HMIs) 4.0 [12]), in the context of dynamic slotting “chaotic storage” smart warehouses. The aim is to facilitate safe interaction and visual (control) communication between humans and mobile robots. In the following sub-sections, we introduce the proposed AR-HRI system architecture design and provide the guidelines needed for its implementation.