Modern livestock farming technologies give operators access to a multitude of data thanks to the large number of mobile and fixed sensors installed on both livestock machinery and the animals themselves. These data can be consulted via PC, tablet, and smartphone, but such devices must be held by the operators, increasing the time needed for on-field activities. In this scenario, augmented reality smart glasses could allow data to be visualized directly in the field while leaving the operator's hands free to work. Nevertheless, to visualize information on a specific animal, a connection between the augmented reality smart glasses and the electronic animal identification system is needed. Therefore, the main objective of this study was to develop and test a wearable framework, called SmartGlove, that links RFID animal tags and augmented reality smart glasses via a Bluetooth connection, allowing specific animal data to be visualized directly in the field. A further objective was to compare different levels of augmented reality technology (assisted reality vs. mixed reality) to identify the most suitable solution for livestock management scenarios. For this purpose, the developed framework and the related augmented reality smart glasses applications were tested both in the laboratory and in the field. Furthermore, the stakeholders' point of view was analyzed using two standard questionnaires, the NASA Task Load Index and the IBM Post-Study System Usability Questionnaire. The laboratory tests showed promising operating performance for the developed framework, with no significant differences compared to a commercial RFID reader. During the on-field trial, all the tested systems were able to perform the task within a short time frame. Moreover, the operators highlighted the advantages of using the SmartGlove system coupled with the augmented reality smart glasses for the direct on-field visualization of animal data.
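The abstract does not detail the SmartGlove firmware or data path, so the sketch below is only a minimal illustration of the kind of RFID-to-smart-glasses link it describes, under the assumption that the glove-side RFID module exposes read tag IDs on a serial interface and that the Bluetooth connection to the glasses is presented as a serial (SPP/RFCOMM) port. All names (ports, the `lookup_animal` helper, the herd record fields) are hypothetical and not taken from the paper.

```python
# Hypothetical sketch of an RFID-to-smart-glasses bridge (not the authors' code).
# Assumes the RFID module prints one ISO 11784/11785 EID per line on a serial
# port, and the Bluetooth link to the glasses appears as an SPP serial port.
import json
import serial  # pyserial

RFID_PORT = "/dev/ttyUSB0"      # hypothetical serial port of the RFID reader module
GLASSES_PORT = "/dev/rfcomm0"   # hypothetical Bluetooth SPP port bound to the glasses

# Hypothetical local herd register keyed by electronic ID (EID).
HERD = {
    "982000123456789": {"name": "Cow 42", "breed": "Holstein", "last_weight_kg": 612},
}

def lookup_animal(eid: str) -> dict:
    """Return the record for a scanned EID, or a minimal placeholder."""
    return HERD.get(eid, {"name": "unknown", "eid": eid})

def main() -> None:
    rfid = serial.Serial(RFID_PORT, 9600, timeout=1)
    glasses = serial.Serial(GLASSES_PORT, 115200, timeout=1)
    while True:
        line = rfid.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue  # read timeout with no tag in range
        record = lookup_animal(line)
        # Send one JSON object per line; the glasses app renders it as an overlay.
        glasses.write((json.dumps(record) + "\n").encode("ascii"))

if __name__ == "__main__":
    main()
```

A line-delimited JSON stream like this keeps the glasses-side application display-only, which fits the hands-free use case the abstract describes; the actual SmartGlove protocol and data model may differ.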