Abstract
Convolutional Neural Networks are usually fitted with manually labelled data. The labelling process is very time-consuming since large datasets are required. The use of external hardware may help in some cases, but it also introduces noise into the labelled data. In this paper, we propose a new data labelling approach based on bootstrapping to increase the accuracy of the PeTra tool. PeTra allows a mobile robot to estimate people's location in its environment using a LIDAR sensor and a Convolutional Neural Network. PeTra has some limitations in specific situations, such as scenarios with no people present. We propose to use the current PeTra release to label the LIDAR data used to fit the Convolutional Neural Network. We have evaluated the resulting system by comparing it with the previous one, in which LIDAR data were labelled with a Real Time Location System. The new release increases the MCC score by 65.97%.
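The bootstrapping idea described above can be sketched as follows: the current PeTra CNN generates pseudo-labels for raw LIDAR scans, which then serve as training data for the next model. This is only a minimal illustration; `petra_model`, `load_lidar_scans`, and the Keras-style `predict`/`fit` calls are assumed placeholders, not the paper's actual code.

```python
import numpy as np

def bootstrap_labels(petra_model, unlabelled_scans, threshold=0.5):
    """Use the current PeTra release to produce training labels for raw
    LIDAR scans, instead of relying on an external RTLS ground truth."""
    predictions = petra_model.predict(unlabelled_scans)
    # Binarise the CNN output (e.g. person / no-person occupancy maps).
    return (predictions >= threshold).astype(np.float32)

# Hypothetical usage with a Keras-style model:
# scans = load_lidar_scans("new_recordings/")        # illustrative loader
# labels = bootstrap_labels(petra_model, scans)
# new_cnn.fit(scans, labels, epochs=20, validation_split=0.1)
```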
Highlights
This paper presents a comparative study of fitting a Convolutional Neural Network (CNN) with data labelled by bootstrapping versus data labelled with an external ground-truth system.
This work compares the performance of the CNN used by the People Tracking (PeTra) tool to track people close to a mobile service robot.
The CNN has been fitted twice: the first time using training data labelled with the KIO Real Time Location System (RTLS) device, and the second time using the PeTra tool itself to label the training data.
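Since the two training regimes are compared with the MCC metric, a hedged sketch of how such an evaluation could be run is given below. The model and dataset names are placeholders, and the thresholding step is an assumption about the CNN output format.

```python
from sklearn.metrics import matthews_corrcoef

def evaluate_mcc(model, scans, ground_truth):
    """Flatten per-scan person/no-person predictions and compute the MCC."""
    predicted = (model.predict(scans) >= 0.5).astype(int).ravel()
    return matthews_corrcoef(ground_truth.astype(int).ravel(), predicted)

# Illustrative comparison of the two fitted CNNs on a common test set:
# mcc_rtls = evaluate_mcc(cnn_trained_on_rtls_labels, test_scans, test_labels)
# mcc_boot = evaluate_mcc(cnn_trained_on_petra_labels, test_scans, test_labels)
# print(f"Relative MCC change: {100 * (mcc_boot - mcc_rtls) / abs(mcc_rtls):.2f}%")
```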
Summary
Robots, especially social or assistive ones, coexist with people in the environments where they are deployed. Such robots need to be able to carry out some basic tasks: they need to know their position in the environment; they have to move from one point to another autonomously, avoiding obstacles and without damaging people or objects; and they interact with people and even collaborate with them on specific tasks. The first two skills have been extensively studied in the literature, and quite robust solutions exist today. The third is a more complex skill on which many studies are currently focusing.