Abstract

The digital factory offers great potential for future production systems in terms of efficiency and effectiveness. A key step toward realizing the digital copy of a real factory is understanding complex indoor environments on the basis of three-dimensional (3D) data. In order to generate an accurate factory model including the major components, i.e., building parts, product assets, and process details, the 3D data collected during digitalization can be processed with advanced deep learning methods. For instance, the semantic segmentation of a point cloud enables the identification of relevant objects within the environment. In this work, we propose a fully Bayesian and an approximate Bayesian neural network for point cloud segmentation. Both networks are used within a workflow to generate an environment model from raw point clouds. The Bayesian and approximate Bayesian networks allow us to analyse how different ways of estimating uncertainty in these networks improve segmentation results on raw point clouds. We achieve superior model performance for both the Bayesian and the approximate Bayesian model compared to the frequentist one. This performance difference becomes even more striking when incorporating the networks' uncertainty into their predictions. For evaluation, we use the public S3DIS data set as well as a data set collected by the authors at a German automotive production plant. The methods proposed in this work lead to more accurate segmentation results, and the incorporation of uncertainty information makes the approach especially applicable to safety-critical applications beyond our factory planning use case.
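
To make the segmentation workflow above concrete, the following sketch shows one common way to realize an approximate Bayesian network: Monte Carlo dropout, where dropout stays active at test time and several stochastic forward passes are averaged into a predictive distribution. The PointNet-style `model` and its input/output shapes are illustrative assumptions, not the paper's exact implementation.

```python
import torch

def mc_dropout_predict(model, points, n_samples=25):
    """Monte Carlo dropout inference for point cloud segmentation.

    model:  a PointNet-style network mapping (B, N, 3) point
            coordinates to (B, N, C) per-point class logits
            (hypothetical; stands in for the paper's networks).
    points: (B, N, 3) tensor of raw, unordered points.
    Returns the mean per-point class probabilities (B, N, C)
    and the stack of individual samples (n_samples, B, N, C).
    """
    model.eval()                           # freeze batch-norm statistics
    for m in model.modules():              # ...but keep dropout stochastic,
        if isinstance(m, torch.nn.Dropout):
            m.train()                      # which approximates posterior sampling
    with torch.no_grad():
        samples = torch.stack(
            [torch.softmax(model(points), dim=-1) for _ in range(n_samples)]
        )
    return samples.mean(dim=0), samples
```

The spread of the individual samples around the mean prediction is what carries the uncertainty information referred to above.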

Highlights

  • A three-dimensional (3D) model of factory buildings and inventory, as well as the simulation of process steps, plays a major role in different planning domains

  • We propose a fully Bayesian and an approximate Bayesian neural network for point cloud segmentation

  • To the best of our knowledge, no other work has treated the topic of uncertainty estimation and Bayesian training of 3D segmentation networks that operate on raw and unordered point clouds without a previous transformation into a regular format

Summary

Introduction

A three-dimensional (3D) model of factory buildings and inventory, as well as the simulation of process steps, plays a major role in different planning domains. We present a novel Bayesian 3D point cloud segmentation framework based on PointNet [6], which is able to capture uncertainty in network predictions. We use an entropy-based interpretation of uncertainty in the network outputs and distinguish between overall, data-related, and model-related uncertainty. These types of uncertainty are called predictive, aleatoric, and epistemic uncertainty, respectively [8]. To the best of our knowledge, no other work has treated the topic of uncertainty estimation and Bayesian training of 3D segmentation networks that operate on raw and unordered point clouds without a previous transformation into a regular format.
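
The entropy-based distinction mentioned above is commonly computed as follows: predictive uncertainty is the entropy of the mean prediction, aleatoric uncertainty is the expected entropy of the individual predictions, and epistemic uncertainty is their difference, i.e., the mutual information between the prediction and the model parameters [8]. Below is a minimal NumPy sketch under that standard formulation; the array `samples` is an assumed input holding per-point class probabilities from T stochastic forward passes.

```python
import numpy as np

def uncertainty_decomposition(samples, eps=1e-12):
    """Entropy-based uncertainty measures from T stochastic passes.

    samples: (T, N, C) array of per-point class probabilities drawn
             from the (approximate) posterior predictive.
    Returns predictive, aleatoric, and epistemic uncertainty per
    point, each of shape (N,).
    """
    mean_p = samples.mean(axis=0)                                # (N, C)
    # Predictive (overall) uncertainty: entropy of the mean prediction.
    predictive = -np.sum(mean_p * np.log(mean_p + eps), axis=-1)
    # Aleatoric (data-related): expected entropy of single predictions.
    aleatoric = -np.sum(samples * np.log(samples + eps), axis=-1).mean(axis=0)
    # Epistemic (model-related): mutual information between the
    # prediction and the model parameters.
    epistemic = predictive - aleatoric
    return predictive, aleatoric, epistemic
```

Per-point epistemic uncertainty obtained this way can, for example, be used to flag or discard unreliable predictions before building the environment model.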

Literature Review
Bayesian Deep Learning and Uncertainty Quantification
Model Descriptions
Frequentist PointNet
Approximate Bayesian PointNet
Bayesian PointNet
Uncertainty Estimation
Data Sets
Stanford Large-Scale 3D Indoor Spaces Data Set
Automotive Factory Data Set
Results and Analysis
Segmentation Accuracy
Discussion and Conclusions