Abstract

Building owners are working to convert their legacy 2D floor plan documentation into digital 3D representations, but the manual process is labor-intensive and time-consuming. In this paper, deep learning is leveraged to automate the process. This automation requires interoperability between artificial neural networks and prevailing 3D modeling software. The system processes 2D floor plans and outputs parameters of recognized walls, single doors, double doors, and columns: the start and end points of each wall and the center points of doors and columns. After post-processing, these parameters are input into Revit 2022 through the Revit API. The dimensional parameter integration affordances of object detection and instance segmentation are studied and compared using Faster R-CNN and Mask R-CNN models. Instance segmentation was found to require more time for data labeling but was more capable of informing the modeling of irregularly shaped objects. The mean Average Precision (mAP) of object detection and instance segmentation is 71.7% and 69.3%, respectively. Apart from single doors, the average precision for the other categories falls within the range of 74% to 96%. The results provide software developers with guidance on choosing between object detection and instance segmentation strategies for processing legacy building documents. Systems of this kind are anticipated to be pivotal to the industry's transition from 2D to 3D information modalities; practitioners are advised to choose suitable models carefully and to consider the recommendations provided in this study to mitigate potential failure cases.
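To illustrate the kind of post-processing step described above, the following is a minimal sketch, not taken from the paper, of how a wall's start and end points might be recovered from an instance segmentation mask. The function name `wall_endpoints` and the PCA-based approach are assumptions for illustration; the paper does not specify its post-processing algorithm.

```python
import numpy as np

def wall_endpoints(mask):
    """Estimate the start and end points of a wall from a binary
    instance mask (illustrative sketch, not the paper's method).

    Fits the wall's principal axis via PCA of the foreground pixel
    coordinates, then projects the pixels onto that axis and takes
    the two extremes as the wall's start and end points.
    """
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)  # (N, 2) as (x, y)
    centroid = pts.mean(axis=0)
    centered = pts - centroid
    # Principal direction = eigenvector of the covariance matrix
    # with the largest eigenvalue.
    cov = centered.T @ centered
    eigvals, eigvecs = np.linalg.eigh(cov)
    axis = eigvecs[:, np.argmax(eigvals)]
    # Project onto the axis; the extreme projections mark the ends.
    proj = centered @ axis
    start = centroid + proj.min() * axis
    end = centroid + proj.max() * axis
    return start, end
```

Endpoints like these could then be passed to a Revit API call such as `Wall.Create`, which takes a driving line in model coordinates; mapping pixel coordinates to model units would require the floor plan's scale, which is a separate calibration step.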
