Abstract

Automated detection and classification of sewer defects can complement the conventional, labor-intensive sewer inspection process by classifying defects more efficiently, accurately, and consistently. This paper presents a convolutional neural network (CNN)-based model that automatically detects and classifies the six most commonly observed sewer image categories (i.e., cracks, disjoints, obstacles, residential walls, tree roots, and normal) in multisource CCTV images captured under various circumstances. Data augmentation techniques, including geometric and color transformations, are applied to enhance model performance. The proposed CNN model is further compared with a state-of-the-art solution that retrains SqueezeNet on the defect images via transfer learning. An average prediction accuracy of 90% is achieved, indicating that the investigated defects can be recognized well by the model without any expert knowledge of sewer inspection. Predictions are most confident for tree roots and disjoints, followed by residential walls and cracks. Results show that data augmentation increases prediction accuracy by 15%. Although the transferred SqueezeNet model achieved a higher accuracy (95%), it required almost 13 times the computation time of the CNN model. The study demonstrates the feasibility of deep learning for the automated classification of sewer defects and advances knowledge in the research field.
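The abstract mentions geometric and color transformations for data augmentation but does not specify which ones the authors used. As a minimal, stdlib-only sketch (the transform set and parameter ranges here are illustrative assumptions, not the paper's actual pipeline), such an augmentation step for a grayscale image represented as a list of pixel rows could look like:

```python
import random

def hflip(img):
    """Geometric augmentation: mirror each row (horizontal flip)."""
    return [row[::-1] for row in img]

def rot90(img):
    """Geometric augmentation: rotate the image 90 degrees clockwise."""
    return [list(col) for col in zip(*img[::-1])]

def adjust_brightness(img, factor):
    """Color augmentation: scale intensities, clamped to the 0-255 range."""
    return [[min(255, max(0, int(p * factor))) for p in row] for row in img]

def augment(img, rng=random):
    """Apply a random combination of the transforms above to one training image.

    The 0.5 probabilities and the 0.7-1.3 brightness range are assumed values
    for illustration only.
    """
    if rng.random() < 0.5:
        img = hflip(img)
    if rng.random() < 0.5:
        img = rot90(img)
    return adjust_brightness(img, rng.uniform(0.7, 1.3))
```

In practice such transforms expand the effective training set with label-preserving variants of each CCTV frame, which is the mechanism behind the reported 15% accuracy gain; a production pipeline would typically use a library such as torchvision or Albumentations instead of hand-rolled loops.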
