Abstract

Planning and managing urban forests for livable cities remains a challenge worldwide, owing to sparse information on the spatial distribution, structure and composition of urban trees and forests. National and municipal sources of tree inventory data remain limited due to a lack of detailed, consistent and frequent inventory assessments. Despite advances in research on automating urban tree mapping with Light Detection and Ranging (LiDAR) or high-resolution satellite imagery, in practice most municipalities still perform labor-intensive field surveys to collect and update tree inventories. We present a robust, affordable and rapid method for creating tree inventories in any urban region where sufficient street-level imagery is readily available. Our approach is novel in that we use a Mask Region-based Convolutional Neural Network (Mask R-CNN) to detect and locate individual tree instances in street-level imagery, successfully creating shape masks around fuzzy urban objects such as trees. We further combine monocular depth estimation with triangulation to estimate precise tree locations, relying only on images taken from the street. Experiments across four cities show that our method transfers to different image sources (Google Street View, Mapillary) and urban ecosystems. We successfully detect >70% of all public and private trees recorded in a ground-truth campaign across Metro Vancouver, and we automatically geolocate public and private trees with a mean absolute positional error of 4 to 6 m, comparable to the positional accuracy of conventional manual urban tree inventory campaigns.
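
As a minimal sketch of the detection step, the snippet below runs instance segmentation on a single street-level photograph with torchvision's Mask R-CNN. This is illustrative only: the COCO-pretrained weights (which have no "tree" class and would need fine-tuning on tree annotations, as in the method described above), the `street_view.jpg` filename, and the 0.5 confidence and mask thresholds are all stand-in assumptions.

```python
# Illustrative detection sketch. Assumptions: torchvision's COCO-pretrained
# Mask R-CNN as a stand-in for a model fine-tuned on tree annotations
# (COCO itself has no "tree" class), and a local file "street_view.jpg".
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = to_tensor(Image.open("street_view.jpg").convert("RGB"))

with torch.no_grad():
    pred = model([image])[0]  # dict with "boxes", "labels", "scores", "masks"

# Keep confident instances; each comes with a bounding box, a score, and a
# per-pixel soft mask that is thresholded into the instance's shape mask.
for box, score, mask in zip(pred["boxes"], pred["scores"], pred["masks"]):
    if score < 0.5:  # assumed confidence threshold
        continue
    shape_mask = mask[0] > 0.5  # (H, W) boolean shape mask
    x0, y0, x1, y1 = box.tolist()
    print(f"instance box=({x0:.0f},{y0:.0f},{x1:.0f},{y1:.0f}) "
          f"score={score:.2f} mask_px={int(shape_mask.sum())}")
```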
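The geolocation step can be sketched as two pieces of plane geometry: projecting a detection along the camera's compass bearing by a monocular depth estimate, and intersecting the bearing rays from two camera positions (triangulation). The helper names, the local flat-earth approximation, and the example coordinates below are assumptions for illustration, not the paper's exact formulation.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius

def project_by_depth(lat, lon, bearing_deg, depth_m):
    """Place a tree depth_m metres along bearing_deg from one camera,
    using a local flat-earth approximation (fine at street-level ranges)."""
    north = depth_m * math.cos(math.radians(bearing_deg))
    east = depth_m * math.sin(math.radians(bearing_deg))
    dlat = math.degrees(north / EARTH_RADIUS_M)
    dlon = math.degrees(east / (EARTH_RADIUS_M * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon

def triangulate(lat1, lon1, bearing1_deg, lat2, lon2, bearing2_deg):
    """Intersect bearing rays from two camera positions on a local plane;
    returns the tree's (lat, lon), or None if the rays are near-parallel."""
    # Camera 2's offset from camera 1 in local metres (east, north).
    metres_per_deg = EARTH_RADIUS_M * math.pi / 180.0
    x2 = (lon2 - lon1) * math.cos(math.radians(lat1)) * metres_per_deg
    y2 = (lat2 - lat1) * metres_per_deg
    # Unit ray directions (east, north) from the compass bearings.
    d1 = (math.sin(math.radians(bearing1_deg)), math.cos(math.radians(bearing1_deg)))
    d2 = (math.sin(math.radians(bearing2_deg)), math.cos(math.radians(bearing2_deg)))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None
    # Solve origin + t*d1 == (x2, y2) + s*d2 for t via 2-D cross products.
    t = (x2 * d2[1] - y2 * d2[0]) / denom
    east, north = t * d1[0], t * d1[1]
    return project_by_depth(lat1, lon1,
                            math.degrees(math.atan2(east, north)),
                            math.hypot(east, north))

# Hypothetical example: one tree seen from two viewpoints ~15 m apart.
print(project_by_depth(49.2827, -123.1207, 45.0, 17.5))  # depth-only estimate
print(triangulate(49.2827, -123.1207, 45.0,
                  49.2828, -123.1205, 300.0))            # two-view estimate
```

The flat-earth step introduces sub-centimetre error at these distances, negligible against the 4 to 6 m mean positional error reported above.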
