Abstract
Timely, accurate maps of invasive plant species are critical for making appropriate management decisions to eliminate emerging target populations or contain infestations. High-resolution aerial imagery is routinely used to map, monitor, and detect invasive plant populations. While conventional image interpretation by human analysts is straightforward, it can demand considerable time and resources to produce useful intelligence. We compared the performance of human analysts with a custom RetinaNet-based deep convolutional neural network (DNN) for detecting individual miconia (Miconia calvescens DC) plants in high-resolution unmanned aerial system (UAS) imagery collected over lowland tropical forests in Hawai’i. Human analysts (n = 38) examined imagery at three linear scrolling speeds (100, 200, and 300 px/s), achieving miconia detection recalls of 74 ± 3%, 60 ± 3%, and 50 ± 3%, respectively. The DNN achieved 83 ± 3% recall and completed the image analysis in 1% of the time required at the fastest scrolling speed tested. Human analysts could discriminate large miconia leaf clusters better than isolated individual leaves, while DNN detection efficacy was independent of leaf cluster size. Optically, the contrast in the red and green color channels and all three (i.e., red, green, and blue) signal-to-clutter ratios (SCRs) were significant factors for human detection, while only the red-channel contrast and the red and green SCRs were significant factors for the DNN. A linear cost analysis estimated the operational use of a DNN to be more cost effective than human photo interpretation once the cumulative search area exceeds a minimum break-even area. For invasive species like miconia, which can stochastically spread propagules across thousands of hectares, the DNN provides a more efficient option for detecting incipient, immature miconia across large expanses of forested canopy. Increasing operational capacity for large-scale surveillance with a DNN-based image analysis workflow can provide more rapid comprehension of invasive plant abundance and distribution in forested watersheds and may become strategically vital to containing these invasions.
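The linear cost analysis mentioned above can be illustrated with a simple break-even calculation. The sketch below is a minimal illustration, assuming both workflows scale linearly with search area; the function name break_even_area and all dollar figures are hypothetical placeholders, not values from the study.

```python
# Minimal sketch of a linear cost break-even analysis comparing human photo
# interpretation with a DNN-based workflow. All rates and fixed costs below
# are hypothetical placeholders, not figures from the study.

def break_even_area(human_cost_per_ha: float,
                    dnn_cost_per_ha: float,
                    dnn_fixed_cost: float) -> float:
    """Return the cumulative search area (ha) above which the DNN is cheaper.

    Assumes both workflows scale linearly with area:
        cost_human(A) = human_cost_per_ha * A
        cost_dnn(A)   = dnn_fixed_cost + dnn_cost_per_ha * A
    """
    marginal_saving = human_cost_per_ha - dnn_cost_per_ha
    if marginal_saving <= 0:
        raise ValueError("DNN must have a lower per-hectare cost to break even.")
    return dnn_fixed_cost / marginal_saving


if __name__ == "__main__":
    # Hypothetical example: $40/ha for analyst interpretation, $2/ha for DNN
    # inference, plus a one-time $5,000 cost to train and validate the model.
    area = break_even_area(human_cost_per_ha=40.0,
                           dnn_cost_per_ha=2.0,
                           dnn_fixed_cost=5000.0)
    print(f"DNN becomes more cost effective beyond ~{area:.0f} ha")
```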
Highlights
Invasive species are one of the main threats to native ecosystems worldwide, altering plant community structure and function, i.e., reducing biodiversity and compromising ecosystem services [1,2,3,4].
Three factors were significant for the efficacy of deep neural network (DNN) detection: the red-channel contrast (C_M,R) and the red and green signal-to-clutter ratios (SCR_R and SCR_G) (p < 0.05; η² magnitudes in Table 5), all of which were also significant for the human detections (see the sketch after these highlights for one way such metrics can be computed).
Unmanned aerial system (UAS) imagery can provide valuable intelligence for natural resource managers, but the current bottleneck of time and human resources required to exhaustively search through these images reduces the scalability of this approach.
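As a rough illustration of the optical metrics referenced in these highlights, the sketch below computes a per-channel contrast and a signal-to-clutter ratio for a target patch against its surrounding background clutter. The Michelson-style contrast and the SCR formulation (mean difference over background standard deviation) are common choices assumed here, and the synthetic patches are placeholders; the study's exact definitions and data may differ.

```python
# Illustrative computation of per-channel contrast and signal-to-clutter
# ratio (SCR) for a detected target patch versus surrounding background.
# These formulations are assumptions for demonstration, not the study's
# exact definitions.
import numpy as np

def channel_contrast(target: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Michelson-style contrast per channel: (Lt - Lb) / (Lt + Lb)."""
    lt = target.reshape(-1, target.shape[-1]).mean(axis=0)          # mean target RGB
    lb = background.reshape(-1, background.shape[-1]).mean(axis=0)  # mean background RGB
    return (lt - lb) / (lt + lb + 1e-9)

def signal_to_clutter(target: np.ndarray, background: np.ndarray) -> np.ndarray:
    """SCR per channel: |mean(target) - mean(background)| / std(background)."""
    t = target.reshape(-1, target.shape[-1])
    b = background.reshape(-1, background.shape[-1])
    return np.abs(t.mean(axis=0) - b.mean(axis=0)) / (b.std(axis=0) + 1e-9)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic RGB patches stand in for a miconia leaf cluster and forest canopy.
    leaf_patch = rng.normal(loc=[60, 140, 70], scale=10, size=(32, 32, 3))
    canopy_patch = rng.normal(loc=[50, 100, 60], scale=25, size=(64, 64, 3))
    print("Contrast (R, G, B):", channel_contrast(leaf_patch, canopy_patch))
    print("SCR      (R, G, B):", signal_to_clutter(leaf_patch, canopy_patch))
```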
Summary
Invasive species are one of the main threats to native ecosystems worldwide, altering plant community structure and function, i.e., reducing biodiversity and compromising ecosystem services [1,2,3,4]. Invasive species detection and control programs typically consume a significant portion of natural resource management budgets and provide fertile ground for technological innovations that reduce costs by increasing efficiency in protecting large landscapes [5,6]. Research and development of emerging technologies has become an institutional component of invasive species management strategies in Hawai’i, as a measure to gain an advantage over large, often expensive, problems [7]. Moreover, naturalized invasive species populations often become established beyond the point of feasible eradication and are relegated to containment strategies that attempt to confine them to their occupied areas [9]. Many technological advancements have supported this effort, starting with the advent of civilian GPS, geographic information systems (GIS), and remote sensing, which have enabled better spatial and temporal tracking of dynamic species invasions [12,13,14,15].