Abstract

We need comprehensive information to manage and protect biodiversity in the face of global environmental challenges, and artificial intelligence is required to generate that information from vast amounts of biodiversity data. Current vision‐based monitoring methods are heterogeneous: they cover spatial and temporal dimensions poorly, depend heavily on humans and are not reactive enough for adaptive management. To mitigate these issues, we present a portable, modular, affordable and low‐power device with embedded vision for biodiversity monitoring of a wide range of terrestrial taxa. Our camera uses interchangeable lenses to resolve barely visible and remote targets, together with customisable algorithms for blob detection, region‐of‐interest classification and object detection to identify them automatically. We showcase our system in six use cases from ethology, landscape ecology, agronomy, pollination ecology, conservation biology and phenology. Using the same devices with different setups, we discovered bats feeding on durian tree flowers, monitored flying bats and their insect prey, identified nocturnal insect pests in paddy fields, detected bees visiting rapeseed crop flowers, triggered real‐time alerts for waterfowl and tracked flower phenology over months. We measured classification accuracies (i.e. F1‐scores) between 55% and 95% in our field surveys and used them to standardise observations over highly resolved time scales. Our cameras are well suited to situations where automated vision‐based monitoring is required off the grid, in natural and agricultural ecosystems, and in particular for quantifying species interactions. Embedded vision devices such as this will help address global biodiversity challenges and facilitate a technology‐aided transformation of agricultural systems.