Abstract. Earth observation of complex scenes, such as coastal fringes, relies on a plethora of optical sensors constrained by trade-offs between spatial, spectral, temporal and radiometric resolution. The spaceborne hyperspectral EO-1 Hyperion sensor (decommissioned in 2017) acquired imagery at 10 nm spectral resolution (220 bands) and 30 m spatial resolution over 1424.5 km² scenes. Conversely, the widespread unmanned aerial vehicle (UAV) hyperspatial DJI Mavic Pro camera can collect only natural-colour imagery at 100 nm spectral resolution (3 bands) but at 0.1 m spatial resolution over ∼10 km² scenes (with a single battery and calm meteo-marine conditions). The spaceborne WorldView-3 (WV3), featuring 60 nm spectral resolution (16 bands) and 0.3 m spatial resolution (when pansharpened) over 1489.6 km² scenes, has the capacity to bridge both sensors. This study tests the spectral and spatial performance of WV3 for discriminating 10 complex coastal classes, spanning ocean, reef and terrestrial vegetation, on Moorea Island (French Polynesia). Our findings show that geometrically and radiometrically corrected 0.3-m 16-band WV3 imagery competed with the (30-m) 167-band Hyperion performance for classifying the 10 coastal classes with 2-neuron artificial neural network modelling, while still segmenting objects resolved by the 0.1-m (3-band) UAV. Unifying superspectral and hyperspatial capabilities, WV3 also offers hypertemporal resolution, that is, a 1-day revisit, rivalling that of the UAV.
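The abstract mentions classification of 10 coastal classes with a 2-neuron artificial neural network, but does not detail the pipeline. The following minimal sketch is only illustrative, assuming pixel-level classification of 16-band WV3 reflectance spectra with a single 2-neuron hidden layer (scikit-learn's MLPClassifier); the random data, class count and split are hypothetical placeholders, not the study's actual dataset or implementation.

```python
# Illustrative sketch: 2-neuron ANN classifying 16-band pixel spectra into 10 classes.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_pixels, n_bands, n_classes = 5000, 16, 10        # hypothetical training set size
X = rng.random((n_pixels, n_bands))                # stand-in reflectance values (0-1) per WV3 band
y = rng.integers(0, n_classes, n_pixels)           # stand-in reference labels (e.g. ocean, reef, vegetation)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Single hidden layer with 2 neurons, as stated in the abstract
ann = MLPClassifier(hidden_layer_sizes=(2,), max_iter=2000, random_state=0)
ann.fit(X_train, y_train)
print(f"Overall accuracy on held-out pixels: {ann.score(X_test, y_test):.2f}")
```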