Non-invasive crop phenotyping is essential for crop modeling and relies on image processing techniques. This research presents a plant-scale vision system that acquires multispectral plant data in agricultural fields. The paper proposes a sensory fusion method that uses three cameras: two multispectral cameras and an RGB-depth (RGB-D) camera. The method applies pattern recognition and statistical optimization to produce a single multispectral 3D image that combines thermal and near-infrared (NIR) images of crops, incorporating five multispectral bands: three from the visible range and two from the non-visible range, namely NIR and mid-infrared. The object recognition step examines about 7000 features in each image and runs only once, during calibration. The outcome of the fusion process is a homographic transformation model that integrates the multispectral and RGB data into a coherent 3D representation. The approach handles occlusions, allowing accurate extraction of crop features. The result is a 3D point cloud containing thermal and NIR multispectral data that were originally captured separately in 2D.
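To make the registration step concrete, the following Python sketch shows a feature-based homography estimation of the kind the abstract describes: it detects roughly 7000 features in a multispectral frame and the RGB frame, matches them, and fits a homography with RANSAC. This is a minimal illustration assuming OpenCV; the specific choices of ORB features, brute-force matching, the file names, and the parameters are assumptions, not the authors' implementation.

```python
# Minimal sketch of feature-based homography registration, assuming OpenCV.
# ORB + RANSAC stand in for the abstract's unspecified "pattern recognition
# and statistical optimization"; file names and parameters are hypothetical.
import cv2
import numpy as np

rgb = cv2.imread("rgb_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical RGB-D color frame
nir = cv2.imread("nir_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical multispectral (NIR) frame

# Detect roughly 7000 features per image, matching the scale reported in the abstract.
orb = cv2.ORB_create(nfeatures=7000)
kp_rgb, des_rgb = orb.detectAndCompute(rgb, None)
kp_nir, des_nir = orb.detectAndCompute(nir, None)

# Brute-force Hamming matching with cross-check to discard asymmetric matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_nir, des_rgb), key=lambda m: m.distance)

# Point correspondences: NIR (source) -> RGB (destination).
src = np.float32([kp_nir[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_rgb[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# RANSAC robustly estimates the homography despite outlier matches; like the
# paper's transformation model, it is computed once at calibration and reused.
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=5.0)

# Warp the NIR frame into the RGB camera's image plane so each depth pixel
# can be annotated with the corresponding multispectral value.
h, w = rgb.shape
nir_registered = cv2.warpPerspective(nir, H, (w, h))
```

Once such a transformation is available, each 2D multispectral pixel can be mapped onto the RGB-D point cloud, yielding the fused 3D representation the abstract describes.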