Abstract
Calculating the Normalized Difference Vegetation Index (NDVI) normally requires expensive multispectral cameras, which pose challenges of high cost and the need for technical expertise. This research develops a method to transform RGB images captured by an Unmanned Aerial Vehicle (UAV) into NDVI images. The proposed method leverages CycleGAN, an unsupervised image-to-image translation framework, to learn the intricate mapping between RGB values and NDVI. The model was trained on unpaired datasets of RGB and NDVI images sourced from paddy fields in Gresik and Yogyakarta, Indonesia, and successfully captured the complex correlation between the two modalities. Various training strategies were systematically investigated, including weight-initialization schemes, fine-tuning procedures, and learning-rate policies, to optimize the model's performance. The fine-tuned CycleGAN demonstrated superior performance in generating synthetic NDVI images from the unpaired dataset, surpassing other methods in fidelity, quality, and structural coherence: a Normalized Root Mean Square Error (NRMSE) of 0.327, a Peak Signal-to-Noise Ratio (PSNR) of 16.330, an Oriented FAST and Rotated BRIEF (ORB) score of 0.859, and a Structural Similarity Index (SSIM) of 0.757. The best-performing CycleGAN model was then deployed on a low-spec microcomputer, the Raspberry Pi 4B, with an average computation time of 21.0077 seconds. The Raspberry Pi 4B was chosen for its light weight, compact dimensions, and compatibility with efficient battery power, making it suitable for drone deployment.
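The ground-truth NDVI that the model learns to synthesize is defined by the standard two-band ratio, NDVI = (NIR − Red) / (NIR + Red), which is why a multispectral (NIR-capable) camera is normally required. A minimal NumPy sketch, with illustrative reflectance values (not data from this study):

```python
import numpy as np

def ndvi(nir, red, eps=1e-8):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    `eps` guards against division by zero on dark pixels.
    Output lies in [-1, 1]; higher values indicate denser vegetation.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Healthy vegetation reflects strongly in NIR and absorbs red,
# so its NDVI is close to 1.
veg = ndvi(np.array([[0.50]]), np.array([[0.08]]))   # ~0.72

# Bare soil reflects NIR and red similarly, so its NDVI is near 0.
soil = ndvi(np.array([[0.30]]), np.array([[0.25]]))  # ~0.09
```

An RGB-to-NDVI translation model such as the CycleGAN described above must learn to approximate this quantity without access to the NIR band at all.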
Published in: International Journal on Advanced Science, Engineering and Information Technology