At present, the application of remote sensing (RS) data acquired from satellite imagery or unmanned aerial vehicles (UAVs) has become common in crop classification tasks, i.e. crop mapping, soil classification, and yield prediction. The classification of food crops using RS images (RSI) is one of the major applications of RS in farming. It involves using aerial or satellite images to identify and classify the different kinds of food crops grown in a specific region. This information is valuable for yield estimation, crop monitoring, and land management. Analyzing such data requires more sophisticated techniques, and artificial intelligence (AI) technologies deliver essential support. Recently, the use of deep learning (DL) for crop type classification with RS images has shown promise for supporting sustainable farming practices by providing timely and precise information on the types and characteristics of crops. In this study, we propose an Automated Agricultural Crop Type Mapping Utilizing Fusion of Transfer Learning and Tasmanian Devil Optimization (AACTM-FTLTDO) algorithm on remote sensing imagery. The primary goal of the AACTM-FTLTDO methodology is to accurately detect and classify crop types for more precise agricultural monitoring using remote sensing technologies. To accomplish this, the AACTM-FTLTDO model employs a fusion of three transfer learning models, SqueezeNet, CapsNet, and ShuffleNetV2, to capture diverse, multi-scale spatial and spectral features. For the crop type detection and classification process, an auto-encoder (AE) classifier is employed. Finally, the Tasmanian devil optimization (TDO) technique is deployed to tune the hyperparameters of the AE model, ensuring an optimal model configuration and reducing computational complexity. A wide range of experiments was conducted, and the results were evaluated under numerous measures.
The comparative study shows that the AACTM-FTLTDO technique outperforms existing approaches.
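The pipeline described above can be outlined as a minimal NumPy sketch. This is an illustrative stand-in, not the authors' implementation: random vectors substitute for the SqueezeNet, CapsNet, and ShuffleNetV2 backbone features, the AE is reduced to one linear encoder/decoder pair, and a plain random-population search stands in for the actual TDO update rules, whose details differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Stage 1: feature fusion (random stand-ins for the three backbones) ---
n_samples = 200
f_squeeze = rng.normal(size=(n_samples, 64))   # placeholder SqueezeNet features
f_caps    = rng.normal(size=(n_samples, 32))   # placeholder CapsNet features
f_shuffle = rng.normal(size=(n_samples, 64))   # placeholder ShuffleNetV2 features
fused = np.concatenate([f_squeeze, f_caps, f_shuffle], axis=1)  # (200, 160)

# --- Stage 2: a one-layer linear auto-encoder trained by gradient descent ---
def train_ae(X, hidden, lr, epochs=50):
    d = X.shape[1]
    W_enc = rng.normal(scale=0.1, size=(d, hidden))
    W_dec = rng.normal(scale=0.1, size=(hidden, d))
    for _ in range(epochs):
        H = X @ W_enc                    # encode
        err = H @ W_dec - X              # reconstruction error
        # gradients of the mean squared reconstruction loss
        W_dec -= lr * H.T @ err / len(X)
        W_enc -= lr * X.T @ (err @ W_dec.T) / len(X)
    return np.mean((X @ W_enc @ W_dec - X) ** 2)

# --- Stage 3: population-based hyperparameter search (generic stand-in
#     for Tasmanian devil optimization) over (hidden size, learning rate) ---
best_loss, best_cfg = np.inf, None
population = [(int(rng.integers(8, 64)), 10 ** rng.uniform(-3, -1))
              for _ in range(5)]
for hidden, lr in population:
    loss = train_ae(fused, hidden, lr)
    if loss < best_loss:
        best_loss, best_cfg = loss, (hidden, lr)

print("best (hidden, lr):", best_cfg, "reconstruction loss:", round(best_loss, 4))
```

In the full method, the selected hyperparameters would configure the AE classifier applied to the fused backbone features, with classification accuracy (rather than reconstruction loss alone) driving the search.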