Traditional leaf chlorophyll estimation using Soil Plant Analysis Development (SPAD) devices and spectrophotometers is costly for routine agricultural use. Recent research has explored chlorophyll estimation from leaf camera images using machine learning. However, these techniques rely on self-defined image color combinations, which makes system performance vary, and their potential utility has not been well explored. This paper proposes a new method that combines an improved contact imaging technique, the images’ original color parameters, and a 1-D Convolutional Neural Network (CNN) for chlorophyll estimation in tea leaves. The method uses a smartphone and flashlight to capture tea leaf contact images at multiple locations on the front and back of the leaves. It extracts 12 original color features from the images, including the mean of RGB, the standard deviations of RGB and HSV, kurtosis, skewness, and variance, as input to the 1-D CNN. To create the dataset, we captured 15,000 contact images of tea leaves collected from different tea gardens across Assam, India. SPAD chlorophyll measurements of the leaves serve as ground-truth values. Baseline models based on Linear Regression (LR), Artificial Neural Networks (ANN), Support Vector Regression (SVR), and K-Nearest Neighbor (KNN) were also trained, evaluated, and tested. The 1-D CNN outperformed them with a Mean Absolute Error (MAE) of 2.96, Mean Square Error (MSE) of 15.4, Root Mean Square Error (RMSE) of 3.92, and coefficient of determination (R²) of 0.82. These results show that the method digitally replicates the traditional approach while being non-destructive, affordable, less prone to performance variation, and simple to use for sustainable agriculture.
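
The abstract does not specify the implementation details, but the following minimal sketch illustrates one plausible reading of the pipeline: extracting the 12 color features from a contact image and regressing a SPAD value with a small 1-D CNN. The feature split (per-channel RGB means, per-channel RGB and HSV standard deviations, plus grayscale kurtosis, skewness, and variance), the OpenCV/SciPy feature extraction, the Keras framework, and the network layer sizes are all assumptions, not the authors' code.

```python
# Hedged sketch of the described pipeline; the feature breakdown, libraries,
# and CNN architecture are assumptions, not the paper's implementation.
import cv2
import numpy as np
from scipy.stats import kurtosis, skew
from tensorflow import keras

def extract_features(image_path):
    """Return a 12-element color feature vector from one leaf contact image."""
    bgr = cv2.imread(image_path)
    rgb = cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB).astype(np.float32)
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)

    features = [
        *rgb.reshape(-1, 3).mean(axis=0),   # mean R, G, B
        *rgb.reshape(-1, 3).std(axis=0),    # std  R, G, B
        *hsv.reshape(-1, 3).std(axis=0),    # std  H, S, V
        kurtosis(gray, axis=None),          # kurtosis (assumed over grayscale)
        skew(gray, axis=None),              # skewness (assumed over grayscale)
        gray.var(),                         # variance (assumed over grayscale)
    ]
    return np.array(features, dtype=np.float32)

def build_1d_cnn(n_features=12):
    """A small 1-D CNN regressor mapping the feature vector to a SPAD value."""
    model = keras.Sequential([
        keras.layers.Input(shape=(n_features, 1)),
        keras.layers.Conv1D(32, kernel_size=3, activation="relu", padding="same"),
        keras.layers.Conv1D(64, kernel_size=3, activation="relu", padding="same"),
        keras.layers.Flatten(),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(1),               # predicted SPAD chlorophyll value
    ])
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model

# Usage (hypothetical data): X has shape (n_samples, 12, 1); y holds SPAD readings.
# model = build_1d_cnn()
# model.fit(X, y, validation_split=0.2, epochs=50, batch_size=32)
```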