Abstract

In this study, moisture content and product quality of Pu‐erh tea were predicted with deep learning‐based methods. Images were captured continuously during the sun‐drying process. Environmental parameters (EP), namely air humidity, air temperature, global radiation, wind speed, and ultraviolet radiation, were collected with a portable meteorological station. Sensory scores for aroma, flavor, liquor color, and residue, together with total scores, were given by a trained panel. Convolutional neural network (CNN) and gated recurrent unit (GRU) models were constructed from the image information and the EP, which were selected in advance using the neighborhood component analysis (NCA) algorithm. The developed deep learning models achieved satisfactory results, with RMSE of 0.4332, 0.2669, and 0.7508 (R² of 0.9997, 0.9882, and 0.9986; RPD of 53.5894, 13.1646, and 26.3513) for moisture content prediction of each tea batch, tea at different sampling periods, and the overall samples, respectively; and with RMSE of 0.291, 0.2815, 0.162, 0.1574, and 0.3931 (R² of 0.9688, 0.9772, 0.9752, 0.9741, and 0.8906; RPD of 5.6073, 6.5912, 6.352, 6.1428, and 4.0045) for final quality prediction of aroma, flavor, liquor color, residue, and total score, respectively. By analyzing and comparing the RMSE values, the most significant EP were identified. The proposed combinations of different EP can also provide a valuable reference for the development of a new sun‐drying system.
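To make the modeling approach concrete, the following is a minimal sketch (not the authors' code) of how a combined CNN + GRU regressor of this kind can be assembled: a CNN branch encodes a drying-process image, a GRU branch encodes a short sequence of environmental parameters, and the fused features predict moisture content. All layer sizes, class names, and input shapes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class CnnGruMoistureModel(nn.Module):
    """Hypothetical image + EP fusion model for moisture content regression."""

    def __init__(self, n_ep: int = 5, hidden: int = 32):
        super().__init__()
        # CNN branch for RGB images of the tea during sun-drying
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, hidden, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # -> (batch, hidden, 1, 1)
            nn.Flatten(),             # -> (batch, hidden)
        )
        # GRU branch for the EP time series (humidity, temperature, radiation, ...)
        self.gru = nn.GRU(input_size=n_ep, hidden_size=hidden, batch_first=True)
        # Regression head on the concatenated image and EP features
        self.head = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, image: torch.Tensor, ep_seq: torch.Tensor) -> torch.Tensor:
        img_feat = self.cnn(image)        # (batch, hidden)
        _, h_n = self.gru(ep_seq)         # h_n: (1, batch, hidden)
        ep_feat = h_n.squeeze(0)          # (batch, hidden)
        fused = torch.cat([img_feat, ep_feat], dim=1)
        return self.head(fused).squeeze(1)  # predicted moisture content

# Example forward pass with random tensors standing in for real data.
model = CnnGruMoistureModel()
images = torch.randn(4, 3, 128, 128)  # 4 images of the drying tea
ep_seq = torch.randn(4, 10, 5)        # 4 sequences of 10 time steps x 5 EP
print(model(images, ep_seq).shape)    # -> torch.Size([4])
```

In such a setup, the EP columns fed to the GRU would first be screened (e.g., with NCA as described above) so that only the most informative parameters enter the sequence input.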
