Abstract

Recent upward trends in acres irrigated have been linked to increasing near-surface moisture. Unfortunately, stations with dew point data for monitoring near-surface moisture are sparse. Thus, models that estimate dew points from more readily observed data sources are useful. Daily average dew point temperatures were estimated and evaluated at 14 stations in Southwest Georgia using linear regression models and artificial neural networks (ANNs). The estimation methods were restricted to simple, readily available meteorological observations; therefore, only temperature and precipitation were considered as input variables. In total, three linear regression models and 27 ANNs were analyzed. The two methods were evaluated using root mean square error (RMSE), mean absolute error (MAE), and other model evaluation techniques to assess the skill of the estimation methods. Both methods produced adequate estimates of daily average dew point temperatures, with the ANNs displaying the best overall skill. Both methods performed best during the warm season, and both had higher error associated with colder dew points, potentially due to the lack of observed values in those ranges. On average, the ANN reduced RMSE by 6.86% and MAE by 8.30% compared to the best performing linear regression model.
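The abstract's evaluation workflow (fit an estimator on temperature inputs, then score it with RMSE and MAE) can be sketched in plain Python. This is an illustrative example, not the paper's actual models or data: the observations below are hypothetical, and a single-predictor ordinary-least-squares fit stands in for the three regression models described in the study.

```python
import math

# Hypothetical daily observations (NOT from the paper): mean air
# temperature (deg C) and observed mean dew point temperature (deg C).
temps = [5.0, 10.0, 15.0, 20.0, 25.0, 30.0]
dews = [1.0, 6.5, 11.0, 16.5, 21.0, 26.5]

# Fit a simple linear regression, dew ~ a + b * temp, by ordinary
# least squares (closed-form slope and intercept).
n = len(temps)
mean_t = sum(temps) / n
mean_d = sum(dews) / n
b = (sum((t - mean_t) * (d - mean_d) for t, d in zip(temps, dews))
     / sum((t - mean_t) ** 2 for t in temps))
a = mean_d - b * mean_t

pred = [a + b * t for t in temps]

# Skill metrics named in the abstract: RMSE penalizes large errors
# more heavily than MAE does.
rmse = math.sqrt(sum((p - d) ** 2 for p, d in zip(pred, dews)) / n)
mae = sum(abs(p - d) for p, d in zip(pred, dews)) / n
print(f"slope={b:.3f} intercept={a:.3f} RMSE={rmse:.3f} MAE={mae:.3f}")
```

The same RMSE/MAE scoring would apply unchanged to an ANN's predictions, which is what makes the two methods directly comparable in the study.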
