Abstract

<p>A growing body of research applies machine learning models to predict the behavior of real-world systems. Water-level prediction, however, is complicated by erratic data behavior and inadequate prerequisite datasets, so fundamental models often show flat or low accuracy. In this paper, a scaling strategy is proposed for an improved back-propagation algorithm using parallel computing on a graphics processing unit (GPU) for groundwater-level prediction in the Faridabad region, Haryana, India. The paper proposes a streamlined form of the back-propagation algorithm for heterogeneous computing and examines the combination of an artificial neural network (ANN) with a GPU for predicting the groundwater level. Twenty years of data (2001-2020) covering three input parameters, namely temperature, rainfall, and water level, are used to predict the groundwater level with a parallelized back-propagation algorithm on the Compute Unified Device Architecture (CUDA). The parallelized back-propagation algorithm reinforces learning and performance, providing more accurate and faster water-level predictions on GPUs than sequential implementations on central processing units (CPUs) alone.</p>
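The abstract describes training a back-propagating ANN on three inputs (temperature, rainfall, water level). As a minimal sketch of that setup, the NumPy code below trains a one-hidden-layer network with back-propagation; the hidden size, learning rate, and data shapes are assumptions for illustration, not the authors' configuration, and the paper's contribution is to run these same matrix operations as parallel CUDA kernels on a GPU rather than sequentially on a CPU.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train(X, y, hidden=8, lr=0.1, epochs=2000, seed=0):
    """Back-propagation on a 3-input, one-hidden-layer regression network.

    X: (n_samples, 3) inputs (e.g. temperature, rainfall, water level).
    y: (n_samples, 1) target (e.g. next groundwater level).
    All hyperparameters here are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))  # input -> hidden
    W2 = rng.normal(scale=0.5, size=(hidden, 1))           # hidden -> output
    n = len(X)
    for _ in range(epochs):
        # Forward pass (these matrix products are what a GPU parallelizes)
        h = sigmoid(X @ W1)
        out = h @ W2
        err = out - y
        # Backward pass: gradients of the mean-squared error
        grad_W2 = h.T @ err / n
        grad_h = (err @ W2.T) * h * (1.0 - h)   # sigmoid derivative
        grad_W1 = X.T @ grad_h / n
        W1 -= lr * grad_W1
        W2 -= lr * grad_W2
    return W1, W2

def predict(X, W1, W2):
    return sigmoid(X @ W1) @ W2
```

On a GPU, each of the matrix multiplications above would be dispatched as a CUDA kernel so that all samples and weights are updated in parallel, which is the source of the speed-up the abstract claims over a sequential CPU implementation.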
