Abstract

We developed a GPU-accelerated 2D physically based distributed rainfall runoff model for a PC environment. The governing equations were derived from the diffusive wave model for surface flow and the Horton infiltration model for rainfall loss. A numerical method for the diffusive wave equations was implemented based on a Godunov-type finite volume scheme. The flux at the computational cell interface was reconstructed using the piecewise linear monotonic upwind scheme for conservation laws with a van Leer slope total variation diminishing limiter. Parallelization was implemented using CUDA-Fortran with an NVIDIA GeForce GTX 1060 GPU. The proposed model was tested and verified against several 1D and 2D rainfall runoff processes with various topographies containing depressions. Simulated hydrographs, water depth, and velocity were compared to analytical solutions, dynamic wave modeling results, and measurement data. The diffusive wave model reproduced the runoff processes of impermeable basins with results similar to those of analytical solutions and the numerical results of a dynamic wave model. For ideal permeable basins containing depressions such as furrows and ponds, physically reasonable rainfall runoff processes were observed. Tests on a real basin with complex terrain showed reasonable agreement with the measured data. Parallel computing became increasingly efficient as the number of grid cells increased, achieving a maximum speedup of approximately 150 times over a CPU version using an Intel i7 4.7-GHz CPU in a PC environment.
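Two of the model components named above have compact, standard formulations: the Horton infiltration capacity, f(t) = fc + (f0 − fc)·exp(−k·t), and the van Leer limiter, φ(r) = (r + |r|)/(1 + |r|), used in the MUSCL reconstruction. The sketch below illustrates both in Python; the parameter values are illustrative only and are not taken from the paper.

```python
import math

def horton_infiltration(t, f0=3.0e-5, fc=1.0e-6, k=2.0e-3):
    """Horton infiltration capacity f(t) = fc + (f0 - fc) * exp(-k * t).

    t  : time since the start of infiltration [s]
    f0 : initial infiltration capacity [m/s]   (illustrative value)
    fc : final (steady) capacity [m/s]         (illustrative value)
    k  : decay constant [1/s]                  (illustrative value)
    """
    return fc + (f0 - fc) * math.exp(-k * t)

def van_leer(r):
    """van Leer TVD slope limiter: phi(r) = (r + |r|) / (1 + |r|).

    Returns 0 for r <= 0 (local extremum), approaches 2 as r -> inf.
    """
    return (r + abs(r)) / (1.0 + abs(r))

def muscl_left_state(q, i):
    """Limited piecewise-linear (MUSCL) left state at face i+1/2.

    q is a 1D array of cell averages; r is the ratio of the upwind
    to the downwind slope, and the limited slope suppresses new
    extrema near discontinuities.
    """
    dq_minus = q[i] - q[i - 1]          # upwind slope
    dq_plus = q[i + 1] - q[i]           # downwind slope
    r = dq_minus / dq_plus if dq_plus != 0.0 else 0.0
    return q[i] + 0.5 * van_leer(r) * dq_plus
```

For smooth data the limiter returns values near 1 (full second-order slope); at a local extremum r ≤ 0, the slope is zeroed and the scheme falls back to first order, which is what keeps the reconstruction total variation diminishing.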

Highlights

  • The prediction and control of floodwater have attracted significant public and scientific interest for many years

  • For floods induced by rainfall events, the timescale of runoff processes typically varies from minutes to days depending on the characteristics of rainfall and basins

  • Conceptual models typically require additional processes called trial-and-error procedures because they rely on numerous parameters, where a considerable portion of the parameters are not physical, but empirical [1]


Summary

Introduction

The prediction and control of floodwater have attracted significant public and scientific interest for many years. Conceptual rainfall runoff models require relatively few computational resources and can predict flood patterns almost instantly, so they have been widely used in practice. Physically based distributed models, by contrast, take a very long time to compute rainfall runoff events because they typically involve huge numbers of computational grid cells in space and time. To handle such problems, parallel computing has been widely developed and applied in many scientific and engineering fields. We propose a GPU-accelerated diffusive wave model for simulating rainfall runoff processes in a PC environment. The proposed model is applied to several benchmark tests, and the performance results are reported.

Diffusive Wave Model
Infiltration Model
Numerical Scheme
GPU Environment and Programming
Moving Storms on 1D Impermeable Overland Plane
Comparisons of Computed Results and Analytical Solutions
Complex Natural Basin
GPU Speedup of Parallel Computing
Computation times of the GPU and the CPU are presented
Conclusions
Full Text

Published version (Free)