Abstract

Floods cause significant damage around the world each year. Predicting floods accurately and in a timely manner can greatly reduce the loss of human life and property. To date, a number of modelling approaches have been described for automatic flood detection; these approaches primarily capture temporal dependencies while ignoring patterns in relative humidity, wind speed, and rainfall intensity, all of which are critical for flood prediction. This paper presents a novel prediction method that combines a light-weighted dense network with a tree-structural simple recurrent unit (LDTSRU). First, the light-weighted dense network converts the input meteorological variables into grayscale images and identifies salient patterns among the variables. The tree-structural simple recurrent unit (TS-SRU) then automatically learns the nonlinear relationship between the input and output data; it can also efficiently capture the order and flow of events that culminate in a flood. A public flood detection dataset is used to validate the accuracy of the proposed model against state-of-the-art methods. Experimental findings show that LDTSRU performs better with less training time, attaining 2.53% higher average accuracy and better average precision and average recall than well-known state-of-the-art techniques.
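The abstract does not specify how the meteorological variables are encoded as grayscale images. A minimal sketch of one plausible encoding is shown below: each variable's time series becomes one row of the image, min-max scaled to the 0-255 pixel range. The function name, the choice of three variables, and the per-row scaling are all assumptions for illustration, not the paper's actual procedure.

```python
import numpy as np

def to_grayscale_image(humidity, wind_speed, rainfall):
    """Stack three equal-length time series into a (3, T) grayscale image,
    min-max scaling each row independently to the 0-255 pixel range.

    Hypothetical sketch only; the paper's real encoding may differ.
    """
    rows = []
    for series in (humidity, wind_speed, rainfall):
        s = np.asarray(series, dtype=float)
        span = s.max() - s.min()
        # Guard against a constant series (zero span) to avoid dividing by zero.
        scaled = np.zeros_like(s) if span == 0 else (s - s.min()) / span * 255
        rows.append(scaled)
    return np.stack(rows).astype(np.uint8)  # shape: (3, T)

# Example: three readings each of humidity (%), wind speed (m/s), rainfall (mm/h).
img = to_grayscale_image([40, 55, 90], [2.0, 3.5, 8.1], [0.0, 1.2, 20.5])
```

A dense network could then treat `img` as a small image and learn cross-variable patterns from its rows and columns.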
