Predicting the runoff coefficient (Rc), an indicator of a catchment's response in the rainfall-runoff process, remains a persistent challenge across modelling techniques, especially in catchments subject to strong human manipulation. This study investigates the efficiency of the Long Short-Term Memory (LSTM) method in predicting Rc for the Rur catchment in Germany. The study covers the period from 1961 to 2021, which is marked by human intervention and significant urbanization, especially in the northern part of the catchment. An LSTM model is set up with inputs at monthly resolution (temperature, precipitation, soil water storage, and total evaporation) and look-back windows of 1 to 6 months to model the noisy Rc data of the study area. Two approaches, using either undecomposed or decomposed Rc, were employed in conjunction with the LSTM method to mitigate the impact of noise in Rc. The results show that for undecomposed Rc, the best performance of the LSTM model was obtained with a 4-month look-back window, yielding Nash-Sutcliffe efficiency (NSE) values of 0.55, 0.46, and 0.15 for the training, validation, and test sets, respectively. These results indicate inadequate accuracy in the presence of noise in Rc. Therefore, in a second, novel approach, the maximal overlap discrete wavelet transform (MODWT) was used to decompose Rc up to level 3, reducing the complexity of the signal and distributing the noise across the decomposition levels. The new approach modelled the noisy Rc data with high accuracy, with NSE values of 0.97, 0.95, and 0.90 for the training, validation, and test sets, respectively. These results underscore the pivotal role of decomposition techniques in conjunction with LSTM in accounting for noise, especially in catchments with strong human manipulation.
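To illustrate the decomposition idea, the following is a minimal sketch of an undecimated (MODWT-style) level-3 decomposition using the Haar filter in plain NumPy. It is not the exact MODWT multiresolution analysis used in the study; it assumes circular boundary handling, a synthetic series standing in for Rc, and illustrative function names. The additive property (the series equals the sum of its detail levels plus the final smooth) is what lets each decomposed level be modelled separately before recombining.

```python
import numpy as np

def haar_modwt_style(x, levels=3):
    """Undecimated (a-trous) Haar decomposition: returns detail series
    W1..WJ and the final smooth VJ such that x == W1 + ... + WJ + VJ.
    A simplified stand-in for the MODWT used in the study."""
    x = np.asarray(x, dtype=float)
    details = []
    smooth = x
    for j in range(levels):
        lag = 2 ** j                    # filter upsampled by 2^j at level j+1
        shifted = np.roll(smooth, lag)  # circular boundary handling (assumption)
        details.append((smooth - shifted) / 2.0)  # high-pass: detail at this level
        smooth = (smooth + shifted) / 2.0         # low-pass: carried to next level
    return details, smooth

# synthetic noisy monthly series standing in for Rc (illustrative only)
rng = np.random.default_rng(0)
t = np.arange(120)
rc = 0.4 + 0.1 * np.sin(2 * np.pi * t / 12) + 0.05 * rng.standard_normal(t.size)

details, smooth = haar_modwt_style(rc, levels=3)
recon = sum(details) + smooth
print(np.allclose(recon, rc))  # prints True: additive reconstruction holds
```

In a workflow like the one described, each detail series and the smooth would be fed to (or predicted by) the LSTM separately, and the predictions summed to recover Rc, so that noise concentrated in the finest detail level no longer dominates the training signal.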