Reservoir characterization is an inverse problem in which parameter values associated with the properties of the porous medium are estimated. In this problem, the pressure data and its log-derivative curve are used in the fitting process. However, the numerical differentiation needed to generate the log-derivative curve is an ill-posed problem in which noise in the data can be greatly amplified. This noise can produce several spurious local minima in the objective function and prevent an accurate approximation of the parameters. Therefore, an efficient noise reduction method is required to achieve the desired accuracy in the parameter values. In this work, we explore three noise reduction methods by applying them to well-test data. These methods are based on multi-step finite differences and splines, but use a machine-learning approach. In addition, a new method is proposed that produces an optimal curve as a linear combination of multiple data curves, each of which is an approximation to the solution of the inverse problem. We show that these methods, which use information from the forward model, are more efficient for noise reduction. The proposed methods can be applied to many inverse problems in which a mathematical model is available to describe the measured data.
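
The noise amplification mentioned above can be illustrated with a minimal sketch (not taken from the paper): a synthetic drawdown test whose pressure follows the classical log approximation p(t) = m ln(t) + b, so the log-derivative dp/d(ln t) should be the constant m, yet a finite-difference estimate of it from noisy samples oscillates strongly. The model, parameter values, and noise level below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

t = np.logspace(-2, 2, 200)           # elapsed time (assumed units: hours)
m, b = 10.0, 50.0                     # assumed slope/intercept of the log approximation
p_clean = m * np.log(t) + b           # noise-free pressure response
p_noisy = p_clean + rng.normal(scale=0.1, size=t.size)  # small measurement noise

def log_derivative(t, p):
    """Finite-difference approximation of dp/d(ln t)."""
    return np.gradient(p, np.log(t))

d_clean = log_derivative(t, p_clean)  # close to the constant m everywhere
d_noisy = log_derivative(t, p_noisy)  # fluctuates widely around m

print("spread of clean log-derivative:", d_clean.std())
print("spread of noisy log-derivative:", d_noisy.std())
```

Even a noise level that is barely visible on the pressure curve produces a log-derivative whose spread is an order of magnitude larger, which is why smoothing or model-informed noise reduction is needed before the derivative curve is used in the fitting process.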