In a Hilbert setting, we develop a gradient-based dynamic approach for the fast solution of convex optimization problems. By applying time scaling, averaging, and perturbation techniques to the continuous steepest descent (SD), we obtain high-resolution ordinary differential equations of the Nesterov and Ravine methods. These dynamics involve asymptotically vanishing viscous damping and Hessian-driven damping (in either explicit or implicit form). The mathematical analysis does not require developing a Lyapunov analysis for inertial systems: we simply exploit classical convergence results for SD and its externally perturbed version, and then use tools of differential and integral calculus, including Jensen’s inequality. The method is flexible and, by way of illustration, we show how it applies when starting from other important dynamics in optimization. We consider the case in which the initial dynamic is the regularized Newton method, then the case in which the starting dynamic is the differential inclusion associated with a convex lower semicontinuous potential, and finally we show that the technique extends naturally to the case of a monotone cocoercive operator. Our approach leads to parallel algorithmic results, which we study in the case of fast gradient and proximal algorithms. Our averaging technique reveals new links between the Nesterov and Ravine methods.

Funding: The research of R.I. Boţ and D.-K. Nguyen was supported by the Austrian Science Fund (FWF), projects W 1260 and P 34922-N.
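As a brief sketch of the time scale and averaging step, consider the following worked equations; the particular quadratic time scale and averaging coefficient below, with damping parameter $\alpha > 1$, are illustrative choices made for this sketch rather than statements quoted from the abstract. Let $z$ solve the continuous steepest descent
\[
\dot{z}(s) + \nabla f\bigl(z(s)\bigr) = 0 \qquad \text{(SD)}.
\]
Time scaling via $\tau(t) = t^{2}/\bigl(2(\alpha-1)\bigr)$ gives $y(t) := z(\tau(t))$, which satisfies $\dot{y}(t) = -\frac{t}{\alpha-1}\,\nabla f\bigl(y(t)\bigr)$. Averaging then defines $x$ through
\[
\dot{x}(t) = \frac{\alpha-1}{t}\bigl(y(t) - x(t)\bigr),
\qquad \text{equivalently} \qquad
y(t) = x(t) + \frac{t}{\alpha-1}\,\dot{x}(t).
\]
Differentiating this relation and substituting the equation satisfied by $\dot{y}$ yields the inertial dynamic
\[
\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \nabla f\Bigl(x(t) + \frac{t}{\alpha-1}\,\dot{x}(t)\Bigr) = 0,
\]
which displays the asymptotically vanishing viscous damping $\alpha/t$ and an implicit Hessian-driven damping through the perturbed gradient argument, in line with the high-resolution dynamics described above.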