Ordinary differential equation (ODE) models of optimization methods enable concise convergence-rate proofs via Lyapunov function arguments. The weak discrete gradient (wDG) framework discretizes such ODEs while preserving their convergence properties, and thus serves as a foundation for deriving optimization methods. Although various optimization methods have been derived through the wDG framework, their properties and practical applicability remain underexplored. This study elucidates these aspects through numerical experiments. In particular, although the wDG framework yields several implicit methods, we highlight their potential utility in scenarios where the objective function incorporates a regularization term.
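As a minimal illustration of the last point (a standard fact about implicit discretizations, not a statement of this paper's wDG derivations, and with notation $f$, $h$, $x_k$ chosen here for exposition): the implicit Euler step for the gradient flow $\dot{x} = -\nabla f(x)$ coincides with the proximal map of $f$, which often has a closed form when $f$ is a simple regularizer.
\[
x_{k+1} = x_k - h\,\nabla f(x_{k+1})
\quad\Longleftrightarrow\quad
x_{k+1} = \operatorname*{arg\,min}_{x}\Bigl\{ f(x) + \tfrac{1}{2h}\lVert x - x_k\rVert^2 \Bigr\} = \operatorname{prox}_{hf}(x_k),
\]
where the equivalence follows from the first-order optimality condition for convex differentiable $f$ (a subgradient replaces the gradient in the nonsmooth case). For instance, when $f(x) = \lambda\lVert x\rVert_1$, the proximal map reduces to componentwise soft-thresholding, $\operatorname{prox}_{hf}(x)_i = \operatorname{sign}(x_i)\max(\lvert x_i\rvert - h\lambda,\, 0)$, so the nominally implicit update can be evaluated in closed form; this is why implicit methods can be attractive for regularized objectives.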