Recent developments in numerous scientific computing fields have been made possible by physics-informed neural networks (PINNs), a class of machine learning techniques that incorporate physical knowledge into neural networks. However, PINNs still encounter difficulties in simulating a class of partial differential equations (PDEs), namely the gradient flow equations of complex dynamical systems. Through an analysis from the neural tangent kernel (NTK) perspective, we find that directly training PINNs on a large temporal window may lead to failure. To tackle this problem, we exploit one of the significant physical properties of gradient flow equations, the energy dissipation law, and propose an adaptive energy-based sequential training method for PINNs. Specifically, the gradient of the free energy is employed to determine the adaptive temporal steps of the training window, which also enhances training efficiency. Numerical experiments are conducted on gradient flow equations, including the Cahn-Hilliard, heat conduction, and Allen-Cahn equations. Our method not only outperforms standard PINNs but also preserves the energy dissipation law, demonstrating its ability to solve complex long-time simulation problems of gradient flow systems.
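To illustrate the core idea of the adaptive windowing described above, the following is a minimal sketch, not the authors' actual implementation: the length of each sequential training window is chosen inversely proportional to the magnitude of the free-energy gradient |dE/dt|, so windows shrink where the energy dissipates quickly and grow where the dynamics are slow. The function name `adaptive_window_steps` and all parameters (`c`, `dt_min`, `dt_max`) are hypothetical choices for this sketch.

```python
import numpy as np

def adaptive_window_steps(free_energy, t0, t_end,
                          dt_min=1e-3, dt_max=0.5, c=0.1, eps=1e-8):
    """Sketch of energy-gradient-based adaptive time windows.

    `free_energy` is a callable E(t), e.g. evaluated from the currently
    trained network. The next window length is taken inversely
    proportional to |dE/dt| (clipped to [dt_min, dt_max]), reflecting
    the energy dissipation law of gradient flow systems.
    """
    boundaries = [t0]
    t = t0
    h = 1e-4  # finite-difference step for estimating dE/dt
    while t < t_end:
        dEdt = (free_energy(t + h) - free_energy(t)) / h
        dt = float(np.clip(c / (abs(dEdt) + eps), dt_min, dt_max))
        t = min(t + dt, t_end)
        boundaries.append(t)
    return boundaries

# Toy dissipating free energy: fast decay early, slow decay later
E = lambda t: np.exp(-5.0 * t)
windows = adaptive_window_steps(E, 0.0, 2.0)
# Early windows are short (rapid dissipation); later ones approach dt_max
```

In a full sequential training scheme, the PINN would be trained on each window `[boundaries[k], boundaries[k+1]]` in turn, using the converged solution of one window as the initial condition for the next.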