In several superconducting applications, such as superconducting generators, motors, and power transmission cables, the superconductor experiences a changing magnetic field in a DC background. Simulating the losses caused by this AC ripple field is an important task from the application design point of view. In this work, we compare two formulations for simulating ripple field losses in a DC biased coated conductor tape: the H-formulation and the minimum magnetic energy variation (MMEV) formulation, based on the eddy current model (ECM) and the critical state model (CSM), respectively. Furthermore, we compare our simulation results with measurements. We investigate the frequency dependence of the hysteresis loss predictions of the power-law-based ECM and verify by measurements that, in DC use, ECM clearly overestimates the homogenization of the current density profile in the coated conductor tape: the relaxation of the local current density is not nearly as prominent in the measurements as it is in the simulations. Hence, we suggest that the power-law resistivity, used in ECM as the local relation between the electric field intensity E and the current density J, is not an intrinsic property of high-temperature superconductors. The difference between the models manifests itself as discrepancies in ripple field loss simulations at very low AC fields when significant DC fields or currents are involved. The results also show, however, that for many practical situations, both CSM and ECM are suitable models for ripple field loss simulations.
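For reference, the power-law E–J relation conventionally used in ECM for high-temperature superconductors takes the following form; this is the standard textbook expression, and the exact parametrization used in this work may differ in detail:

\[
\mathbf{E}(\mathbf{J}) = E_{\mathrm{c}} \left( \frac{|\mathbf{J}|}{J_{\mathrm{c}}} \right)^{n} \frac{\mathbf{J}}{|\mathbf{J}|},
\]

where \(E_{\mathrm{c}}\) is the critical electric field criterion (conventionally 1 µV/cm), \(J_{\mathrm{c}}\) is the critical current density, and \(n\) is the power-law exponent. The CSM corresponds to the limit \(n \to \infty\), in which the current density is constrained to \(|\mathbf{J}| \leq J_{\mathrm{c}}\) and no relaxation of the current density profile occurs.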