Random number generators (RNGs) are widely used as fundamental security components in most cryptographic applications, and they rely mainly on the randomness provided by entropy sources. If an entropy source provides less randomness than expected, the RNG may be compromised, impairing the security of the entire cryptographic application. However, the common assumptions about entropy sources (e.g., that their outputs are independent and identically distributed, i.e., IID) may not always hold. For example, many entropy sources are based on physical phenomena that are fragile and sensitive to external factors (e.g., temperature), so the distributions of their outputs change continuously over time. As important tools for measuring the quality of entropy sources, existing entropy estimation methods may give false estimates on such time-varying data, because they cannot detect changes in the data distribution. In this paper, we first review and analyze typical existing entropy estimators, including the NIST SP 800-90B (90B for short) estimators and the recently proposed neural-network-based (NN-based) estimators, with a focus on their limitations on the aforementioned time-varying data. Second, we propose an entropy estimation framework that adopts change detection techniques to address this problem. In contrast to the NN-based estimators, the estimator under our framework employs a change detection method to preprocess the tested data and appends additional distribution features to each data sample, which makes it possible to learn the distribution changes and estimate the entropy more accurately. Finally, we evaluate the performance of our estimator on various kinds of simulated and real-world data, and compare it with the 90B estimators and the NN-based estimators. Extensive evaluations demonstrate that the proposed estimator provides similar or more accurate entropy estimates than the other estimators, especially for time-varying data.