Recently, deep learning models have proliferated in power-demand forecasting for efficient planning of power consumption. However, the “black-box” nature of deep learning hinders the establishment of a concrete plan because such models cannot explain the causes of their predictions. Several recent attempts explain deep learning outputs by analyzing the input attributes that influence the prediction, but they fail to provide adequate explanations because they ignore the time-series properties of the input data. In this paper, we propose a deep learning model that explains the impact of the input attributes on the prediction by taking into account the long-term and short-term properties of time-series forecasting. The model consists of (i) two encoders that represent the power information for prediction and explanation, (ii) a decoder that predicts the power demand from the concatenated outputs of the encoders, and (iii) an explainer that identifies the attributes most significant for predicting the energy consumption. A Kullback–Leibler divergence term in the loss function induces long-term and short-term dependencies in the latent space constructed by the second encoder. Experiments on a benchmark dataset of household electric energy demand show that the proposed method appropriately explains its predictions with the most influential input attributes under both long-term and short-term dependencies. We can trade the gain of a time-series explanation of the result against a slight degradation in prediction performance.
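The overall training objective described above can be sketched as follows. This is a minimal, illustrative NumPy mock-up, not the paper's actual architecture: the linear "encoders", all layer shapes, the standard-normal prior, and the weight `beta` are assumptions chosen only to show how a Kullback–Leibler term added to the prediction loss regularizes the second encoder's latent space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a batch of 32 flattened windows (24 steps x 7 attributes).
x = rng.normal(size=(32, 24 * 7))
y = rng.normal(size=(32, 1))          # ground-truth demand (synthetic)

def encode(x, w):
    """Toy linear encoder standing in for a sequence encoder."""
    return np.tanh(x @ w)

# Two encoders: one for prediction, one for the explanation latent space.
w_pred = rng.normal(scale=0.1, size=(24 * 7, 16))
w_expl = rng.normal(scale=0.1, size=(24 * 7, 16))
z_pred = encode(x, w_pred)
z_expl = encode(x, w_expl)

# Decoder predicts demand from the concatenated encoder outputs.
w_dec = rng.normal(scale=0.1, size=(32, 1))
y_hat = np.concatenate([z_pred, z_expl], axis=1) @ w_dec

# KL(N(mu, sigma^2) || N(0, 1)) per latent dimension, summed; this term
# shapes the latent space built by the second (explanation) encoder.
mu = z_expl.mean(axis=0)
log_var = np.log(z_expl.var(axis=0) + 1e-8)
kl = 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

beta = 0.1  # assumed trade-off weight: explainability vs. prediction accuracy
loss = np.mean((y_hat - y) ** 2) + beta * kl
```

Increasing `beta` strengthens the structure imposed on the explanation latent space at the cost of some prediction accuracy, mirroring the trade-off noted in the abstract.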