The Deep Auto-Encoder (DAE) is a promising deep learning model for extracting features from data. However, it is difficult to obtain the desired network parameters through training because of an inappropriately chosen learning rate, which often degrades its feature-extraction performance. To address this issue, in this paper we propose a new Variable Learning Speed DAE (VLSDAE) that uses Multiscale Reconstruction Errors (MRE) and Weights Update Correlation (WUC) to adaptively adjust its learning rate. Specifically, reconstruction errors at multiple scales, i.e., mini-batch, epoch, and phase, are considered to reflect the macroscopic training effect. Moreover, to further represent the microscopic training state, WUC is proposed to characterize the training effect of each neuron. We first give a complexity analysis and a theoretical proof of the convergence of VLSDAE. To further exhibit its performance, a parameter sensitivity analysis is then provided to investigate the effects of two key parameters. In addition, the proposed MRE, WUC, and their integration are studied individually to reveal their learning effects. Finally, we compare VLSDAE with eight state-of-the-art algorithms. The experimental results demonstrate that VLSDAE outperforms all eight competitors in terms of training error, classification accuracy, precision, recall, and macro-F1.
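To illustrate the general idea of adjusting a learning rate from reconstruction errors observed at several scales, the following is a minimal sketch. It assumes a simple hypothetical rule (shrink the rate if the error rose at any scale, grow it gently if it fell at every scale); the function name `multiscale_lr` and the factors `up`/`down` are illustrative assumptions, not the actual VLSDAE update rule, which also incorporates WUC and is derived in the paper.

```python
def multiscale_lr(lr, errs, prev_errs, up=1.05, down=0.7,
                  lr_min=1e-5, lr_max=1.0):
    """Hypothetical multiscale learning-rate rule (illustration only).

    errs / prev_errs: current and previous reconstruction errors at the
    three scales (mini-batch, epoch, phase)."""
    # If the error rose at any scale, training may be overshooting: damp lr.
    rose = any(e > p for e, p in zip(errs, prev_errs))
    lr *= down if rose else up
    # Keep the rate inside a safe bracket.
    return max(lr_min, min(lr_max, lr))
```

For example, an error increase at the mini-batch scale alone is enough to reduce the rate, whereas a simultaneous decrease at all three scales lets it grow slightly.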