ABSTRACT A multi-scale, attention-guided deep learning model is proposed to classify breast tissue in digital histology images (H&E stained) into four histological types: normal, benign, in situ carcinoma, and invasive carcinoma. The framework comprises two parallel convolutional neural networks based on a modified VGG16 architecture. The first network analyzes whole-sample images at low magnification; the second analyzes patches extracted from the whole-sample images at high magnification. In the low-magnification network, a global average pooling layer is appended to the network to extract class activation maps for the attention model, and a long short-term memory network is adapted as a recurrent attention mechanism that increases the contribution of the most relevant regions of each image to the classification. In the high-magnification network, the probability vectors are averaged over all patches extracted from an image to obtain, for each sample, a probability vector over the four histological types. For each sample, the probability vectors from the high-magnification network and from the attention model are fused by a multilayer perceptron network to produce the final classification label. On an independent test set, the proposed model achieved an average accuracy of 97.5% ± 1.0%, compared with 94.5% and 93.5% for the separate high- and low-magnification networks, respectively, and 96.3% for the multi-scale model without the attention mechanism. These results suggest that a multi-scale strategy coupled with an attention mechanism can improve the accuracy of deep learning models for classifying digital histology images.
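
The aggregation and fusion steps described above can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the function names, the example probability values, and the simple averaging fusion (standing in for the paper's multilayer perceptron) are all hypothetical, chosen to show the shapes of the per-patch probability vectors and the final class decision.

```python
import numpy as np

# The four histological types named in the abstract.
CLASSES = ["normal", "benign", "in_situ_carcinoma", "invasive_carcinoma"]

def average_patch_probs(patch_probs):
    """Aggregate the high-magnification network's per-patch softmax
    outputs into one probability vector per sample (element-wise mean
    over patches, as described in the abstract)."""
    return np.mean(patch_probs, axis=0)

def fuse_and_classify(high_mag_vec, attention_vec):
    """Hypothetical fusion stand-in: the paper fuses the two probability
    vectors with a multilayer perceptron; here a plain average is used
    purely to illustrate the inputs and the final argmax decision."""
    fused = (high_mag_vec + attention_vec) / 2.0
    return CLASSES[int(np.argmax(fused))]

# Example: softmax outputs for three patches from one sample (made-up values).
patches = np.array([
    [0.10, 0.20, 0.60, 0.10],
    [0.00, 0.10, 0.80, 0.10],
    [0.20, 0.10, 0.60, 0.10],
])
high_mag = average_patch_probs(patches)
# Made-up probability vector from the low-magnification attention model.
attention = np.array([0.05, 0.05, 0.70, 0.20])
print(fuse_and_classify(high_mag, attention))  # → in_situ_carcinoma
```

In the paper the fusion step is learned (a multilayer perceptron over the concatenated vectors), so the trained model can weight the two magnification streams unevenly; the fixed average above is only a shape-level sketch.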