With continued development, image classification algorithms based on deep learning have shown good performance on some large datasets. Among these algorithms, many proposals built on attention mechanisms have greatly improved model accuracy while also making network structures more interpretable. On medical image data, however, classification performance falls short of expectations: the images are fine-grained, with only small differences between classes, so the underlying knowledge domain is also hard for models to learn. In this work, we (1) propose an EfficientNet model equipped with the CBAM attention mechanism and a multi-scale fusion method; (2) apply the model to a breast cancer medical image dataset and complete the breast cancer staging classification task (Stage I, Stage II, Stage III, etc.) with high accuracy; and (3) compare our method with other existing image classification algorithms, among which it achieves the highest accuracy, leading us to conclude that EfficientNet with CBAM and multi-scale fusion improves classification performance. These results support deeper research on medical image processing and breast cancer staging.
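The CBAM mechanism mentioned above refines a feature map in two stages: a channel attention step (average- and max-pooled descriptors passed through a shared MLP) followed by a spatial attention step (channel-wise average and max maps passed through a 7x7 convolution). A minimal NumPy sketch of this block is given below; the reduction ratio, random weights, and kernel size are illustrative assumptions, not the trained parameters of the proposed model:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    # x: (C, H, W). A shared two-layer MLP (w1: C x C/r, w2: C/r x C)
    # is applied to both the average- and max-pooled channel descriptors,
    # and the two outputs are summed before the sigmoid.
    mlp = lambda v: np.maximum(v @ w1, 0.0) @ w2
    avg = x.mean(axis=(1, 2))                 # (C,)
    mx = x.max(axis=(1, 2))                   # (C,)
    return sigmoid(mlp(avg) + mlp(mx))        # (C,) channel weights in (0, 1)

def spatial_attention(x, kernel):
    # x: (C, H, W); kernel: (2, k, k). Channel-wise average and max maps
    # are stacked and convolved with a single k x k kernel (k = 7 in CBAM).
    feat = np.stack([x.mean(axis=0), x.max(axis=0)])  # (2, H, W)
    k = kernel.shape[-1]
    p = k // 2
    padded = np.pad(feat, ((0, 0), (p, p), (p, p)))
    _, H, W = x.shape
    out = np.empty((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(padded[:, i:i + k, j:j + k] * kernel)
    return sigmoid(out)                       # (H, W) spatial weights in (0, 1)

def cbam(x, w1, w2, kernel):
    # Channel attention first, then spatial attention, each rescaling x.
    x = x * channel_attention(x, w1, w2)[:, None, None]
    return x * spatial_attention(x, kernel)[None, :, :]
```

Because both attention maps lie in (0, 1), the block only rescales the input feature map; its output shape matches the input, so it can be dropped between stages of a backbone such as EfficientNet.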