Accurate and unbiased classification of breast lesions is pivotal for early diagnosis and treatment, and deep learning can effectively represent and exploit the digital content of images for more precise medical image analysis. Breast ultrasound imaging is useful for detecting masses and distinguishing benign from malignant ones. Motivated by the different ways in which benign and malignant tumors affect neighboring tissues, namely the pattern of growth and border irregularities, the degree of penetration into adjacent tissue, and tissue-level changes, we investigated the relationship between breast cancer imaging features and the roles of intra- and extra-lesional tissue, and their impact on refining the performance of deep learning classification. The novelty of the proposed approach lies in classifying features extracted both from the tissue inside the tumor (obtained by performing an erosion operation) and from the lesion together with its surrounding tissue (obtained by performing a dilation operation). This study uses these new features and three pre-trained deep neural networks to address the challenge of breast lesion classification in ultrasound images. To improve classification accuracy and interpretability, the proposed model leverages transfer learning to accelerate training: three modern pre-trained CNN architectures (MobileNetV2, VGG16, and EfficientNet-B7) are used for transfer learning and fine-tuned for optimization. Because neural networks can produce erroneous outputs in the presence of noisy images, variations in input data, or adversarial attacks, the proposed system uses the BUS-BRA database (two classes: benign and malignant) for training and testing and the unseen BUSI database (two classes: benign and malignant) for additional testing. Extensive experiments record accuracy and AUC as performance metrics.
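The intra- and extra-lesional regions of interest described above can be sketched with simple binary morphology, assuming a binary lesion mask is available. This is a minimal illustration only: the function names, the 3x3 cross structuring element, and the single-iteration default are assumptions for the sketch, not the authors' exact implementation.

```python
import numpy as np

def dilate(mask: np.ndarray, iterations: int = 1) -> np.ndarray:
    """Binary dilation with a 3x3 cross structuring element.

    Grows the lesion mask outward, yielding a region that covers the
    lesion plus a ring of surrounding tissue (extra-lesional ROI).
    """
    out = mask.astype(bool)
    for _ in range(iterations):
        p = np.pad(out, 1)  # pad with False so the border is handled safely
        out = (p[1:-1, 1:-1] | p[:-2, 1:-1] | p[2:, 1:-1]
               | p[1:-1, :-2] | p[1:-1, 2:])
    return out

def erode(mask: np.ndarray, iterations: int = 1) -> np.ndarray:
    """Binary erosion with a 3x3 cross structuring element.

    Shrinks the lesion mask inward, keeping only interior tumor tissue
    (intra-lesional ROI).
    """
    out = mask.astype(bool)
    for _ in range(iterations):
        p = np.pad(out, 1)  # padding with False erodes at the image border
        out = (p[1:-1, 1:-1] & p[:-2, 1:-1] & p[2:, 1:-1]
               & p[1:-1, :-2] & p[1:-1, 2:])
    return out

# Toy example: a 3x3 square lesion in a 7x7 image.
mask = np.zeros((7, 7), dtype=bool)
mask[2:5, 2:5] = True
inner = erode(mask)   # tumor-interior ROI
outer = dilate(mask)  # lesion + surrounding-tissue ROI
# Masked crops such as `image * inner` or `image * outer` would then be
# fed to the pre-trained CNNs as the two feature sources.
```

In practice a library routine (e.g., `scipy.ndimage.binary_erosion`/`binary_dilation`) with a task-appropriate structuring element and iteration count would replace these hand-rolled loops; the sketch only makes the geometry of the two ROIs concrete.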
The results indicate that the proposed system outperforms the existing breast cancer detection algorithms reported in the literature; AUC values of 1.00 are obtained for VGG16 and EfficientNet-B7 in the dilation cases. The proposed approach should facilitate this challenging and time-consuming classification task.