Purpose: To assess the performance of transferred features from pre-trained deep convolutional neural networks (CNNs) in the task of classifying cancer in breast ultrasound images, and to compare this transfer learning approach with previous methods based on human-designed features.

Methods: A breast ultrasound dataset consisting of 1125 cases and 2393 regions of interest (ROIs) was used. Each ROI was labeled as cystic, benign, or malignant. Features were extracted from each ROI using pre-trained CNNs and used to train support vector machine (SVM) classifiers for two tasks: distinguishing non-malignant (benign + cystic) from malignant lesions, and distinguishing benign from malignant lesions. For a baseline comparison, classifiers were also trained on previously developed, analytically extracted tumor features. Five-fold cross-validation (by case) was conducted with the area under the receiver operating characteristic curve (AUC) as the performance metric.

Results: Classifiers trained on CNN-extracted features performed comparably to classifiers trained on human-designed features. In the non-malignant vs. malignant task, both the SVM trained on CNN-extracted features and the SVM trained on human-designed features obtained an AUC of 0.90. In the benign vs. malignant task, the SVM trained on CNN-extracted features obtained an AUC of 0.88, compared with 0.85 for the SVM trained on human-designed features.

Conclusion: We obtained strong results using transfer learning to characterize breast cancer in ultrasound images. This method allows direct classification of a small dataset of lesions in a computationally inexpensive fashion, without manual input. Modern deep learning methods in computer vision depend on large datasets and vast computational resources, which are often inaccessible in clinical applications.
Consequently, we believe transfer learning methods will be important for computer-aided diagnosis schemes, allowing them to leverage advances in deep learning and computer vision without the associated costs.

Funding and disclosures: This work was partially funded by NIH grant U01 CA195564 and the University of Chicago Metcalf program. M.L.G. is a stockholder in R2/Hologic, a co-founder and equity holder in Quantitative Insights, and receives royalties from Hologic, GE Medical Systems, MEDIAN Technologies, Riverain Medical, Mitsubishi, and Toshiba. K.D. received royalties from Hologic.
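The evaluation pipeline described in Methods can be sketched as follows. This is a minimal illustration, not the study's actual code: the dataset and CNN are not specified here, so the CNN-extracted features, labels, and case identifiers are stood in by hypothetical placeholder arrays. The sketch shows the key methodological point, case-grouped 5-fold cross-validation (so all ROIs from one case stay in the same fold) of an SVM scored by AUC, using scikit-learn.

```python
# Hypothetical sketch of the Methods pipeline: SVM on CNN-extracted
# features, evaluated with case-grouped 5-fold cross-validation and AUC.
import numpy as np
from sklearn.model_selection import GroupKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Placeholder data: in the study, X would hold features extracted from a
# pre-trained CNN (one row per ROI), y the lesion label, and `cases` the
# source case of each ROI (a case can contribute several ROIs).
n_rois, n_features = 200, 512
X = rng.normal(size=(n_rois, n_features))
y = rng.integers(0, 2, size=n_rois)        # 0 = non-malignant, 1 = malignant
cases = rng.integers(0, 80, size=n_rois)   # case ID per ROI

aucs = []
cv = GroupKFold(n_splits=5)  # split by case, not by ROI, to avoid leakage
for train_idx, test_idx in cv.split(X, y, groups=cases):
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X[train_idx], y[train_idx])
    scores = clf.decision_function(X[test_idx])  # continuous scores for AUC
    aucs.append(roc_auc_score(y[test_idx], scores))

print(f"mean AUC over 5 folds: {np.mean(aucs):.2f}")
```

With random placeholder features the AUC is near chance; with real CNN features and labels the same loop yields the per-fold AUCs that are averaged to the reported figures. Grouping the folds by case is the detail that matters: splitting by ROI would let ROIs from one patient appear in both training and test sets and inflate the AUC.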