One of the goals of AI-based computational pathology is to generate compact representations of whole slide images (WSIs) that capture the essential information needed for diagnosis. While such approaches have been applied to histopathology, few applications have been reported in cytology. Bone marrow aspirate cytology is the basis for key clinical decisions in hematology. However, visual inspection of aspirate specimens is a tedious and complex process subject to variation in interpretation, and hematopathology expertise is scarce. The ability to generate a compact representation of an aspirate specimen may therefore form the basis for clinical decision-support tools in hematology. In this study, we leverage our previously published end-to-end AI-based system for counting and classifying cells in bone marrow aspirate WSIs, which enables the direct use of individual cells, rather than WSI patches, as inputs. We then construct bags of individual cell features from each WSI and apply multiple instance learning to extract slide-level vector representations. To evaluate the quality of these representations, we conducted WSI retrieval and classification tasks. Our results show a mAP@10 of 0.58 ± 0.02 in WSI-level image retrieval, surpassing the random-retrieval baseline of 0.39 ± 0.1. Furthermore, we predicted five diagnostic labels for individual aspirate WSIs with a weighted-average F1 score of 0.57 ± 0.03 using a k-nearest-neighbors (k-NN) model, outperforming guessing based on empirical class prior probabilities (0.26 ± 0.02). To our knowledge, this is the first study to explore trainable mechanisms for generating compact, slide-level representations in bone marrow cytology with deep learning. This method has the potential to summarize complex semantic information in WSIs toward improved diagnostics in hematology, and may eventually support AI-assisted computational pathology approaches.
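The core idea of pooling a bag of per-cell features into a single slide-level vector can be illustrated with a minimal sketch. The snippet below shows attention-based multiple instance pooling, a common MIL aggregation scheme; the feature dimension, hidden size, and parameter names (`W`, `w`) are illustrative assumptions, not the study's actual architecture, and the random matrices stand in for learned parameters and real cell embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(bag, W, w):
    """Pool a bag of per-cell feature vectors into one slide-level vector.

    bag: (n_cells, d) matrix of cell features;
    W: (d, h) and w: (h,) are hypothetical learned attention parameters.
    Returns a (d,) convex combination of the cell features.
    """
    scores = np.tanh(bag @ W) @ w   # (n_cells,) attention logits per cell
    alpha = softmax(scores)         # attention weights, sum to 1
    return alpha @ bag              # attention-weighted mean of the bag

# Toy bag: 200 cells with 64-dim features (stand-ins for real embeddings)
bag = rng.normal(size=(200, 64))
W = rng.normal(size=(64, 32))
w = rng.normal(size=32)
z = attention_pool(bag, W, w)
print(z.shape)  # one fixed-size vector per slide, regardless of cell count
```

In a trainable setting, `W` and `w` would be optimized end to end together with a downstream objective, so the attention weights learn to emphasize diagnostically informative cells.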
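The retrieval metric reported above, mAP@10, can be computed from slide embeddings alone. The sketch below is a generic leave-one-out implementation under assumed choices (cosine similarity for ranking, one diagnostic label per slide); the source does not specify these details, so treat it as one plausible evaluation setup rather than the study's exact protocol.

```python
import numpy as np

def average_precision_at_k(retrieved_labels, query_label, k=10):
    """AP@k for one query: mean precision@i over ranks i with a relevant hit."""
    hits, precisions = 0, []
    for i, lab in enumerate(retrieved_labels[:k], start=1):
        if lab == query_label:
            hits += 1
            precisions.append(hits / i)
    return float(np.mean(precisions)) if precisions else 0.0

def map_at_k(embeddings, labels, k=10):
    """mAP@k over all slides: each slide queries the rest by cosine similarity."""
    X = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = X @ X.T
    np.fill_diagonal(sims, -np.inf)  # exclude the query slide itself
    aps = []
    for q in range(len(labels)):
        order = np.argsort(-sims[q])[:k]  # top-k most similar slides
        aps.append(average_precision_at_k(labels[order], labels[q], k))
    return float(np.mean(aps))
```

On embeddings that cluster perfectly by label, this returns 1.0; random embeddings approach the class-prior baseline, which is the comparison the abstract reports.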