Personalized outfit compatibility learning is an emerging yet challenging task. Most existing methods focus on general outfit compatibility learning. Although a few works address personalized fashion compatibility, they either model user preference for fashion items with specific patterns or design elements, or recommend outfits based on overall visual similarity to the user's preferred collections. This paper adopts physical and fashion attributes for effective personalized fashion compatibility evaluation and recommendation. The physical attributes are grouped into seven aspects: body shape, skin color, hairstyle, hair color, height, breast size, and color contrast. Personalized outfit compatibility is formulated as a multi-label classification problem, expressed as an optimization function that takes outfit images, fashion attributes, and physical attributes as input. This is the first attempt to solve the problem by discovering the correlation among visual image features, fashion attributes, and physical attributes. Specifically, the correlation is learned with two transformer encoders that update the attention weights of different embedding pairs during training. The model not only predicts the fashion attributes of the outfit's top, bottom, shoes, and bag items, but also predicts the physical attributes of an individual that are incompatible with the given outfit. It can thus recommend outfits that best fit an individual, and the predicted fashion attributes can be used to explain the results. The O4U dataset, which contains rich annotations of fashion item attributes and of the human physical attributes associated with the outfits, is used to evaluate the proposed method. Quantitative and qualitative results show that the proposed method outperforms state-of-the-art methods for personalized outfit compatibility evaluation.
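The abstract describes two transformer encoders that correlate image, fashion-attribute, and physical-attribute embeddings, with multi-label heads for fashion attributes and incompatible physical attributes. A minimal PyTorch sketch of that shape is given below; all dimensions, layer counts, and names (e.g. `OutfitCompatibilitySketch`, the mean-pooling readout, the seven-way physical head) are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class OutfitCompatibilitySketch(nn.Module):
    """Illustrative two-encoder layout (assumed, not the paper's exact model):
    one transformer encoder attends over per-item outfit embeddings, a second
    attends over physical-attribute embeddings jointly with the outfit, and
    two linear heads emit multi-label logits."""

    def __init__(self, dim=64, n_fashion_labels=50, n_physical_labels=7):
        super().__init__()
        outfit_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        person_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.outfit_encoder = nn.TransformerEncoder(outfit_layer, num_layers=2)
        self.person_encoder = nn.TransformerEncoder(person_layer, num_layers=2)
        # Multi-label logits for fashion attributes of the outfit items
        self.fashion_head = nn.Linear(dim, n_fashion_labels)
        # Logits for the seven physical attributes flagged as incompatible
        self.physical_head = nn.Linear(dim, n_physical_labels)

    def forward(self, item_emb, phys_emb):
        # item_emb: (B, n_items, dim) image + fashion-attribute embeddings
        # phys_emb: (B, n_phys, dim) physical-attribute embeddings
        outfit = self.outfit_encoder(item_emb).mean(dim=1)
        # Attention over concatenated tokens lets physical and outfit
        # embeddings attend to each other (one plausible pairing scheme)
        person = self.person_encoder(torch.cat([phys_emb, item_emb], dim=1)).mean(dim=1)
        return self.fashion_head(outfit), self.physical_head(person)

model = OutfitCompatibilitySketch()
# Toy batch: 2 outfits of 4 items (top, bottom, shoes, bag), 7 physical attributes
fashion_logits, physical_logits = model(torch.randn(2, 4, 64), torch.randn(2, 7, 64))
print(fashion_logits.shape, physical_logits.shape)
```

Training such a model as a multi-label classifier would typically use a per-label sigmoid loss such as `nn.BCEWithLogitsLoss` over both heads.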