Abstract

Clothing plays an important role in daily life: a well-styled outfit can downplay a person's weaknesses and express their personal temperament, yet not everyone is skilled at styling. Compatibility lies at the core of styling, and determining whether a pair of garments is compatible remains a challenging problem. Years of research have been devoted to fashion compatibility learning, but existing methods still have drawbacks in visual feature extraction and compatibility computation. In this paper, we propose an end-to-end framework to learn the compatibility between tops and bottoms. To improve visual feature extraction, we develop a Multi-layer Non-local Feature Fusion framework (MNLFF): a feature fusion module combines both high-level and low-level features, while non-local blocks capture global feature dependencies. We compare our technique with prior state-of-the-art methods on the outfit compatibility prediction task, and extensive experiments on existing datasets demonstrate its effectiveness.
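To make the non-local block concrete, the following is a minimal NumPy sketch of the standard non-local (self-attention) operation over a feature map, in the spirit of Wang et al.'s non-local neural networks. It is not the paper's implementation: the function name, the 1x1-convolution-as-matrix simplification, and the halved embedding dimension are all illustrative assumptions.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def non_local_block(x, rng):
    """Apply one non-local (self-attention) block to a (C, H, W) feature map.

    In a real network the theta/phi/g/w_z projections are learned 1x1
    convolutions; here they are random matrices for illustration only.
    """
    C, H, W = x.shape
    Ci = C // 2  # reduced embedding dimension (a common choice)
    theta = rng.standard_normal((Ci, C)) * 0.01
    phi = rng.standard_normal((Ci, C)) * 0.01
    g = rng.standard_normal((Ci, C)) * 0.01
    w_z = rng.standard_normal((C, Ci)) * 0.01

    flat = x.reshape(C, H * W)            # flatten spatial positions: (C, N)
    q = theta @ flat                      # query embeddings  (Ci, N)
    k = phi @ flat                        # key embeddings    (Ci, N)
    v = g @ flat                          # value embeddings  (Ci, N)

    # Pairwise response between every position and every other position,
    # normalized per query position -- this is the "global" dependency.
    attn = softmax(q.T @ k, axis=-1)      # (N, N)
    y = v @ attn.T                        # aggregate values  (Ci, N)

    out = flat + w_z @ y                  # residual connection keeps input
    return out.reshape(C, H, W)
```

The key property, unlike a convolution's local receptive field, is that every spatial position attends to every other position in one step, which is what lets such a block detect global visual structure.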
