Abstract
Interior style classification is an interesting problem with potential applications in both commercial and academic communities. The task aims to identify interior design styles automatically, so that interior designers can explore customers’ tastes and provide precise suggestions for decor inspiration based on their preferences. Recently, Convolutional Neural Networks (CNNs) have been considered the de facto standard for computer vision tasks, and several recent works have therefore addressed interior style classification with CNN-based architectures. Meanwhile, transformer-based architectures and attention-based encoder–decoder models have proven successful in both computer vision and natural language processing. Subsequently, a growing number of studies have examined the efficiency of combining CNN-based and transformer-based architectures for general image classification problems. In this work, we focus on finding a network architecture suited to the interior style classification problem. We propose a robust method for interior design style classification, named ISC-DeiT. The proposed method is based on the Data-efficient image Transformer (DeiT) architecture and knowledge distillation, which allows effective training on small datasets. In particular, an additional module is plugged in to enrich the learned feature representations and improve predictive accuracy. Experiments were carried out on a newly curated dataset with five interior styles: Art-Decor, Hitech, Indochina, Industrial, and Scandinavian. Empirical results indicate that ISC-DeiT significantly improves prediction accuracy for interior style classification compared with other state-of-the-art methods.
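While the abstract does not detail the ISC-DeiT architecture or its additional module, the minimal sketch below illustrates the general setup it describes: fine-tuning a distilled DeiT backbone for the five interior style classes. The use of the timm library, the specific model variant, and the class ordering are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' ISC-DeiT code): fine-tuning a distilled
# DeiT backbone from timm for five interior styles. The model variant and
# class order below are assumptions for illustration only.
import torch
import timm

STYLES = ["Art-Decor", "Hitech", "Indochina", "Industrial", "Scandinavian"]

# DeiT pretrained with a distillation token; setting num_classes replaces
# both the class-token and distillation-token heads with 5-way classifiers.
model = timm.create_model(
    "deit_base_distilled_patch16_224",
    pretrained=True,
    num_classes=len(STYLES),
)
model.eval()  # in eval mode, timm averages the two heads' logits

# Dummy forward pass on one 224x224 RGB image.
x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    logits = model(x)
print(STYLES[logits.argmax(dim=1).item()])
```

In training, the distillation head would additionally be supervised by a teacher network (e.g., a CNN), which is the knowledge-distillation mechanism that lets DeiT-style models train effectively on small datasets.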