Abstract

Natural image matting algorithms aim to predict the transparency map (alpha matte) of an image under trimap guidance. However, producing trimaps is labor-intensive, which limits the large-scale application of matting algorithms. To address this issue, we propose the Matte Anything model (MatAny), an interactive natural image matting model that can produce high-quality alpha mattes from various simple hints. The key insight of MatAny is to generate pseudo trimaps automatically from contour and transparency predictions. In our work, we leverage vision foundation models to enhance the performance of natural image matting. Specifically, we use the Segment Anything Model to predict high-quality contours from user interactions and an open-vocabulary detector to predict the transparency of any object. A pre-trained image matting model then generates alpha mattes from the pseudo trimaps. MatAny supports more interaction methods than any previous interactive matting algorithm and achieves the best performance to date. It is composed of orthogonal vision models and requires no additional training. We evaluate MatAny against several previous image matting algorithms: compared to prior matting methods with simple guidance, it improves MSE by 58.3% and SAD by 40.6%, achieving new state-of-the-art (SOTA) performance. The source code and pre-trained models are available at https://github.com/hustvl/Matte-Anything.
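To make the pseudo-trimap step concrete, the following is a minimal sketch of how a trimap might be derived from a segmentation contour and a transparency flag; it is an illustration under stated assumptions, not the authors' implementation. The function name `make_pseudo_trimap` and the kernel sizes are hypothetical: the idea is to erode the mask for a confident foreground core, dilate it to bound the unknown band, and, when the detector flags the object as transparent, treat the entire object region as unknown.

```python
import cv2
import numpy as np

def make_pseudo_trimap(mask: np.ndarray, is_transparent: bool,
                       erode_px: int = 10, dilate_px: int = 10) -> np.ndarray:
    """Illustrative pseudo-trimap from a binary mask (not the paper's exact rule).

    mask: HxW array, nonzero for foreground pixels (e.g., a SAM mask).
    Returns an HxW uint8 trimap: 0 = background, 128 = unknown, 255 = foreground.
    """
    mask = (mask > 0).astype(np.uint8)
    if is_transparent:
        # Transparent objects (glass, smoke, ...) have no reliable opaque core,
        # so the whole object region is left as unknown for the matting model.
        fg = np.zeros_like(mask)
    else:
        kernel = np.ones((erode_px, erode_px), np.uint8)
        fg = cv2.erode(mask, kernel)           # confident foreground core
    kernel = np.ones((dilate_px, dilate_px), np.uint8)
    band = cv2.dilate(mask, kernel)            # outer bound of the unknown band

    trimap = np.zeros(mask.shape, np.uint8)    # background by default
    trimap[band > 0] = 128                     # unknown band around the contour
    trimap[fg > 0] = 255                       # confident foreground
    return trimap
```

The resulting trimap, together with the RGB image, would then be passed to any trimap-based matting network to produce the final alpha matte.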
