Abstract
Pixel art is a unique art style with the appearance of low-resolution images. In this paper, we propose a data-driven pixelization method that produces sharp and crisp cell effects with controllable cell sizes. Our approach overcomes the limitation of existing learning-based methods in cell size control by introducing a reference pixel art to explicitly regularize the cell structure. In particular, the cell structure features of the reference pixel art serve as an auxiliary input to the pixelization process and as the basis for measuring the style similarity between the generated result and the reference pixel art. Furthermore, we disentangle the pixelization process into dedicated cell-aware and aliasing-aware stages, mitigating the ambiguities that arise when cell size, aliasing effect, and color assignment are learned jointly. To train our model, we construct a dedicated pixel art dataset and augment it with different cell sizes and different degrees of anti-aliasing. Extensive experiments demonstrate that our method outperforms state-of-the-art approaches in cell sharpness and perceptual expressiveness. We also show, for the first time, promising results on video game pixelization. Code and dataset are available at https://github.com/WuZongWei6/Pixelization.
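The two-stage, reference-conditioned design described above can be illustrated with a minimal sketch. The code below is a hypothetical PyTorch-style outline, not the released implementation: the module names, channel widths, and the way the reference cell-structure features and anti-aliasing level are injected are illustrative assumptions only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CellAwareStage(nn.Module):
    """Maps the input image to a regular cell grid, conditioned on
    cell-structure features extracted from a reference pixel art.
    (Illustrative sketch; not the paper's actual architecture.)"""

    def __init__(self, ch=64):
        super().__init__()
        self.encode = nn.Sequential(
            nn.Conv2d(3, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True))
        # Reference cell-structure features are injected by concatenation.
        self.fuse = nn.Conv2d(ch * 2, ch, 1)
        self.to_rgb = nn.Conv2d(ch, 3, 3, padding=1)

    def forward(self, x, ref_feat, cell_size):
        h = self.encode(x)
        ref = F.interpolate(ref_feat, size=h.shape[-2:], mode="nearest")
        h = self.fuse(torch.cat([h, ref], dim=1))
        # Pool to the target cell grid so each cell receives one color.
        grid_h, grid_w = x.shape[-2] // cell_size, x.shape[-1] // cell_size
        h = F.adaptive_avg_pool2d(h, (grid_h, grid_w))
        return self.to_rgb(h)


class AliasingAwareStage(nn.Module):
    """Refines the cell grid under a controllable anti-aliasing level.
    (Illustrative sketch.)"""

    def __init__(self, ch=64):
        super().__init__()
        self.refine = nn.Sequential(
            nn.Conv2d(3 + 1, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, 3, 3, padding=1))

    def forward(self, cells, aa_level):
        # Broadcast the scalar anti-aliasing level as an extra channel.
        aa = torch.full_like(cells[:, :1], aa_level)
        return self.refine(torch.cat([cells, aa], dim=1))


if __name__ == "__main__":
    img = torch.rand(1, 3, 256, 256)      # input image to pixelize
    ref_feat = torch.rand(1, 64, 32, 32)  # reference cell-structure features (assumed shape)
    cell_stage, aa_stage = CellAwareStage(), AliasingAwareStage()
    cells = cell_stage(img, ref_feat, cell_size=4)  # 64x64 cell grid
    pixel_art = aa_stage(cells, aa_level=0.5)
    print(pixel_art.shape)  # torch.Size([1, 3, 64, 64])
```

Separating the cell-assignment step from the aliasing step in this way keeps the cell size an explicit, controllable input rather than an entangled property the network must infer.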