Abstract

A naive approach to video stylization is to apply neural style transfer to each frame independently, but this produces a flickering effect that is particularly visible in static regions. Previous remedies extract optical flow from the video and use it to stabilize the stylized output; however, computing optical flow is complex and time-consuming. We consider stylizing videos in which the background is fixed and only the foreground object moves, as is the case in video calls. We propose a simple frame-difference-based method to stylize such videos in real time. The main idea is to use the frame difference to detect the foreground and re-stylize it in each new frame while retaining the stylized background from the previous frame. The method is easy to implement and produces temporally stable stylized video in real time.
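
As a rough illustration of the idea, the sketch below composites a freshly stylized foreground, located via a thresholded frame difference, over the stylized background carried forward from the previous frame. The `stylize` function, the difference threshold, and the mask dilation are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of frame-difference-based video stylization, assuming a
# per-frame style-transfer function `stylize` (hypothetical placeholder).
import cv2
import numpy as np

def stylize(frame: np.ndarray) -> np.ndarray:
    """Placeholder for any single-image neural style transfer model."""
    raise NotImplementedError

def stylize_video(capture: cv2.VideoCapture, diff_threshold: int = 25):
    ok, prev = capture.read()
    if not ok:
        return
    prev_stylized = stylize(prev)  # stylize the first frame fully
    yield prev_stylized
    kernel = np.ones((5, 5), np.uint8)
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        # Frame difference highlights pixels that changed, i.e. the
        # moving foreground (the background is assumed static).
        gray_prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
        gray_cur = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray_cur, gray_prev)
        mask = (diff > diff_threshold).astype(np.uint8)
        # Dilate so the mask covers the whole moving object rather than
        # just its edges (an assumed post-processing step).
        mask = cv2.dilate(mask, kernel, iterations=2).astype(bool)[..., None]
        # Rebuild the foreground with fresh stylization; keep the
        # stylized background from the previous frame so that static
        # regions do not flicker.
        stylized = stylize(frame)
        out = np.where(mask, stylized, prev_stylized)
        prev, prev_stylized = frame, out
        yield out
```

In this form the whole frame is still passed through the style network at every step; restricting stylization to the bounding box of the mask would be one way to approach the real-time goal, at the cost of handling seams at the crop boundary.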
