Abstract

Virtual makeup systems let users try on makeup remotely, without wasting product or spending time on cleanup. Some virtual makeup systems simply show the final makeup result on the user's face, while others allow some interaction when applying makeup to a photo. In this paper we introduce an augmented reality system that lets users apply virtual makeup directly to their face with a physical applicator, simulating a virtual-mirror experience. Facial features are detected and tracked with an RGBD camera and mapped to a normalized 2D facial mesh composed of 124 triangles. Finger touches on the face are also detected in the RGBD video stream and used to store the applied makeup's texture representation on the corresponding 2D facial triangle. The face is rendered with the virtual makeup by back-projecting the makeup stored on the facial mesh onto the image captured by the camera. Our initial prototype demonstrates the feasibility of the technique: it detects touches accurately (error of about 2.2 mm) and achieves real-time interactive performance (about 15 fps) when tracking and rendering makeup on a regular PC with an Intel RealSense RGBD camera.
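The core idea in the abstract — locating a detected touch inside a triangle of the normalized 2D facial mesh and storing the applied makeup on that triangle — can be sketched with barycentric coordinates. The following is a minimal illustration in Python/NumPy, not the paper's implementation: the names (`MakeupMesh`, `apply_touch`, `locate`) and the per-triangle texture-patch layout are assumptions for the sake of the example, since the abstract does not specify data structures.

```python
import numpy as np

def barycentric(p, a, b, c):
    """Barycentric coordinates of 2D point p w.r.t. triangle (a, b, c)."""
    v0, v1, v2 = b - a, c - a, p - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d20, d21 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    v = (d11 * d20 - d01 * d21) / denom
    w = (d00 * d21 - d01 * d20) / denom
    return np.array([1.0 - v - w, v, w])

class MakeupMesh:
    """Hypothetical normalized 2D facial mesh; each triangle owns a
    small RGBA texture patch that accumulates applied makeup."""

    def __init__(self, vertices, triangles, patch=8):
        self.vertices = vertices      # (V, 2) normalized mesh coordinates
        self.triangles = triangles    # (T, 3) vertex indices
        self.patch = patch
        # one RGBA patch per triangle
        self.textures = np.zeros((len(triangles), patch, patch, 4))

    def locate(self, p):
        """Return (triangle index, barycentric coords) for point p,
        or (None, None) if p falls outside the mesh."""
        for t, (i, j, k) in enumerate(self.triangles):
            bary = barycentric(p, self.vertices[i],
                               self.vertices[j], self.vertices[k])
            if np.all(bary >= -1e-9):   # inside (or on the edge of) triangle t
                return t, bary
        return None, None

    def apply_touch(self, p, color, alpha=0.5):
        """Store makeup color at the touch point's position in the mesh."""
        t, bary = self.locate(p)
        if t is None:
            return
        # barycentric coords index into the triangle's texture patch
        u = int(bary[1] * (self.patch - 1))
        v = int(bary[2] * (self.patch - 1))
        self.textures[t, v, u, :3] = color
        self.textures[t, v, u, 3] = alpha
```

Rendering would then run this mapping in reverse: for each camera pixel covered by a mesh triangle, look up the stored makeup via the pixel's barycentric coordinates and blend it over the captured image.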
