Abstract

This paper presents a novel system for the personalized design of footwear using commercial depth-sensing technologies. The system allows users to virtually try on 3D shoe models in a live video stream. A two-stage object-tracking algorithm was developed to keep the shoe models correctly aligned with the moving feet during the try-on process. Tracking was driven by an iterative closest point (ICP) algorithm that registered the captured depth data to predefined reference foot models. Test data showed that the two-stage approach yielded higher positional accuracy than tracking by surface registration alone, and that trimming the reference model according to the instantaneous view angle improved the computational efficiency of the ICP algorithm. The developed virtual try-on function is an effective tool for realizing human-centered design, and this study also demonstrates a new application of RGB-D cameras in product design.
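The abstract does not give implementation details, but the core registration step it describes can be illustrated with a minimal point-to-point ICP sketch: repeatedly match each captured depth point to its nearest reference-model point, then solve for the rigid transform that best aligns the matches (Kabsch/SVD). This is a generic textbook ICP under assumed conditions (brute-force nearest neighbours, synthetic point clouds), not the authors' actual tracking pipeline.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # correct an improper rotation (reflection)
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(source, reference, iters=50, tol=1e-9):
    """Align `source` (e.g. captured depth points) to `reference`
    (e.g. a reference foot model). Brute-force nearest-neighbour
    correspondences; returns the aligned points and the final mean
    squared correspondence error."""
    src = source.copy()
    prev_err = np.inf
    err = prev_err
    for _ in range(iters):
        # nearest reference point for each source point
        d2 = ((src[:, None, :] - reference[None, :, :]) ** 2).sum(-1)
        nn = reference[d2.argmin(axis=1)]
        R, t = best_rigid_transform(src, nn)
        src = src @ R.T + t
        err = ((src - nn) ** 2).sum(-1).mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return src, err

# Hypothetical usage: recover a small rigid displacement of a point cloud.
rng = np.random.default_rng(0)
ref = rng.standard_normal((60, 3))
th = np.deg2rad(8.0)
R0 = np.array([[np.cos(th), -np.sin(th), 0.0],
               [np.sin(th),  np.cos(th), 0.0],
               [0.0,         0.0,        1.0]])
src = ref @ R0.T + np.array([0.05, -0.02, 0.03])
aligned, err = icp(src, ref)
```

The "trimming" optimization mentioned in the abstract would, in this sketch, amount to pre-filtering `reference` to only those model points visible from the current view angle before the nearest-neighbour search, shrinking the dominant O(N·M) matching cost.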
