Abstract

This paper presents the results of theoretical and practical studies aimed at automatic, accurate, real-time activation and adjustment of shape-changing robots according to the shape of the user's body. The proposed method consists of scanning the user, classifying the scanned instance by gender and size, analysing both the user's body and the garment to be virtually fitted, modelling them, extracting measurements and assigning reference points, segmenting the 3D visual data imported from the shape-changing robot, and finally superimposing, adapting, and rendering the resulting garment model on the user's body. To adjust the shape-changing robot, the positions of its moving actuators are estimated by constructing a mathematical map between the actuator inputs and the resulting sizes and distances, and then finding the input values that reproduce the desired measurements most closely. To classify the data obtained by the 3D scanner, a maximum-likelihood function first selects which shape-changing robot to activate, according to the presumed gender and size, and a support vector machine then determines which shape template in the dictionary best matches the scanned instance. As a use case, the proposed method is applied to visual data obtained by scanning Fits.me's shape-changing robots with a 3D laser scanner. Whereas the methods currently in use are manual, the proposed method is automatic, and the experimental results show that it is accurate and reliable.
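As a rough illustration of the two-stage classification and the actuator-position estimation outlined above, the sketch below assumes measurement vectors extracted from the scans, per-robot Gaussian likelihood models, and a linear actuator-to-measurement map fitted by least squares; all function names, data shapes, and the choice of an RBF-kernel SVM are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the pipeline described in the abstract:
# (1) maximum-likelihood selection of the shape-changing robot,
# (2) SVM matching against a dictionary of shape templates,
# (3) least-squares estimation of actuator positions from target measurements.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.svm import SVC


def select_robot(scan_features, robot_models):
    """Stage 1: pick the robot (presumed gender/size group) whose assumed
    Gaussian model gives the highest log-likelihood for the scan features."""
    log_liks = {
        name: multivariate_normal(mean=m, cov=c, allow_singular=True).logpdf(scan_features)
        for name, (m, c) in robot_models.items()
    }
    return max(log_liks, key=log_liks.get)


def match_template(scan_features, template_features, template_labels):
    """Stage 2: an SVM trained on the template dictionary returns the
    shape template that best matches the scanned instance."""
    clf = SVC(kernel="rbf", gamma="scale")
    clf.fit(template_features, template_labels)
    return clf.predict(scan_features.reshape(1, -1))[0]


def estimate_actuator_positions(target_measurements, positions_log, measurements_log):
    """Stage 3: fit a linear map M with measurements ~= positions @ M from
    logged (position, measurement) pairs, then invert it with the
    pseudoinverse to get actuator inputs for the desired measurements.
    This linear map is an assumed stand-in for the paper's mapping."""
    M, *_ = np.linalg.lstsq(positions_log, measurements_log, rcond=None)
    return target_measurements @ np.linalg.pinv(M)
```

Under these assumptions, the pseudoinverse step returns the actuator inputs whose predicted girths and distances are closest, in the least-squares sense, to the target measurements, mirroring the abstract's goal of finding the inputs that give the closest representation of the desired sizes and distances.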
