Abstract

3D data is becoming increasingly popular and accessible for computer vision tasks. A popular format for 3D data is the mesh format, which can depict a 3D surface accurately and cost-effectively by connecting points in (x, y, z) space, known as vertices, into triangles that can be combined to approximate geometrical surfaces. However, mesh objects are not suitable for standard deep learning techniques due to their non-Euclidean structure. We present an algorithm which predicts the sex, age, and body mass index of a subject based on a 3D scan of their face and neck. This algorithm relies on an automatic pre-processing technique, which renders and captures the 3D scan from eight different angles around the x-axis in the form of 2D images and depth maps. Subsequently, the generated data is used to train three convolutional neural networks, each with a ResNet18 architecture, to learn a mapping between the set of 16 images per subject (eight 2D images and eight depth maps from different angles) and their demographics. For age and body mass index, we achieved a mean absolute error of 7.77 years and 4.04 kg/m² on the respective test sets, while Pearson correlation coefficients of 0.76 and 0.80 were obtained, respectively. The prediction of sex yielded an accuracy of 93%. The developed framework serves as a proof of concept for prediction of more clinically relevant variables based on 3D craniofacial scans stored in mesh objects.
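
The following is a minimal sketch (PyTorch assumed; the abstract does not name a framework) of the three-network setup described above: one ResNet18 per target variable (sex, age, body mass index), each consuming the 16 per-subject inputs (eight rendered 2D images and eight depth maps). Stacking the 16 views as input channels is one plausible fusion strategy; the authors' exact input arrangement is not stated in the abstract.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18


def make_resnet18(in_channels: int, out_features: int) -> nn.Module:
    """Build a ResNet18 whose first convolution accepts `in_channels`
    single-channel views and whose final layer emits `out_features` values."""
    model = resnet18(weights=None)
    # Replace the default 3-channel RGB stem with one accepting the stacked views
    # (an assumption; the paper may fuse views differently).
    model.conv1 = nn.Conv2d(in_channels, 64, kernel_size=7, stride=2,
                            padding=3, bias=False)
    model.fc = nn.Linear(model.fc.in_features, out_features)
    return model


# Three independent networks, as described in the abstract.
sex_net = make_resnet18(in_channels=16, out_features=2)  # classification (male/female)
age_net = make_resnet18(in_channels=16, out_features=1)  # regression (years)
bmi_net = make_resnet18(in_channels=16, out_features=1)  # regression (kg/m²)

# Forward pass on a dummy batch: 4 subjects, 16 stacked 224x224 views each.
x = torch.randn(4, 16, 224, 224)
sex_logits = sex_net(x)  # shape (4, 2)
age_pred = age_net(x)    # shape (4, 1)
bmi_pred = bmi_net(x)    # shape (4, 1)
```

In practice the classification head would be trained with a cross-entropy loss and the two regression heads with an L1 or L2 loss, which is consistent with the mean absolute error reported for age and body mass index.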
