Abstract

To improve the acquisition of human skeletal joint data, we propose a method for collecting such data and judging whether a motion meets a given standard. Kinect is a 3D somatosensory camera released by Microsoft. It carries three lenses: in the middle is a color camera that captures color images at 30 frames per second; on one side is an infrared projector that casts a speckle pattern onto the scene; and on the other side is an infrared depth camera that reads this pattern to recover depth. Together, the projector and depth camera on either side form the depth sensor that detects the relative position of people in front of the device. On both sides of the Kinect, a four-element linear microphone array supports speech recognition, background-noise filtering, and sound-source localization, and a motorized base underneath adjusts the elevation angle. The device can therefore capture color images as well as measure the depth information of objects. In the experiments, we evaluate the method on the MSRAction3D dataset and compare it with recent methods under the same cross-validation protocol, as shown in the figures. Our method (algorithm 10) achieves the second-best highest recognition rate, while its lowest and average recognition rates are the best among the compared methods. The improvement in the lowest recognition rate is particularly clear, indicating that the method offers good recognition performance and better stability than the other approaches. Kinect thus plays an important role in the acquisition of human skeletal joint motion data.
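
The abstract does not describe the recognition pipeline itself, so the sketch below is only illustrative. It assumes skeleton frames exported in an MSRAction3D-style text layout (20 joints per frame, one "x y z confidence" line per joint; this layout is an assumption) and builds a simple scale-normalized pairwise joint-distance descriptor for each frame. The function names and the example file name are hypothetical and are not the paper's implementation.

# Minimal sketch (not the paper's algorithm): load MSRAction3D-style skeleton
# frames and compute a pairwise joint-distance descriptor per frame.
import numpy as np

NUM_JOINTS = 20  # assumed joint count per frame in MSRAction3D skeleton files

def load_skeleton_frames(path):
    """Read a skeleton text file into an array of shape (frames, joints, 3)."""
    values = np.loadtxt(path)          # assumed format: one joint per line, "x y z confidence"
    values = values[:, :3]             # keep only the 3D coordinates
    n_frames = values.shape[0] // NUM_JOINTS
    return values[: n_frames * NUM_JOINTS].reshape(n_frames, NUM_JOINTS, 3)

def pairwise_distance_descriptor(frame):
    """Describe one pose by the distances between every pair of joints,
    normalized by the largest distance so the feature is scale-invariant."""
    diffs = frame[:, None, :] - frame[None, :, :]      # (20, 20, 3) joint differences
    dists = np.linalg.norm(diffs, axis=-1)             # (20, 20) distance matrix
    iu = np.triu_indices(NUM_JOINTS, k=1)              # upper triangle: 190 unique pairs
    feat = dists[iu]
    scale = feat.max()
    return feat / scale if scale > 0 else feat

if __name__ == "__main__":
    # "a01_s01_e01_skeleton.txt" is a hypothetical file name, used only for illustration.
    frames = load_skeleton_frames("a01_s01_e01_skeleton.txt")
    descriptors = np.stack([pairwise_distance_descriptor(f) for f in frames])
    print(descriptors.shape)           # (n_frames, 190) pairwise-distance features

Such per-frame descriptors could then be fed to any classifier under the cross-validation protocol mentioned above; the choice of descriptor here is purely a placeholder for whatever features the paper actually uses.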
