Abstract
Recent developments in mmWave technology allow the detection and classification of dynamic arm gestures. However, achieving high accuracy and good generalization requires a large number of samples to train a machine learning model. Furthermore, capturing the variability within each gesture class requires many subjects performing many gestures at different arm speeds. For macro-gestures, the position of the subject must also vary within the field of view of the device. This demands significant time and effort, which must be repeated whenever the sensor hardware or the modulation parameters are modified. To reduce the required manual effort, we developed a synthetic data generator capable of simulating seven arm gestures using Blender, an open-source 3D creation suite. We used it to generate 600 artificial samples with varying execution speed and relative position of the simulated subject, and trained a machine learning model on them. We tested the model on a real dataset recorded from ten subjects using an experimental sensor. The test set yielded 84.2% accuracy, indicating that synthetic data generation can contribute significantly to the pre-training of a model.
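The sim-to-real evaluation protocol described above (train on synthetic samples only, test on real recordings) can be sketched as follows. This is a hypothetical illustration, not the paper's code: the feature dimensionality, the real-sample count, the stand-in Gaussian data, and the choice of a random-forest classifier are all assumptions for the sake of a runnable example.

```python
# Hypothetical sketch of the paper's protocol: fit a classifier on synthetic
# data only, then measure accuracy on a held-out "real" set. All data here is
# randomly generated as a stand-in; in the paper, synthetic features come from
# Blender-simulated gestures and real features from an experimental mmWave sensor.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
N_GESTURES = 7              # seven arm gestures, as in the paper
N_SYNTH = 600               # 600 synthetic training samples, as in the paper
N_REAL = 140                # real-sample count is illustrative (assumed)
N_FEATURES = 32             # e.g. flattened range-Doppler statistics (assumed)

# Stand-in features: class-dependent means, with slightly larger noise on the
# "real" side to mimic the synthetic-to-real domain shift.
y_synth = rng.integers(0, N_GESTURES, N_SYNTH)
X_synth = rng.normal(y_synth[:, None], 1.0, (N_SYNTH, N_FEATURES))
y_real = rng.integers(0, N_GESTURES, N_REAL)
X_real = rng.normal(y_real[:, None], 1.2, (N_REAL, N_FEATURES))

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_synth, y_synth)                          # train on synthetic only
acc = accuracy_score(y_real, clf.predict(X_real))  # evaluate on real data
print(f"real-set accuracy: {acc:.3f}")
```

With real data in place of the stand-in arrays, the reported accuracy corresponds to the paper's 84.2% test-set figure; the synthetic-only training step is what removes the need to record large labeled datasets by hand.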