Abstract
In an era where the demand for efficient, practical machine learning (ML) solutions on resource-constrained devices is ever-growing, tiny machine learning (TinyML) has emerged as a promising frontier. Motivated by the need for lightweight, low-power models that can be deployed on edge devices, this paper presents a TinyML model tailored to recognize Arabic hand gestures executed in mid-air. Focusing on the classification of Arabic numbers traced by these hand movements, the paper describes a complete dataflow architecture that processes accelerometer and gyroscope data to derive 2D gesture coordinates, a fundamental component of the recognition process. The core of the proposed model is a Convolutional Neural Network (CNN), which achieves 93.8% accuracy in classifying the Arabic number gestures. This level of precision underscores the model's efficacy and robustness, making it a strong candidate for real-time deployment in gesture recognition scenarios.
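To make the described pipeline concrete, the sketch below shows one plausible way such a system could be structured: a compact CNN that classifies fixed-length sequences of derived 2D gesture coordinates into digit classes, followed by conversion to TensorFlow Lite for microcontroller deployment. This is a minimal illustration only; the layer sizes, sequence length, class count, and 1D-convolutional formulation are assumptions, not the architecture reported in the paper (which may, for example, rasterize the trajectory and use a 2D CNN).

```python
# Hypothetical sketch of a TinyML-style gesture classifier.
# Assumptions (not from the paper): 10 digit classes, gestures resampled
# to 64 time steps of (x, y) coordinates, and a small 1D CNN.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 10   # Arabic digits 0-9 (assumption)
SEQ_LEN = 64       # gesture resampled to 64 time steps (assumption)

def build_model():
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(SEQ_LEN, 2)),       # (x, y) per time step
        tf.keras.layers.Conv1D(16, 5, activation="relu"),
        tf.keras.layers.MaxPooling1D(2),
        tf.keras.layers.Conv1D(32, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

model = build_model()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Placeholder data standing in for derived 2D gesture coordinates.
x = np.random.rand(256, SEQ_LEN, 2).astype("float32")
y = np.random.randint(0, NUM_CLASSES, size=(256,))
model.fit(x, y, epochs=1, batch_size=32, verbose=0)

# Convert to TensorFlow Lite with default optimizations so the model
# is small enough for a resource-constrained edge device.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
print(f"TinyML model size: {len(tflite_model)} bytes")
```

In a TinyML setting, the resulting flatbuffer would typically be further quantized and embedded in firmware (e.g., via TensorFlow Lite for Microcontrollers); those deployment details are likewise outside what the abstract specifies.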