Abstract

Deep learning has gained popularity across many domains, including object recognition and classification. In a Convolutional Neural Network (CNN), the early layers extract low-level features such as edges and colour, while deeper layers combine these into progressively more complex features such as textures, shapes, and object parts. Skeleton features, by contrast, encode significant locations (joints) that do not naturally align with the grid-like structure these networks are designed for. This study emphasizes the importance of such structural features in improving deep learning performance and introduces the Gesture Analysis Module Network (GAMNet), which computes abstract structural values within the architecture for feature extraction, prioritization, and classification. These values are evaluated alongside a CNN backbone to produce intermediate representations, leading to better performance in gesture analysis. An automated dance gesture recognition system must cope with unpredictable lighting, varied backgrounds, noise, and changing camera angles. Despite these challenges, GAMNet performed remarkably well, surpassing established models such as VGGNet, ResNet, EfficientNet, and a plain CNN, achieving a classification accuracy of 96.80% even under difficult imaging conditions. This paper highlights how GAMNet can advance the automated analysis of classical Indian dance, opening new opportunities for research and development in this field.
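The abstract does not publish GAMNet's implementation, but the core idea of combining joint-based structural values with CNN features can be sketched as follows. This is a minimal, hypothetical illustration: the function names, the choice of pairwise joint distances as the "structural values", and the simple concatenation fusion are all assumptions, not the paper's actual method.

```python
import numpy as np

def structural_features(joints):
    """Compute scale- and translation-invariant pairwise joint distances.

    joints: (J, 2) array of (x, y) skeleton joint coordinates.
    Returns a flat vector of normalized pairwise distances
    (a hypothetical stand-in for GAMNet's structural values).
    """
    joints = np.asarray(joints, dtype=float)
    # Center on the mean joint position and normalize by the overall spread,
    # making the features invariant to translation and scale.
    centered = joints - joints.mean(axis=0)
    scale = np.linalg.norm(centered) or 1.0
    centered /= scale
    # Pairwise Euclidean distances between all joints.
    diffs = centered[:, None, :] - centered[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    # Keep only the upper triangle (each pair once).
    iu = np.triu_indices(len(joints), k=1)
    return dists[iu]

def fuse(cnn_features, joints):
    """Concatenate CNN image features with structural joint features,
    forming a single intermediate representation for a classifier."""
    return np.concatenate([np.ravel(cnn_features), structural_features(joints)])

# Example: 5 joints and a dummy 8-dimensional CNN feature vector.
feats = fuse(np.random.rand(8), np.random.rand(5, 2))
print(feats.shape)  # 8 image features + C(5, 2) = 10 distances -> (18,)
```

A classifier head (e.g. a small fully connected network) would then be trained on this fused vector; the invariance of the distance features is what lets the structural signal survive changes in camera position and subject scale.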
