Abstract

Deaf and mute individuals face unique communication and social challenges that make it difficult to express their thoughts, needs, and ideas. Understanding their behavior is essential to protecting them and helping them integrate into society. This study addresses the critical need for behavioral analysis of deaf and mute people and introduces the Automatic Behavioral Analysis Employing Gesture Detection Framework (ABA-GDF). Gesture detection technology has gained popularity recently, largely because of its ability to overcome communication barriers and illuminate nonverbal communication. Current methods face several challenges, including limited accuracy and adaptability. The ABA-GDF architecture comprises three phases: dataset collection, modeling, and deployment. The data collection phase gathers hand signals used by deaf and mute people. The material is then processed to segment and normalize the hand region for consistent analysis. During modeling, feature descriptors are computed to extract relevant motion information, and a classifier learns from and predicts on the resulting feature vectors, enabling the framework to recognize and interpret gestures and actions. Large-scale simulations of ABA-GDF showed promising results: the framework achieved 92% gesture recognition accuracy on the dataset, and its ability to interpret non-verbal messages demonstrates its robustness. Compared with earlier methods, it reduced false positives by 15%, demonstrating its real-world usefulness.
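The pipeline the abstract outlines, hand-region segmentation and normalization, feature-descriptor extraction, and classification over feature vectors, can be sketched roughly as below. This is a minimal illustrative assumption, not the paper's actual method: the intensity-threshold segmenter, the gradient-orientation descriptor, and the nearest-neighbour classifier are stand-ins for whatever ABA-GDF uses in each phase.

```python
import numpy as np

def segment_and_normalize(frame, threshold=0.5, size=(32, 32)):
    """Segment the hand region by simple intensity thresholding,
    crop to its bounding box, and resample to a fixed size."""
    mask = frame > threshold
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:                      # no hand detected
        return np.zeros(size)
    crop = frame[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    # Crude nearest-neighbour resize to a canonical resolution.
    ri = np.linspace(0, crop.shape[0] - 1, size[0]).astype(int)
    ci = np.linspace(0, crop.shape[1] - 1, size[1]).astype(int)
    patch = crop[np.ix_(ri, ci)]
    return patch / (patch.max() + 1e-8)   # normalize intensities

def gradient_histogram(patch, bins=9):
    """Magnitude-weighted histogram of gradient orientations,
    used here as the feature descriptor."""
    gy, gx = np.gradient(patch)
    angles = np.arctan2(gy, gx)
    mags = np.hypot(gx, gy)
    hist, _ = np.histogram(angles, bins=bins,
                           range=(-np.pi, np.pi), weights=mags)
    return hist / (np.linalg.norm(hist) + 1e-8)

def predict(features, train_X, train_y):
    """1-nearest-neighbour classifier over stored feature vectors."""
    dists = np.linalg.norm(train_X - features, axis=1)
    return train_y[int(np.argmin(dists))]
```

In a deployment, `segment_and_normalize` would run per frame, the descriptor would feed a trained classifier, and the predicted label would map to a gesture or behavior category.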
