Abstract
The sign language recognition system plays an important role in the lives of disabled people. Researchers have applied various methods to meet these users' requirements; however, existing methods fail to capture signs at a reasonable cost and with minimal computational difficulty, and improper sign capture reduces recognition accuracy. Therefore, this study uses the sensors of wearable devices to capture users' motions and actions to identify sign language. The collected information is fused into a representation that captures disabled users' requirements, and the fused information is then processed by the Pareto Optimized Hypertuned Deep Elman Neural Model (POHDENM). In addition, data augmentation is incorporated during training, which helps the model recognize a large volume of sentences. The fused information is split into words, which the neural model processes to extract sign language features. The extracted features are analyzed by the optimized model, which recognizes the language with maximum accuracy. During this analysis, the network hyperparameters are tuned using the Pareto model, which reduces the misrecognition error rate. The efficiency of the resulting system is then evaluated through experimental analysis. The POHDENM approach achieves a high accuracy of 97.86% and an F1-score of 98.08%. This performance is obtained by fine-tuning the model's hyperparameters with the Pareto optimization algorithm, which balances a precision of 98.06% against a recall of 98.12%.
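A minimal sketch of the two components named above, assuming PyTorch; the class name, layer sizes, hyperparameter candidates, and validation scores are all illustrative assumptions, not the authors' implementation. PyTorch's nn.RNN with a tanh nonlinearity realizes the classic Elman recurrence h_t = tanh(W x_t + U h_{t-1} + b), and a simple non-dominated filter over (precision, recall) pairs stands in for the Pareto-based hyperparameter selection the abstract describes.

```python
# Sketch only: ElmanSignClassifier and the trial grid are hypothetical.
import torch
import torch.nn as nn

class ElmanSignClassifier(nn.Module):
    """Elman RNN over a sequence of fused sensor features, one sign class out."""
    def __init__(self, n_features: int, hidden: int, n_classes: int):
        super().__init__()
        # nn.RNN with tanh is the classic Elman network.
        self.rnn = nn.RNN(n_features, hidden, batch_first=True, nonlinearity="tanh")
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                 # x: (batch, time, n_features)
        _, h_n = self.rnn(x)              # h_n: (1, batch, hidden)
        return self.head(h_n.squeeze(0))  # logits: (batch, n_classes)

def pareto_front(candidates):
    """Keep hyperparameter settings not dominated on (precision, recall)."""
    return [
        c for c in candidates
        if not any(
            o["precision"] >= c["precision"] and o["recall"] >= c["recall"]
            and (o["precision"], o["recall"]) != (c["precision"], c["recall"])
            for o in candidates
        )
    ]

# Hypothetical validation scores for three hidden-size/learning-rate settings.
trials = [
    {"hidden": 64,  "lr": 1e-3, "precision": 0.974, "recall": 0.969},
    {"hidden": 128, "lr": 1e-3, "precision": 0.981, "recall": 0.981},
    {"hidden": 128, "lr": 1e-4, "precision": 0.979, "recall": 0.975},
]
print(pareto_front(trials))  # only the non-dominated setting(s) survive

model = ElmanSignClassifier(n_features=12, hidden=128, n_classes=50)
logits = model(torch.randn(4, 30, 12))   # 4 sequences, 30 time steps each
```

Selecting only non-dominated settings lets the final hyperparameter choice trade precision against recall explicitly, which is the balance the reported 98.06% precision and 98.12% recall reflect.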