Abstract
Recognition of human emotions is a basic requirement in many real‐time applications, and detecting emotions from speech provides information relevant to a variety of tasks. Several computational methods have been employed to analyze human emotions, but most previous approaches suffer from drawbacks such as degraded signal quality, high storage requirements, increased computational complexity, and poor classification accuracy. The proposed work was implemented to classify embedded emotions accurately while minimizing computational complexity using a modified deep duck and traveler recurrent neural network (MDDTRNN). The method comprises four steps: preprocessing, feature extraction, feature selection, and classification. In feature extraction, spectral and frequency features are extracted with a boosted Mel‐frequency cepstral coefficients (MFCC) method to improve training speed. In feature selection, the best features are chosen with an adaptive African vulture optimization algorithm (AAVOA). Classification is then performed by the MDDTRNN to produce the final emotion labels. The proposed work shows better classification outcomes than existing approaches, achieving 95.86% accuracy, 93.79% precision, 94.28% specificity, 92.89% sensitivity, and an error rate of 5.266 on the IEMOCAP dataset, and 96.27% accuracy, 94.83% precision, 93.16% specificity, 94% sensitivity, and an error rate of 4.982 on the EMODB dataset.
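The abstract does not describe the "boosted" MFCC variant in detail. As a rough illustration of what the underlying MFCC feature-extraction step involves (framing, windowing, power spectrum, mel filterbank, log, DCT), the following NumPy sketch computes plain, non-boosted MFCCs; the function name and all parameter values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def mfcc(signal, sr=16000, n_fft=512, hop=256, n_mels=26, n_coeffs=13):
    """Minimal MFCC sketch; parameters are illustrative, not from the paper."""
    # Frame the signal and apply a Hamming window to each frame
    frames = []
    for start in range(0, len(signal) - n_fft + 1, hop):
        frames.append(signal[start:start + n_fft] * np.hamming(n_fft))
    frames = np.array(frames)
    # Power spectrum of each windowed frame
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2 / n_fft
    # Triangular mel filterbank spanning 0 .. sr/2
    def hz_to_mel(f): return 2595.0 * np.log10(1.0 + f / 700.0)
    def mel_to_hz(m): return 700.0 * (10.0 ** (m / 2595.0) - 1.0)
    mel_pts = np.linspace(hz_to_mel(0), hz_to_mel(sr / 2), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fbank = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(1, n_mels + 1):
        lo, c, hi = bins[i - 1], bins[i], bins[i + 1]
        for k in range(lo, c):
            fbank[i - 1, k] = (k - lo) / max(c - lo, 1)
        for k in range(c, hi):
            fbank[i - 1, k] = (hi - k) / max(hi - c, 1)
    # Log filterbank energies (small epsilon avoids log(0))
    log_mel = np.log(power @ fbank.T + 1e-10)
    # DCT-II decorrelates the energies into cepstral coefficients
    n = np.arange(n_mels)
    dct = np.cos(np.pi * np.outer(np.arange(n_coeffs), (2 * n + 1) / (2 * n_mels)))
    return log_mel @ dct.T  # shape: (num_frames, n_coeffs)
```

In practice a library implementation (e.g. `librosa.feature.mfcc`) would normally be used; the sketch only makes explicit the spectral-to-cepstral pipeline that the feature-extraction step builds on.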
International Journal of Adaptive Control and Signal Processing