Abstract

The growing convergence of Human-Computer Interaction and speech synthesis is redefining the way humans interact with technology by exploring several facets of interaction and incorporating them into synthesized speech. One such facet is laughter, an important form of interaction that expresses emotions such as humor, joy, and amusement, as well as sarcasm. The most significant excitation in a laughter signal occurs at the instants of glottal closure; these instants, known as epochs, together with the instantaneous fundamental frequency, are used to derive the excitation source information through a method known as Zero Frequency Filtering. In the existing literature, modifying the characteristics of the excitation source information, along with the linear prediction (LP) residual and the prosody (i.e., the pitch and duration of the signal), forms the basic step in synthesizing laughter calls. To emulate naturalness, these modifications are guided by first examining the features of natural laughter. Since a laughter signal is made up of laughter calls, this paper proposes a method to generate a laughter signal by concatenating laughter calls using a mass-spring model. With the mass-spring model, a fair Comparison Mean Opinion Score (CMOS) is achieved, making the synthesized laughter sound more natural. The synthesized laughter can be used to improve the perceived quality of synthesized speech in Human-Computer Interaction.
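The abstract refers to two technical components: Zero Frequency Filtering (ZFF) for locating epochs (glottal closure instants), and a mass-spring model for assembling laughter calls into a bout. The following Python sketch illustrates the standard ZFF recipe only; the function name, window length, and number of trend-removal passes are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def zero_frequency_filter(speech, fs, win_ms=10.0):
    """Sketch of zero-frequency filtering for epoch detection.

    `win_ms` is an assumed trend-removal window near one average pitch
    period; it is not a value reported in the paper.
    """
    # Difference the signal to remove any DC offset in the recording.
    x = np.diff(speech, prepend=speech[0])

    # Cascade of two ideal zero-frequency resonators (double integration).
    y = x.astype(float)
    for _ in range(2):
        acc = np.zeros_like(y)
        for n in range(2, len(y)):
            acc[n] = y[n] + 2.0 * acc[n - 1] - acc[n - 2]
        y = acc

    # Trend removal: subtract the local mean over ~one pitch period,
    # repeated a few times to suppress the integrator's polynomial growth.
    half = int(win_ms * 1e-3 * fs) // 2
    kernel = np.ones(2 * half + 1) / (2 * half + 1)
    for _ in range(3):
        y = y - np.convolve(y, kernel, mode="same")

    # Positive-going zero crossings of the filtered signal mark the epochs.
    epochs = np.where((y[:-1] < 0) & (y[1:] >= 0))[0]
    return y, epochs
```

As one plausible reading of the mass-spring idea (the abstract does not spell out its formulation), a damped mass-spring response could serve as an amplitude envelope that scales successive calls before concatenation; the natural frequency `omega_n` and damping ratio `zeta` below are purely illustrative parameters, not figures from the paper.

```python
import numpy as np

def damped_spring_envelope(t, omega_n=2 * np.pi * 5.0, zeta=0.15, amp=1.0):
    """Hypothetical damped mass-spring response used as an amplitude envelope."""
    omega_d = omega_n * np.sqrt(1.0 - zeta ** 2)  # damped natural frequency
    return amp * np.exp(-zeta * omega_n * t) * np.cos(omega_d * t)
```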
