Abstract
Public speaking is an essential skill in a wide range of professions as well as in everyday life, yet it can be difficult to master. This paper focuses on the automatic assessment of nonverbal facial behavior during public speaking and proposes simple and efficient methods for head pose estimation and motion analysis. The authors collected nine and six speech videos from a recitation contest and an oration contest, respectively, held at a Japanese high school, and applied the proposed methods to evaluate the contestants' performance. For estimating head pose from speech videos, their method produced results with an acceptable level of accuracy. The proposed motion analysis method can be used to calculate the frequencies and moving ranges of head motion. The authors found that the proposed parameters are strongly correlated with the eye-contact score and that the frequency and moving-range parameters are suitable for evaluating public speaking. On the basis of these features, a teacher can therefore provide accurate feedback to help a speaker improve.
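The abstract does not specify how the frequency and moving-range parameters are computed, so the following is only a minimal illustrative sketch, assuming a per-frame head yaw-angle series (in degrees) extracted by some pose estimator and a known frame rate. The function name, the use of the spectral peak for frequency, and the max-minus-min definition of moving range are all assumptions for illustration, not the authors' exact procedure.

```python
# Illustrative sketch (not the authors' exact method): estimate two descriptive
# parameters of head motion, analogous to those named in the abstract, from a
# per-frame head yaw-angle series taken from a speech video.
import numpy as np

def head_motion_parameters(yaw_deg: np.ndarray, fps: float) -> tuple[float, float]:
    """Return (moving_range_deg, dominant_frequency_hz) for a yaw time series."""
    yaw_deg = np.asarray(yaw_deg, dtype=float)
    # Moving range: total span of the head angle over the speech (assumed definition).
    moving_range = float(yaw_deg.max() - yaw_deg.min())
    # Dominant frequency: strongest non-DC peak of the mean-centred signal's spectrum.
    centred = yaw_deg - yaw_deg.mean()
    spectrum = np.abs(np.fft.rfft(centred))
    freqs = np.fft.rfftfreq(len(centred), d=1.0 / fps)
    dominant_freq = float(freqs[1:][np.argmax(spectrum[1:])]) if len(freqs) > 1 else 0.0
    return moving_range, dominant_freq

# Usage example: a slow left-right head sweep at ~0.5 Hz sampled at 30 fps.
t = np.arange(0, 10, 1 / 30)
yaw = 15 * np.sin(2 * np.pi * 0.5 * t)
print(head_motion_parameters(yaw, fps=30))  # approximately (30.0, 0.5)
```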