Abstract

With the advent of new technology, traditional job interviews have increasingly been supplemented by asynchronous video interviews (AVIs). However, research on the psychometric properties of AVIs is limited. In this study, 710 participants completed a mock AVI, responding to eight personality questions targeting Extraversion and Conscientiousness. We collected self- and observer reports of personality, interview performance ratings, attractiveness ratings, and AVI meta-information (e.g., professional attire, audio quality). We then automatically extracted the words, facial expressions, and voice characteristics from the videos and trained machine learning models to predict the personality traits and interview performance. Our algorithm explained substantially more variance in observer reports of Extraversion and Conscientiousness (average R² = 0.32) and in interview performance (R² = 0.44) than in self-reported Extraversion and Conscientiousness (average R² = 0.12). Consistent with Trait Activation Theory, the explained variance in personality traits increased when participants responded to trait-relevant rather than trait-irrelevant questions. Our algorithm's scores were somewhat stable over a period of seven months, but test–retest reliability fell below the standards desired in personnel selection. We examined potential sources of bias, including age, gender, and attractiveness, and found some instances of algorithmic bias (e.g., gender differences were often amplified in favor of women).
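To make the modeling setup concrete, the sketch below illustrates the general approach the abstract describes: concatenating automatically extracted multimodal features and regressing observer ratings on them, evaluated via cross-validated R². Everything here is an assumption for illustration only; the paper does not specify its feature pipeline or learner, the random arrays merely stand in for the extracted word, facial-expression, and voice features, and scikit-learn's ridge regression is one plausible choice rather than the authors' method.

```python
# Hypothetical sketch of the prediction setup: multimodal features -> 
# observer-rated trait, scored with cross-validated R^2. Not the
# authors' actual pipeline.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Stand-ins for the extracted features: word, facial-expression, and
# voice descriptors concatenated per participant (710 in the study).
n_participants, n_features = 710, 200
X = rng.normal(size=(n_participants, n_features))
y = rng.normal(size=n_participants)  # stand-in for observer-rated Extraversion

# Regularized linear regression is a common choice for this kind of
# high-dimensional, modest-sample prediction problem (assumed here).
model = make_pipeline(StandardScaler(), Ridge(alpha=1.0))

# Cross-validated R^2, the metric the abstract reports (e.g., 0.32 for
# observer-rated traits, 0.44 for interview performance).
r2_scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"mean cross-validated R^2: {r2_scores.mean():.2f}")
```

With the random stand-in data the printed R² will hover around zero; the point of the sketch is the structure (features, learner, cross-validated R²), not the numbers.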
