Abstract

This article uses computational text analysis to study the form and content of more than 3000 recommendation letters submitted on behalf of applicants to a major U.S. anesthesiology residency program. The article finds small differences in form and larger differences in content. Women applicants' letters were more likely to contain references to acts of service, for example, whereas men were more likely to be described in terms of their professionalism and technical skills. Some differences persisted when controlling for standardized aptitude test scores, on which women and men scored equally on average, and other applicant and letter‐writer characteristics. Even when all explicit gender‐identifying language was stripped from the letters, a machine learning algorithm was able to predict applicant gender at a rate better than chance. Gender stereotyped language in recommendation letters may infect the entirety of an employer's hiring or selection process, implicating Title VII of the Civil Rights Act of 1964. Not all gendered language differences were large, however, suggesting that small changes may remedy the problem. The article closes by proposing a computationally driven system that may help employers identify and eradicate bias, while also prompting a rethinking of our gendered, racialized, ableist, ageist, and otherwise stereotyped occupational archetypes.
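The abstract's claim that a classifier can predict applicant gender even after explicit gender markers are removed can be illustrated with a minimal sketch. The snippet below is not the study's actual method or data; it is a toy naive Bayes bag-of-words classifier trained on invented example sentences, with gendered pronouns and titles stripped first, showing how residual word-choice differences (e.g., "service" vs. "technical skills") can still carry a gender signal.

```python
import math
import re
from collections import Counter

# Hypothetical toy data -- NOT the study's letters. Labels: "W" = woman, "M" = man.
letters = [
    ("She volunteered for community service and helped organize events", "W"),
    ("Her service to patients and willingness to help others", "W"),
    ("His technical skills and professional judgment are outstanding", "M"),
    ("He demonstrated strong technical ability and professionalism", "M"),
]

# Explicit gender-identifying tokens to strip before training/prediction.
GENDERED = {"she", "he", "her", "his", "him", "hers", "ms", "mr", "woman", "man"}

def tokens(text):
    """Lowercase word tokens with explicit gendered terms removed."""
    return [w for w in re.findall(r"[a-z]+", text.lower()) if w not in GENDERED]

def train(data):
    """Per-class word counts and class frequencies for naive Bayes."""
    counts = {"W": Counter(), "M": Counter()}
    totals = Counter()
    for text, label in data:
        counts[label].update(tokens(text))
        totals[label] += 1
    return counts, totals

def predict(counts, totals, text):
    """Return the class with the higher log-probability (Laplace-smoothed)."""
    vocab = set(counts["W"]) | set(counts["M"])
    scores = {}
    for label in ("W", "M"):
        denom = sum(counts[label].values()) + len(vocab)
        score = math.log(totals[label] / sum(totals.values()))
        for w in tokens(text):
            score += math.log((counts[label][w] + 1) / denom)
        scores[label] = score
    return max(scores, key=scores.get)
```

Even with pronouns removed, stereotyped vocabulary alone lets the toy model separate the classes, which is the phenomenon the article reports at scale with a real machine learning algorithm.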
