Abstract

Despite progress toward gender equality in the labor market over the past few decades, gender segregation in labor force composition and labor market outcomes persists. Evidence has shown that job advertisements may express gender preferences, which can selectively attract potential candidates to a given post and thus reinforce gendered labor force composition and outcomes. Removing gender-explicit words from job advertisements does not fully solve the problem, because certain implicit traits are more closely associated with men, such as ambitiousness, while others are more closely associated with women, such as considerateness. Neutral alternatives for these traits are not always available, making it hard to search for candidates with desired characteristics without entailing gender discrimination. Existing algorithms mainly focus on detecting gender bias in job advertisements without offering a solution for how the text should be (re)worded. To address this problem, we propose an algorithm that evaluates gender bias in the input text and guides debiasing by offering alternative wording that is closely related to the original input. Our proposed method promises broad application in the human resources process, from the development of job advertisements to algorithm-assisted screening of job applications.

Highlights

  • Despite progress toward gender equality at work in recent years, gender segregation in the composition of the labor force remains, and clear gender differences in labor market outcomes persist (Bertrand, 2020; England et al., 2020)

  • Both gender-definite words and seemingly gender-neutral attribute words have been shown to signal gender preferences in job posts (Bem and Bem, 1973; Born and Taris, 2010)

  • Bias detection and evaluation in job text are usually done by targeting specific words that are more commonly associated with one gender: for example, "ambitious" is usually considered masculine and "considerate" feminine, even though both words can describe people of any gender (see the sketch after this list)
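
As a concrete illustration of this word-targeting approach, the sketch below checks a job ad against small masculine- and feminine-coded word lists and aggregates the hits into a simple text-level score. The word lists, function names, and scoring rule are illustrative assumptions, not the lexicon or formula used in this paper.

```python
import re

# Illustrative word lists only; real tools rely on much larger curated lexicons.
MASCULINE_CODED = {"ambitious", "assertive", "competitive", "dominant"}
FEMININE_CODED = {"considerate", "supportive", "nurturing", "cooperative"}

def coded_words(ad_text):
    """Return the masculine- and feminine-coded words found in an ad."""
    tokens = set(re.findall(r"[a-z']+", ad_text.lower()))
    return sorted(tokens & MASCULINE_CODED), sorted(tokens & FEMININE_CODED)

def text_bias(ad_text):
    """Net bias in [-1, 1]; positive values mean masculine-leaning wording."""
    masc, fem = coded_words(ad_text)
    total = len(masc) + len(fem)
    return (len(masc) - len(fem)) / total if total else 0.0

ad = "We seek an ambitious, competitive engineer who is also considerate."
print(coded_words(ad))  # (['ambitious', 'competitive'], ['considerate'])
print(text_bias(ad))    # 0.33...: the wording leans masculine
```

A detector of this kind flags coded words but offers no guidance on rewording, which is precisely the gap the proposed algorithm aims to fill.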

Summary

INTRODUCTION

Despite progress toward gender equality at work in recent years, gender segregation in the composition of the labor force remains, and clear gender differences in labor market outcomes persist (Bertrand, 2020; England et al., 2020). Bias detection and evaluation in job text are usually done by targeting specific words that are more commonly associated with one gender: for example, "ambitious" is usually considered masculine and "considerate" feminine, even though both words can describe people of any gender. The goal of this research is to remove the gender stereotypes that machine learning models perceive in gender-neutral words and to decouple gender information from semantic information, so that attributes are not incorrectly associated with a gender because of stereotypes present in the training corpus. This procedure allows the models to make predictions free of gender stereotypes.
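
One widely used way to make this decoupling concrete, sketched below on toy vectors, is to estimate a gender direction from a definitional word pair and project it out of each word embedding. The single "he"/"she" pair, the random 50-dimensional vectors, and the function names are assumptions for illustration, not the implementation proposed in the paper.

```python
import numpy as np

def gender_direction(emb):
    """Unit vector from one definitional pair; real systems average many pairs."""
    g = emb["he"] - emb["she"]
    return g / np.linalg.norm(g)

def word_bias(word, emb, g):
    """Projection onto the gender direction: values far from zero suggest a
    learned gender association, even for a gender-neutral word."""
    v = emb[word]
    return float(v @ g / np.linalg.norm(v))

def decouple(word, emb, g):
    """Remove the gender component, keeping the semantic remainder."""
    v = emb[word]
    return v - (v @ g) * g

rng = np.random.default_rng(0)
emb = {w: rng.normal(size=50) for w in ("he", "she", "ambitious")}
g = gender_direction(emb)
print(word_bias("ambitious", emb, g))            # nonzero in the raw space
print(float(decouple("ambitious", emb, g) @ g))  # ~0 after decoupling
```

With embeddings trained on a biased corpus (e.g., word2vec or GloVe), seemingly neutral attribute words acquire a nonzero projection on the gender direction; removing that component lets downstream models judge job text on semantics alone.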

Gender Bias in Job Advertisement
Bias Evaluation at the Text Level
METHODOLOGY
Quantifying Gender Bias by Words
Gender Bias Score at the Text Level
Bias Compensation
Constrained Density Fusion
Estimate Importance Weight
APPLICATION
Bias Score
Debiasing
DISCUSSION
Findings
DATA AVAILABILITY STATEMENT