Abstract

Cultural items (e.g., songs, books, and movies) play an important role in creating and reinforcing stereotypes, but the actual content of such items is often less transparent. Take songs, for example: are lyrics biased against women, and how have any such biases changed over time? Natural language processing of a quarter of a million songs quantifies gender bias in music over the last 50 years. Women are less likely to be associated with desirable traits (i.e., competence), and while this bias has decreased, it persists. Ancillary analyses further suggest that song lyrics may contribute to shifts in collective attitudes and stereotypes toward women, and that lyrical shifts are driven by male artists (as female artists were less biased to begin with). Overall, these results shed light on cultural evolution, subtle measures of bias and discrimination, and how natural language processing and machine learning can provide deeper insight into stereotypes, cultural change, and a range of psychological questions more generally. (PsycInfo Database Record (c) 2023 APA, all rights reserved).
