Intersectional Race-Gender Stereotypes in Natural Language

Research output: Contribution to journal › Article › peer-review

Abstract

How are Asian and Black men and women stereotyped? Research from the gendered race and stereotype content perspectives has produced mixed empirical findings. Using BERT models pre-trained on English-language books, news articles, Wikipedia, Reddit, and Twitter, together with a new method for measuring propositions in natural language (the Fill-Mask Association Test, FMAT), we explored the gender (masculinity–femininity), physical strength, warmth, and competence contents of stereotypes about Asian and Black men and women. We found that Asian men (but not Asian women) are stereotyped as less masculine and less moral/trustworthy than Black men. Compared to Black men and Black women, respectively, both Asian men and Asian women are stereotyped as less muscular/athletic and less assertive/dominant, but as more sociable/friendly and more capable/intelligent. These findings suggest that Asian and Black stereotypes in natural language have multifaceted contents and gender nuances, requiring a balanced view that integrates gender schema theory and the stereotype content model. By exploring their semantic representations as propositions in large language models, this research reveals how intersectional race–gender stereotypes are naturally expressed in real life.
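In broad terms, the FMAT measures propositions by comparing the probabilities a masked language model assigns to contrasting attribute words placed inside a sentence template. Below is a minimal illustrative sketch of that general idea, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint; the sentence template and the attribute pair are hypothetical placeholders, not the study's published query set or the authors' implementation.

```python
from transformers import pipeline

# Illustrative sketch only: compare masked-token probabilities for a
# contrasting attribute pair inside a propositional sentence template.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
mask = fill_mask.tokenizer.mask_token

template = f"Asian men are {mask}."      # hypothetical template
attributes = ["masculine", "feminine"]   # hypothetical attribute pair

# Restrict the fill-mask candidates to the two attribute words and
# print the probability the model assigns to each completion.
for result in fill_mask(template, targets=attributes):
    print(f"{result['token_str']:>10}  p = {result['score']:.6f}")
```

Repeating such queries across templates and target groups (e.g., "Black men are …") and contrasting the resulting probabilities is the general logic the FMAT formalizes.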
Original language: English
Pages (from-to): 1771-1786
Number of pages: 16
Journal: British Journal of Social Psychology
Volume: 63
Issue number: 4
DOIs
Publication status: Published - 1 Oct 2024

Keywords

  • gender
  • intersectionality
  • race
  • stereotypes
  • natural language processing
  • large language models

Research Beacons, Institutes and Platforms

  • Global inequalities
  • Manchester China Institute
