Examining and mitigating gender bias in text emotion detection task

Odbal Odbal, Guanhong Zhang, Sophia Ananiadou

Research output: Contribution to journal › Article › peer-review


Gender bias is an important problem affecting natural language models, and the propagation of such biases can be harmful. Much research focuses on gender bias in word embeddings, and some work addresses gender bias in downstream tasks. However, very little prior work has examined gender issues in emotion detection. In this paper, we investigate the effect of gender in text emotion detection. Existing debiasing methods require gender-balanced or gender-swapped data, and may degrade performance on the target task by removing information correlated with sensitive attributes. We present different solutions for measuring and mitigating gender bias in emotion detection. To measure gender bias, we first prepare datasets annotated with emotion classes and gender information. We then compare the performance of emotion recognition models trained on gender-balanced samples, and also analyze gender prediction results on emotion-related data. Our experimental results show that gender bias exists in emotion detection: models trained on female data often achieve better results than male models, and the female and male models show opposite trends in recognizing some emotions. We also attempt to mitigate gender bias through several approaches, including products of experts, weighted and focal-loss variants, and adversarial training. Compared to other debiasing methods, adversarial training yields a TPR (true positive rate) gap reduction of approximately 0.02–0.03 while harming task performance by less than 1.0 point on our prepared datasets. Further, we show that parameter-efficient methods can lead to additional improvements.
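Among the mitigation approaches the abstract names are weighted and focal-loss variants. As a minimal illustration (not the paper's exact formulation), the standard binary focal loss of Lin et al. down-weights easy, well-classified examples via a focusing parameter gamma, with an optional class weight alpha; the function below is a self-contained sketch, and its parameter names are illustrative:

```python
import math

def focal_loss(p, y, gamma=2.0, alpha=None):
    """Binary focal loss for a single example.

    p:     predicted probability of the positive class (0 < p < 1).
    y:     true label, 0 or 1.
    gamma: focusing parameter; gamma=0 recovers plain cross-entropy.
    alpha: optional weight for the positive class (1 - alpha for negative),
           which lets rare classes or groups count more in training.
    """
    # Probability assigned to the true class.
    p_t = p if y == 1 else 1.0 - p
    # Optional class weighting.
    weight = 1.0 if alpha is None else (alpha if y == 1 else 1.0 - alpha)
    # (1 - p_t)^gamma shrinks the loss of confident, correct predictions.
    return -weight * (1.0 - p_t) ** gamma * math.log(p_t)

# A confidently correct example contributes far less with gamma=2
# than under plain cross-entropy (gamma=0).
easy_ce = focal_loss(0.9, 1, gamma=0.0)   # cross-entropy
easy_fl = focal_loss(0.9, 1, gamma=2.0)   # focal loss
```

With gamma=0 and alpha=None the function is exactly binary cross-entropy, so the focal term can be seen as a per-example reweighting of the usual loss.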
Original language: English
Pages (from-to): 422-434
Number of pages: 13
Early online date: 14 Apr 2022
Publication status: Published - 7 Jul 2022


  • Adversarial training
  • Bias examination
  • Debiasing
  • Gender bias
  • Text emotion detection


