Gender and bias in Amazon review translations: by humans, MT systems and ChatGPT

Maja Popovic, Ekaterina Lapshinova-Koltunski


Abstract
This paper presents an analysis of first-person gender in five different translation variants of Amazon product reviews: those produced by professional translators, by translation students, with different machine translation (MT) systems, and with ChatGPT. The analysis revealed that the majority of the reviews were translated into the masculine first-person gender, by both humans and machines. Further inspection revealed that the choice of gender in a translation is not related to the actual gender of the translator. Finally, the analysis of different products showed certain bias tendencies, as the distribution of genders differs notably across products.
Anthology ID:
2024.gitt-1.3
Volume:
Proceedings of the 2nd International Workshop on Gender-Inclusive Translation Technologies
Month:
June
Year:
2024
Address:
Sheffield, United Kingdom
Editors:
Beatrice Savoldi, Janiça Hackenbuchner, Luisa Bentivogli, Joke Daems, Eva Vanmassenhove, Jasmijn Bastings
Venues:
GITT | WS
Publisher:
European Association for Machine Translation (EAMT)
Pages:
22–30
URL:
https://aclanthology.org/2024.gitt-1.3
Cite (ACL):
Maja Popovic and Ekaterina Lapshinova-Koltunski. 2024. Gender and bias in Amazon review translations: by humans, MT systems and ChatGPT. In Proceedings of the 2nd International Workshop on Gender-Inclusive Translation Technologies, pages 22–30, Sheffield, United Kingdom. European Association for Machine Translation (EAMT).
Cite (Informal):
Gender and bias in Amazon review translations: by humans, MT systems and ChatGPT (Popovic & Lapshinova-Koltunski, GITT-WS 2024)
PDF:
https://aclanthology.org/2024.gitt-1.3.pdf