
Google Translate has been updated to reduce gender bias in its translations



Google has long struggled with gender equality, from a workforce skewed toward white men to debates over whether its women employees are underpaid compared to men, and the company drew considerable publicity this year over sex-discrimination issues. Google has tried to be more inclusive through measures such as its Change the Game initiative, which encourages young women to engage with game design, and it is now taking another step toward reducing gender bias by changing how Google Translate translates. Because Translate learns from content already on the web, it tends to reproduce the gender-based assumptions embedded in that content. This is a problem in languages such as Spanish, Portuguese, French, and Italian, which have grammatically masculine and feminine nouns. When Google translates into one of these languages, it tends to pick a gender based on stereotypes.

For example, if you translated a word like "strong" or "doctor," Translate tended toward a masculine interpretation of the word. Conversely, translating a word like "sister" or "beautiful" would lead to a feminine interpretation. Translate now offers both a masculine and a feminine version of such terms as translations.

Gender-specific translations on the Google Translate website. Google

So, if you previously typed "o bir doktor" into Translate, it would translate the Turkish as "He is a doctor." Now you are presented with two options instead: "She is a doctor" and "He is a doctor." The feature is currently available only for some languages and only in the web version of Translate, but Google plans to roll it out to more languages and to other versions of the software, such as the iOS and Android apps. Google also mentions plans to address non-binary gender in translations at a later date.
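The behavior described above can be sketched as a small toy function: when the source sentence is gender-ambiguous (as with the gender-neutral Turkish pronoun "o"), return both a feminine and a masculine rendering rather than silently picking one. This is an illustrative sketch only; the lookup table and function names are invented and do not reflect Google's actual implementation or API.

```python
# Toy illustration of gender-ambiguous translation, NOT Google's API.
# A minimal lookup table of Turkish sentences whose third-person
# pronoun "o" is gender-neutral, mapped to both English renderings.
GENDER_AMBIGUOUS = {
    "o bir doktor": ("She is a doctor", "He is a doctor"),
    "o bir muhendis": ("She is an engineer", "He is an engineer"),
}

def translate(sentence: str) -> tuple:
    """Return (feminine, masculine) translations for a gender-ambiguous
    source sentence, or a one-element tuple if no entry is known."""
    key = sentence.strip().lower()
    if key in GENDER_AMBIGUOUS:
        return GENDER_AMBIGUOUS[key]
    # Fallback: no real translation engine here; echo the input.
    return (sentence,)

print(translate("o bir doktor"))
```

The key design point is that a single stereotyped output is replaced by an explicit pair, leaving the gender choice to the user instead of the model.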

Equality with respect to gender and race has been particularly difficult in systems that use machine learning, because these systems are trained on existing content whose creators are often not demographically representative. This has led, for example, to facial-recognition software that is less accurate on non-Caucasian faces, and to autocomplete search suggestions that were derogatory toward Black, Asian, and Latin American women. Hopefully this change to how Translate handles gender is a step toward reducing that tendency.
