
Update: Google Translate and Gender Bias

For my shorter blog essay, I wrote a post called Machine Translation and Gender Bias, in which I discussed evidence of gender bias in Google Translate. On December 6, 2018, an article highly relevant to that post was published on the Google Blog, and I wanted to write this follow-up post to discuss its content and implications.

The article, titled ‘Reducing gender bias in Google Translate’, was written by James Kuczmarski, Product Manager at Google Translate. It details a new development in the service that aims to “[address] gender bias by providing feminine and masculine translations for some gender-neutral words”. When translating a single word such as ‘surgeon’ from English into French, Italian, Portuguese or Spanish, or when translating phrases from Turkish into English, the article claims that both the feminine and masculine translations will be displayed. The German example in my original blog post is thus still not covered by this new feature, but Kuczmarski states that Google Translate plans to cover more languages in the future. (Source: Reducing gender bias in Google Translate)

I decided to test the new feature out in another language that I study, Spanish. Firstly, I entered the term ‘secretary’: 

Google Translate did indeed provide both the masculine and feminine forms of the noun in Spanish. However, when I added ‘the’ before the noun, it became clear that the service had not fully solved the gender bias issue:

When the definite article is placed before the noun, the translation still automatically becomes ‘la secretaria’, the feminine form. 
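For anyone who would like to repeat this test programmatically, below is a minimal sketch using the unofficial googletrans Python package (an assumption on my part: it is not Google's official API, and live results will drift as the service changes). Note that the dual feminine/masculine display is a feature of the web interface; a plain API call returns only a single default form, which is exactly where the bias becomes visible.

```python
# Minimal sketch: reproducing the 'secretary' test via the unofficial
# googletrans package (pip install googletrans; the 3.x releases expose
# the synchronous API used here). This is not Google's official API,
# and output may differ from the screenshots above over time.
from googletrans import Translator

translator = Translator()

for phrase in ["secretary", "the secretary"]:
    result = translator.translate(phrase, src="en", dest="es")
    print(f"{phrase!r} -> {result.text!r}")

# At the time of writing, the web interface showed both forms for the
# bare noun, while "the secretary" still defaulted to "la secretaria".
```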

It is great to see that efforts are being made to reduce gender bias on Google Translate. However, given the limited number of languages covered by the new system, and given that a quick test of my own revealed an obvious flaw (the feature does not function when the definite article is included), it is clear that there is still much progress to be made.



WL4102 Shorter Blog Essay: Machine Translation and Gender Bias

For my shorter blog essay, I decided to look at bias in machine translation with a particular focus on Google Translate and gender. How does Google Translate work? Where can we see gender bias and why does this occur? How does this link to linguistic corpora? These are some questions upon which I hope to reflect in this blog post.

Google Translate was launched in 2006 with the goal of “[breaking] language barriers and [making] the world more accessible”. With over 500 million users and over 100 languages available, the service was translating more than 100 billion words per day by 2016. These extremely high numbers show what a fundamental part Google Translate plays in translation for many people around the world. The service is an example of Machine Translation (MT), that is, translation from one natural language to another using a computer. (Source: Ten years of Google Translate)

Google Translate was originally based on statistical machine translation. Phrases were first translated into English and then cross-referenced with databases of documents from the United Nations and the European Parliament. These databases are corpora, e.g. the ‘European Parliament Proceedings Parallel Corpus 1996-2011’. Although the translations produced were not flawless, the broad meaning could be conveyed. In November 2016, it was announced that the service would change to neural machine translation, which would mean translating whole sentences at a time and using more linguistic resources. The aim was to ensure greater accuracy, as more context would be available. The translations produced through this service are compared repeatedly, which allows Google Translate to decipher patterns between words in different languages. (Source: Google Translate: How does the search giant’s multilingual interpreter actually work?)
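To make the idea of a parallel corpus concrete, here is a minimal sketch (my own illustration, not Google's code) of how such a corpus can be read. The Europarl corpus is distributed as pairs of plain-text files in which line n of one language's file is the translation of line n of the other's; the file names below assume the German-English v7 release from statmt.org has been downloaded.

```python
# Minimal sketch: reading aligned sentence pairs from the Europarl
# parallel corpus. The files are line-aligned: line n of the German
# file is the translation of line n of the English file.
# Assumes europarl-v7.de-en.de / .en downloaded from statmt.org.

def read_parallel(src_path, tgt_path, limit=5):
    with open(src_path, encoding="utf-8") as src_file, \
         open(tgt_path, encoding="utf-8") as tgt_file:
        for i, (src, tgt) in enumerate(zip(src_file, tgt_file)):
            if i >= limit:
                break
            yield src.strip(), tgt.strip()

for de, en in read_parallel("europarl-v7.de-en.de", "europarl-v7.de-en.en"):
    print(f"DE: {de}\nEN: {en}\n")
```

Statistical systems learned phrase correspondences by counting how often phrases appeared aligned with one another across millions of such sentence pairs.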

Although accuracy has improved, one issue that remains is that of gender. Machine Translation: Analyzing Gender, a case study published in 2013 as part of Gendered Innovations at Stanford University, shows that translations between a language with fewer gender inflections (such as English) and a language with more gender inflections (such as German) tend to display a male default, meaning the nouns are shown in the male form and male pronouns are used, even when the text specifically refers to a female. The case study also shows, however, that this male default is overridden when the noun refers to something considered stereotypically female, such as ‘nurse’. I tested gender bias myself on Google Translate by translating the grammatically gender-neutral term ‘the secretary’ from English to German. As can be seen in the photo below, ‘the secretary’ by itself translates automatically to ‘die Sekretärin’ (the secretary, female), but when combined with ‘of state’ it translates automatically to ‘der Staatssekretär’ (the secretary of state, male). This shows a very obvious bias in terms of gender roles. (Sources: Machine Translation | Gendered Innovations; Google Translate’s Gender Problem (And Bing Translate’s, And Systran’s))

The above example shows a simple word with the definite article and gives no other context. Therefore, the machine had to rely on frequency and chose the gender most often used in translations of this word in the corpora upon which it is based. In those corpora, ‘the secretary’ is more frequently shown to be female and ‘the secretary of state’ is more frequently shown to be male. Although Google Translate has moved away from solely statistical translation to neural machine translation, it still displays issues stemming from statistical methods. This 2017 article by Mashable features further examples. In it, a Google spokesperson is quoted as saying over email, “Translate works by learning patterns from many millions of examples of translations seen out on the web. Unfortunately, some of those patterns can lead to translations we’re not happy with. We’re actively researching how to mitigate these effects; these are unsolved problems in computer science, and ones we’re working hard to address.” (Sources: Machine Translation | Gendered Innovations; Google Translate might have a gender problem)
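As a toy illustration of the frequency mechanism described above (my own simplified sketch with invented counts, not Google's actual model), a purely statistical system that tracks how often each target phrase was seen aligned with a source phrase would simply pick the most frequent option, so any gender imbalance in the corpus becomes the default translation:

```python
from collections import Counter

# Toy phrase table: hypothetical counts of Spanish phrases observed
# aligned with each English phrase in a parallel corpus. The numbers
# are invented purely to illustrate the mechanism.
phrase_table = {
    "the secretary": Counter({"la secretaria": 950, "el secretario": 310}),
    "the secretary of state": Counter({"el secretario de estado": 720,
                                       "la secretaria de estado": 90}),
}

def translate(phrase):
    # A frequency-only system returns the most common aligned target
    # phrase, regardless of the gender of the actual referent.
    return phrase_table[phrase].most_common(1)[0][0]

print(translate("the secretary"))           # -> la secretaria (female default)
print(translate("the secretary of state"))  # -> el secretario de estado (male default)
```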

The current algorithms behind Google Translate, which rely on available corpora to translate phrases, give rise to this issue. As corpora show language as it is used, one would hope that the gender balance within them will improve as the world develops more awareness of gender-related issues and moves closer to achieving gender balance; this could then, in turn, have an effect on the translations. The Machine Translation: Analyzing Gender case study suggests a solution that would entail reforming the computer science and engineering curriculum to include “sophisticated methods of sex and gender analysis”, and concludes with a reflection on the complexity of this issue and on the need for new algorithms, understandings and tools. Until then, this issue, as shown by my quick test above, is ongoing. (Source: Machine Translation | Gendered Innovations)

For more content relating to corpora and gender, please see my main blog essay.

Update – December 8, 2018: I have written another blog post relating to this short essay in light of an article published two days ago on Google Blog. You can read this post here.

