
Why is translating your text with AI tools not a good idea?

Artificial intelligence (AI) makes it very tempting to forgo the traditional translation process. After all, hiring a professional costs money, while using AI is free. And it can produce a seemingly perfect, human-like result. A no-brainer, then, isn’t it?

Well, to quote Spider-Man, 'not so fast, Osborne!'

Is it me, or is someone tripping here?

Ever heard of a hallucinating machine? Well, AI is very much capable of that, which makes it quite human, right? Perhaps, but you wouldn’t want anyone to work on your project under the influence of hallucinogenic substances, would you?

According to IBM, 'AI hallucinations are when a large language model (LLM) perceives patterns or objects that are non-existent, creating nonsensical or inaccurate outputs.' This means that AI could introduce utter nonsense into your text. On the surface, the output as a whole will make sense, so your team might overlook the error. But someone out there will spot it and share their findings with the rest of the world. And suddenly you’re the butt of the joke!

This might not seem like a big deal, but in some contexts it could even be life-threatening. Or at least cost you a pretty penny. And your reputation. Remember the failed Glasgow Willy Wonka Experience that was advertised with the help of AI? Or maybe you’ve seen a recent LinkedIn post from the Edinburgh-based interpreter Jonathan Downie? It seems a spelling bee would be too much for AI to handle...

A collage showing a magical-looking Willy Wonka Experience advertisement and what it looked like in reality: a bare, industrial-looking room with a tacky rainbow gateway and a few sad props. Jonathan's post reads: "How many ns in the word mayonnaise? Answer: The word "mayonnaise" has three "n"s. There's one "n" at the second position, another one at the fourth position, and the last one is at the seventh position."
AI expectations vs. AI reality, and Jonathan Downie's post about not-so-intelligent artificial intelligence
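For contrast, the letter-counting task that tripped up the chatbot in Jonathan Downie's post is trivial for a few lines of deterministic code. A quick sketch in Python (my own illustration, not part of the original post):

```python
# Counting letters deterministically: no hallucinations possible.
word = "mayonnaise"
n_count = word.count("n")
print(f'The word "{word}" contains {n_count} "n"s.')  # prints 2, not the 3 ChatGPT claimed
```

An LLM predicts plausible-sounding text rather than actually computing an answer, which is exactly why it can be so confidently wrong about something this simple.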

These Romans are crazy! – and so is AI translation

Asterix and Obelix wouldn’t make it far as international communicators, would they? I know, I know – they were being colonised so it’s only fair they weren’t too keen on Roman culture. But this is not Ancient Rome and we are not fighting off our enemies. We are attempting effective communication with international clients. And to do so, we need a big dollop of cultural awareness.

Here are a few examples: AI translating an English text into Polish or French might not know when to use the formal form of address in those languages. A machine translating marketing materials for a wedding show might not understand that referencing white dresses in China is inappropriate. And I’m pretty sure that if AI were translating a Polish text featuring the number 2137, it would not pick up the reference to John Paul II...

Want this translated? Roll with butter!

AI may be intelligent, but it’s still artificial – hey, it’s in the name! I ran a little experiment on this using ChatGPT and found that AI struggles to handle texts containing idioms and proverbs.

Conversation with ChatGPT, titled "Piece of Cake – Kawałek Ciasta":

Q: How to translate 'piece of cake' into Polish?
A: The English phrase "piece of cake" translates to "kawałek ciasta" in Polish.
Q: Jak przetłumaczyć na angielski "bułka z masłem"? [How do you translate "bułka z masłem" into English?]
A: "Bułka z masłem" translates to "roll with butter" in English.
Q: Translate this into Polish: "Running this race was a piece of cake for me."
A: "Ukończenie tego biegu było dla mnie bułką z masłem."
Q: Translate this into Polish: "Polish? Piece of cake."

The next exchange goes like this:

Q: Przetłumacz "gonić w piętkę". [Translate "gonić w piętkę".]
A: "Gonić w piętkę" translates to "to chase someone up" or "to chase someone's heels" in English. It means to hurry someone or to pursue them closely.
Q: Przetłumacz "w marcu jak w garncu". [Translate "w marcu jak w garncu".]
A: "W marcu jak w garncu" można przetłumaczyć na "In March, as in a pot." Oznacza to, że w marcu panują zmienną, nieprzewidywalną pogodę, podobnie jak w garnku, gdzie składniki mogą się mieszać i zmieniać w różnych kierunkach. ["W marcu jak w garncu" can be translated as "In March, as in a pot." It means that the weather in March is changeable and unpredictable, like a pot where ingredients can mix and change in different directions.]
Q: Translate "w marcu jak w garncu".
A: "In March as in a pot."
Q: Translate "kwiecień plecień, bo przeplata".
A: "April weaves because it intertwines."

Although ChatGPT managed to translate some idioms, the output was unreliable: sometimes it provided a literal translation, while other answers featured a correct equivalent of the idiom. This is dangerous, as getting some accurate responses can give us a false sense of security. Then, when something is mistranslated or rendered literally, we might overlook it.

There are boy jobs and girl jobs

I never thought I’d be quoting Theresa May on my website, but here we are... AI and machine translation are sexist and biased. A few years ago, we all talked about this in the context of Google Translate assigning masculine pronouns to Hungarian sentences describing prestigious professional work, and feminine ones to those describing domestic chores.

The screenshot from Google Translate shows a Hungarian text in which the pronoun can be translated as either male or female. The English translation reads as follows: "She is beautiful. He is clever. He reads. She washes the dishes. He builds. She sews. He teaches. She cooks. He's researching. She is raising children. He plays music. She's a cleaner. He is a politician. He makes a lot of money. She is baking a cake. He's a professor. She's an assistant. Go to hell, Google."
Image credit: Dora Vargha

I took ChatGPT for a spin around Polish, and at first I was disappointed: it seemed my assumption was wrong and AI could provide nuanced replies when it comes to gender. Well, NOT SO FAST, OSBORNE!

Yes, it did give me both masculine and feminine forms when I asked for translations of the words 'nurse' and 'doctor'. But when I used both in a sentence, the doctor instantly became male and the nurse female.

Conversation with ChatGPT, titled "Nurse in Polish: pielęgniarka":

Q: How to say 'nurse' in Polish?
A: "Nurse" in Polish is "pielęgniarka" (for a female nurse) or "pielęgniarz" (for a male nurse).
Q: How to say 'doctor' in Polish?
A: "Doctor" in Polish is "lekarz" for a male doctor and "lekarka" for a female doctor.
Q: How to say 'president' in Polish?
A: "President" in Polish is "prezydent".
Q: How to say 'prime minister' in Polish?
A: "Prime minister" in Polish is "premier".
Q: How to say 'beautician' in Polish?
A: "Beautician" in Polish is "kosmetyczka".
Q: Translate this into Polish: "Both presidents and their spouses entered the building."
A: In Polish, it would be: "Obaj prezydenci oraz ich małżonkowie weszli do budynku."

The next exchange goes like this:

Q: Translate this into Polish: "The doctor asked the nurse to pass the knife."
A: In Polish, it would be: "Lekarz poprosił pielęgniarkę, aby podała mu nóż."

But there's more. According to ChatGPT, the terms 'president' and 'prime minister' don’t have feminine versions. And the gender-neutral 'spouse' is translated into a masculine form. As an inclusive translator, I do not like this one bit!

You wouldn’t steal a car

Remember those anti-piracy DVD adverts (which, interestingly, were only ever seen by people who had obtained said DVD legally)? They were meant to spread awareness of copyright and intellectual property. But did you know that by using AI you’re extremely likely to infringe those very laws?

Black background with white text that reads: "you wouldn't steal a car"

AI is trained on input provided by users and on materials available online. But the fact that a text, or part of it, has been published online doesn’t mean that anyone can just reproduce it in their own work. There’s even a fancy word for that: stealing.

You’re most likely not even aware that your latest super-creative-fancy-as-hell slogan that AI helped you come up with (let’s be real, it did the heavy lifting) was actually used by a competitor years ago. And is subject to copyright. And you’re breaking the law. And… you get the point.

Hold my drink (with a paper straw)!

AI may be free to use, but it costs the Earth. According to MIT Technology Review, training a single AI model can emit as much carbon as five cars do over their lifetimes. Isn’t it ironic how companies constantly compete against each other in the ‘sustainability Olympics’ yet turn to AI without a thought for the planet? Think about that next time you’re proudly slurping your vegan milkshake through a paper straw.

But sustainability is not the only thing to worry about when it comes to AI; human rights are also at stake. According to Amnesty International, 'From predictive policing tools, to automated systems used in public sector decision-making to determine who can access healthcare and social assistance, to monitoring the movement of migrants and refugees, AI has flagrantly and consistently undermined the human rights of the most marginalised in society. Other forms of AI, such as fraud detection algorithms, have also disproportionately impacted ethnic minorities, who have endured devastating financial problems as Amnesty International has already documented, while facial recognition technology has been used by the police and security forces to target racialised communities and entrench Israel’s system of apartheid.'

Ugh, stop being so dramatic, Alicja!

I understand that technological progress is impossible to halt, and I’m not proposing we stop using AI completely. But we need to be aware that AI won’t solve every problem and can, in fact, create new ones. Especially since even the people who develop AI don’t fully know how it works.

Next time you’re considering giving AI a job you’d usually hire a translator to do, opt for a qualified human.

That way you can be sure that:

idioms and cultural references are rendered properly,

the finished product is free of unnecessary biases or AI hallucinations,

and that you won’t get sued for breaching copyright.

And it’s a much more environmentally friendly way of working!

