
Why is WeChat translating the Canadian flag emoji into “He’s in prison”?

Abacus

Published December 18, 2019, 00:12 • Karen Chiu

A glitch in the translation tool for Tencent’s messaging app generates unusual results

Emoji transcend languages, so they usually don't require translation. But WeChat will do it for you anyway if you're typing in Chinese, with some pretty bizarre results.

Tencent's popular messaging app comes with a built-in translator: tap on any Chinese message and you get an instantaneous English translation. Most of the time it works fine, but things get strange when you throw emoji into the mix.

Emoji by themselves don't translate into anything. But when one accompanies a Chinese message, the translator sometimes maps it to a specific word or phrase.

Flag emoji give some especially strange results:

  • Flag of Canada 🇨🇦: He's in prison
  • Flag of Bosnia & Herzegovina 🇧🇦: He's in a coma
  • Flag of Afghanistan 🇦🇫: In the middle of nowhere
  • Flag of Indonesia 🇮🇩: I'm so sorry
  • Flag of Thailand 🇹🇭: Oh, no

Some flag emoji bring up seemingly random names:

  • Flag of Vatican City 🇻🇦: Hey, Irwin?
  • Flag of Sweden 🇸🇪: Hey, Lennox.
  • Flag of Singapore 🇸🇬: Hey, Lenny!
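
One detail worth knowing here (context we're adding, not from the article): a flag emoji is not a single character. It is a pair of Unicode "regional indicator" code points that spell a two-letter country code, which is the raw form a text pipeline actually sees. The short Python sketch below decodes the flags above back to their country codes:

```python
# Flag emoji are pairs of Unicode "regional indicator" code points
# spelling a two-letter country code. This decodes a flag back to
# the code a translation system actually receives as input.

REGIONAL_INDICATOR_A = 0x1F1E6  # U+1F1E6 REGIONAL INDICATOR SYMBOL LETTER A

def decompose_flag(flag: str) -> str:
    """Map a flag emoji to its country code, e.g. 🇨🇦 -> 'CA'."""
    letters = []
    for ch in flag:
        offset = ord(ch) - REGIONAL_INDICATOR_A
        if 0 <= offset < 26:  # only regional indicator symbols qualify
            letters.append(chr(ord("A") + offset))
    return "".join(letters)

for flag in ["🇨🇦", "🇧🇦", "🇦🇫", "🇮🇩", "🇹🇭"]:
    print(flag, "->", decompose_flag(flag))
# 🇨🇦 -> CA, 🇧🇦 -> BA, 🇦🇫 -> AF, 🇮🇩 -> ID, 🇹🇭 -> TH
```

To a model that was never trained on them, these code-point pairs are just rare, unfamiliar tokens.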

In a statement to Abacus, WeChat said it's taking immediate action to fix the translation bug, which was first discovered by Twitter users.

"We thank users for flagging this matter and apologize for any inconvenience caused," it said.

Weird thing (bug?) in WeChat.. try it out: type in chinese and the Canadian flag emoticon, then translate it. pic.twitter.com/wRd19RXJAU

" James Hull (@jameshullx) December 17, 2019

WeChat's translation function appears to be based on machine learning. In a similar translation glitch earlier this year involving celebrity names, WeChat explained that the error arose because the system wasn't trained on certain English words. WeChat also apologized in 2017 when it was discovered that "black foreigner" in Chinese was translated as a racial slur.

With machine learning, a system learns by reading a large amount of text in one language and comparing it with the corresponding translation in another language. Since the system is trained on full sentences rather than individual words, it should ideally result in more accurate and natural translations.
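
A toy sketch can show where this breaks down. The snippet below is illustrative only: WeChat has not disclosed its system, the parallel corpus is invented, and a simple co-occurrence table stands in for a real neural model. The failure mode it demonstrates is the same, though: a token never seen in training has no learned mapping, so its output is effectively undefined.

```python
# Toy sketch of data-driven translation (not WeChat's actual system):
# a co-occurrence table stands in for a neural model. Tokens unseen
# in training have no learned mapping, so their output is undefined.

from collections import defaultdict

# Hypothetical parallel corpus of (Chinese sentence, English translation).
parallel_corpus = [
    ("他在監獄裡", "He's in prison"),
    ("他昏迷了", "He's in a coma"),
    ("我很抱歉", "I'm so sorry"),
]

# Count how often each source character co-occurs with each target
# sentence. A real system learns far richer statistics, but the gap
# for unseen tokens is the same.
counts = defaultdict(lambda: defaultdict(int))
for src, tgt in parallel_corpus:
    for token in src:
        counts[token][tgt] += 1

def translate_token(token: str) -> str:
    """Return the target sentence most associated with a source token."""
    candidates = counts.get(token)
    if not candidates:
        return "<unseen token: output is whatever the model guesses>"
    return max(candidates, key=candidates.get)

print(translate_token("獄"))   # seen in training -> "He's in prison"
print(translate_token("🇨🇦"))  # never seen -> undefined behavior
```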

But that's not always the case. Last year, users on Reddit and elsewhere discovered that when Google Translate was asked to translate strings of nonsense words and phrases, it spewed out gibberish that sounded like ominous religious prophecy. It turned out that Google Translate was trained partly on religious texts, and it fell back on them when it got confused.

Copyright (c) 2019. South China Morning Post Publishers Ltd. All rights reserved.
