If you're like most people, you've laughed at your share of cringe-worthy translations. Translation apps constantly leave us thinking, "Well, I guess that's close enough," and the results are usually funny and a little ridiculous.
WeChat is a messenger app popular in China, with almost 1 billion (yup, with a "b") monthly active users. One user, Ann James, is an American living in Shanghai. While using the app, she was shocked to find a translation that was decidedly racist. Some context: if you were in a group chat with friends and someone asked, "Which Jessica do you mean? We know more than one," you might respond with "the Spanish one" to identify which one you meant. Similarly, Chinese speakers use the phrase "hei laowai," which means "black foreigner."
She posted about the error, which was corrected within a day. WeChat apologized, blaming its artificial intelligence and algorithms for the mistranslation, but the story got more interesting when a magazine decided to do some testing. That magazine tried the app out and found that it translated the phrase to "black foreigner" without an issue... unless you were saying something negative. Then you got the racial slur. Also, when it produced the racial slur, you no longer saw which service had translated it, whereas most other times that was labeled clearly under the message.
Fine example of the state of home grown AI in China, no doubt.— Wai Sing-Rin (@waisingrin) October 11, 2017
BBC=SJW— Ron Nicholas 🇺🇸 (@realronnicholas) October 13, 2017
If I choose to say nigger I shouldn't get arrested for it
The term 老外 lǎowài can already have a negative connotation when used to refer to foreigners.— Anita Huang (@HLaoshi) October 11, 2017
Don't see what this is wrong since it's just a literal translate.— Will Zhang (@esvhd) October 12, 2017
WfF! China, you too? Its not as offensive just b/c they had nothing to do with slavery.. but still offensive. But... is "chink" fair now?— Fly_N_Fancy (@CuteYvette) October 13, 2017