Copyright ©2017 Guacamoley. All rights reserved.
Popular Chinese Messenger App Mistakenly Translates 'Black Foreigner' To The N-Word

If you're like most people, you've laughed at your share of cringe-worthy translations. Translation apps constantly leave us with a feeling of, "Well, I guess that's close enough," but the results are usually funny and a little ridiculous.

One app has found itself in hot water for a translation that goes well beyond ridiculous.

WeChat is a messenger app popular in China, with almost 1 billion (yup, with a "b") monthly active users. One user, Ann James, is an American living in Shanghai. While using the app, she was shocked to find a translation that was decidedly racist. If you were in a group chat with friends and someone asked, "Which Jessica do you mean? We know more than one," you might respond with "the Spanish one" to identify which one you meant. Similarly, Chinese has a phrase, "hei laowai," which means "black foreigner."

Ann translated a message containing the phrase, but got a much less flattering translation than "black foreigner":

[Screenshot via That's]

She posted about the translation error, which was corrected within a day. WeChat apologized, blaming the mistake on its artificial intelligence and translation algorithms, but the story got more interesting when a magazine decided to do some testing. That's magazine tried the app out and found that it usually rendered the phrase as "black foreigner" without an issue... unless the message said something negative. Then you got the racial slur. Stranger still, when the app produced the slur, it no longer displayed which service had done the translation, a label that was clearly shown under most other translated messages.

That doesn't seem purposeful at all... really... 


Check out these screenshots.

[Screenshots via That's]

News of the flub hit social media and people had a ton to say about it:

[Embedded tweet]

Ron saw an article about it on the BBC and just had to chime in:

[Embedded tweet]

Anita pointed out that the phrase was already negative to begin with:

[Embedded tweet]

Will seems to think it's the "black foreigner" part that's the problem and prefers a more "literal" translation:

[Embedded tweet]

This person has some thoughts on why it's not so bad:

[Embedded tweet]

Do you think it's possible that the issue was just an algorithm error? Does the magazine's test sway your opinion on the matter? Let us know.


H/T: That's, Mashable, Twitter