TikTok ‘tried to filter out videos from ugly, poor or disabled users’


TikTok moderators were told to suppress videos from users who appeared too ugly, poor or disabled, as part of the company’s efforts to curate an aspirational air in the videos it promotes, according to new documents published by the Intercept.

The documents detail how moderators for the social video app were instructed to select content for the influential “For You” feed, an algorithmic timeline that is most users’ first port of call when they open the app. As a result, being selected for For You can drive huge numbers of views to a given video, but the selection criteria have always remained secret, and little is known about how much of the process is automated.

TikTok’s moderators were instructed to exclude videos from the For You feed if they failed on any one of a number of categories, the documents show. Users with an “abnormal body shape (not limited to: dwarf, acromegaly),” who are “chubby … obese or too thin” or who have “ugly facial looks or facial deformities” should be removed, one document says, since “if the character’s appearance is not good, the video will be much less attractive, not worthing [sic] to be recommended to new users.”

Similarly, the documents show, videos were to be removed from the feed if “the shooting environment is shabby and dilapidated”, since “this kind of environment is … less fancy and appealing”.

A TikTok spokesperson said the goal of the policy was to prevent bullying on the platform, tying the document to a report from December that revealed the company was suppressing vulnerable users’ videos in a misguided effort to prevent them from attracting attention that could turn sour. However, the categories of video suppressed in the latest document are far broader than those revealed in December, and the document makes no mention of bullying, a discrepancy the company attributes to a local interpretation of the wider policy.

Other documents published by the Intercept show the extent of TikTok’s former rules requiring moderators to enforce Chinese foreign policy overseas. The site published the company’s livestreaming policies, which instruct moderators to take down “controversial content” that promotes Taiwanese independence or amounts to “uglification” of history, such as references to the Tiananmen Square “incidents”.

The language is identical to that used in documents first reported by the Guardian in September 2019. At the time, TikTok said the documents were old, and had been out of use since May that year, but the Intercept cites a source who indicated that the policies “were in use through at least late 2019”.

In a statement, TikTok said:

“The livestream guidelines in question appear to be largely the same or similar to the guidelines the Guardian already reported on last year, which were removed both before the Guardian’s reporting and also prior to when the Intercept says the document was accessed. Over the past year, we have established trust and safety hubs in California, Dublin and Singapore, which oversee development and execution of our moderation policies and are headed by industry experts with extensive experience in these areas. Local teams apply the updated community guidelines that we published in January, all aimed at keeping TikTok a place of open self-expression and a safe environment for users and creators alike.

“Most of the guidelines the Intercept presented are either no longer in use, or in some cases appear to have never been in place, but it is correct that for live streaming TikTok is particularly vigilant about keeping sexualised content off the platform.”

TikTok, which is owned by Beijing-based tech unicorn Bytedance, has been trying to separate its international efforts from its Chinese home for at least a year. But many staff, including moderators, are still based in China.

According to the Wall Street Journal, the last of those China-based moderators will shortly be shifted to other work inside the company, with locally based moderators picking up the slack. TikTok had already succeeded in moderating all US content from outside China, but relied on China-based moderators for many other countries, in particular to provide 24-hour cover for much of Europe.


