TikTok allegedly asked its moderators to censor content from people deemed “too ugly, poor or disabled” as part of the platform’s efforts to attract new users, a report by The Intercept stated.
The Intercept, an online news publication, released new documents highlighting the Chinese video sharing platform’s livestream guidelines that were given to the app's moderators.
The documents describe how moderators would choose content to appear on the ‘For You’ feed, the page users generally see when they first open the app.
TikTokers usually aim to appear on this page, as it can bring a large number of views to their videos. However, the documents recently published by The Intercept have drawn criticism of the platform's selection criteria.
Previously, the selection strategy was a secret, with little known about the amount of moderation and automation involved.
The report stated: “Moderators were explicitly told to suppress uploads from users with flaws both congenital and inevitable. ‘Abnormal body shape,’ ‘ugly facial looks,’ dwarfism, and ‘obvious beer belly,’ ‘too many wrinkles,’ ‘eye disorders,’ and many other ‘low quality’ traits are all enough to keep uploads out of the algorithmic fire hose.”
The instructions were not limited to the people featured in the videos; guidelines about the places where the clips were shot were also sent to moderators: “Videos in which ‘the shooting environment is shabby and dilapidated,’ including but ‘not limited to … slums, rural fields’ and ‘dilapidated housing’ were also systematically hidden from new users, though ‘rural beautiful natural scenery could be exempted,’ the document notes.”
TikTok issued a statement to Gulf News in response: "The livestream guidelines in question have been removed. Over the past year, we have established Trust and Safety hubs in California, Dublin and Singapore, which oversee development and execution of our moderation policies and are headed by industry experts with extensive experience in these areas. Local teams apply the Community Guidelines that we published in January, all aimed at keeping TikTok a place of open self-expression and a safe environment for users and creators alike. The other policies mentioned represented an early blunt attempt at preventing bullying. We recognize that this was not the correct approach and have ended it. In January, we rolled out our new Community Guidelines to provide greater clarity, and last week we announced a transparency centre for our moderation practices.”