TikTok’s addictive algorithm means that the endless scroll of the FYP can get both too specific and too sinister way too fast. Given the challenging relationship to food and diet that lockdown fostered for many, the app has seen a huge uptick in videos that at first promoted healthy eating or at-home workouts, but quickly became a home for triggering content that promotes disordered eating and, frequently, damaging starvation diets.
Now, the video platform is finally announcing a formal investigation into these accounts and clips, following a report by The Guardian that showed how prolific the content had become. In the report, The Guardian highlighted accounts that promoted low-calorie “safe” foods to bulimic users, as well as those that provided lists of “tips” for unhealthy weight loss.
As a result, TikTok banned six accounts that had violated the platform’s “community guidelines”, which prohibit posting content that promotes disordered eating. However, at present, unlike platforms such as Tumblr and Instagram, searching for terms or content related to eating disorders does not automatically refer the user to mental health charities where they can access help.
TikTok, it should be noted, had already taken some steps to address the app’s ongoing problem with pro-anorexia videos. Earlier this year it banned advertisements for weight-loss products and blocked hashtags traditionally associated with eating disorder content. However, as The Guardian points out, it was still relatively easy to bypass these restrictions with deliberate misspellings of the offending terms, a practice already common on TikTok for words related to suicide, sex or violence.
“As soon as this issue was brought to our attention, we took action banning the accounts and removing the content that violated those guidelines, as well as banning particular search terms,” a spokesperson for TikTok told The Guardian. “As content changes, we continue to work with expert partners, update our technology and review our processes to ensure we can respond to emerging and new harmful activities.”
For a platform that has grown as rapidly as TikTok — this year it reached the milestone of half a billion users, just four years after its launch in September 2016 — and which relies on users to generate their own content, it’s perhaps inevitable that problems like this will arise. But it’s also long overdue for the app itself to take responsibility for moderating content that potentially endangers the health of its overwhelmingly young user base.
Whether the investigation it has now begun will finally undo the damage done by the proliferation of pro-anorexia content on the app over the past year remains to be seen, but it’s at least a step in the right direction.