TikTok on Tuesday unveiled a slew of changes to its community guidelines that it says are meant to promote “safety, security, and well-being” on the popular social video app.
Among the changes, TikTok is taking a stricter approach to dangerous acts and challenges such as suicide hoaxes. Its policy around such issues, first outlined in November, will now be highlighted in a separate category of the community guidelines to make it easier for people to find. It previously fell within TikTok’s suicide and self-harm policies. TikTok is also launching videos that encourage people to “stop, think, decide and act” when they see risky online challenges.
TikTok is also explicitly banning deadnaming, misgendering and misogyny in the update to its community guidelines. Deadnaming is when someone, intentionally or not, uses a transgender person’s former name without that individual’s consent. GLAAD calls it “an invasion of privacy that undermines the trans person’s true authentic identity, and can put them at risk for discrimination, even violence.”
TikTok will also ban content that supports or promotes the scientifically discredited practice of so-called conversion therapy, in which therapy is used to try to “convert” LGBTQ people. Major medical and mental health organizations, including the American Medical Association and the American Psychological Association, have denounced the practice, and a number of cities and states have laws designed to protect people from it.
“Though these ideologies have long been prohibited on TikTok, we’ve heard from creators and civil society organizations that it’s important to be explicit in our Community Guidelines,” Cormac Keenan, TikTok’s head of trust and safety, wrote in a blog post.
The updated community guidelines also broaden TikTok’s ban on content that promotes eating disorders, and they expand policies focused on the security and integrity of the app.
TikTok and other social media platforms have for years faced criticism for harboring harmful content and fostering anxiety and depression, particularly among teens and younger audiences. Last year TikTok, alongside YouTube and Snap, faced scrutiny from lawmakers during a Senate hearing focused on keeping kids safe online. In December, TikTok said it was making changes to how it recommends videos on the app, in an attempt to make sure it doesn’t inadvertently reinforce negative experiences.
TikTok said the guideline changes will be implemented over the next few weeks.
If you’re struggling with negative thoughts or suicidal feelings, resources are available to help. In the US, call the National Suicide Prevention Lifeline at 1-800-273-8255; in the UK, call the Samaritans at 116 123; and in Australia, call Lifeline at 13 11 14. Additionally, you can find help at these 13 suicide and crisis intervention hotlines.