TikTokers Must Now Opt In To Search Results That Could Contain “Graphic Or Distressing” Videos


TikTok’s director of policy in the U.S., Tara Wadhwa, outlined several new tools to promote user well-being in a corporate blog post today.


While TikTok doesn’t allow content that promotes or normalizes suicide, self-harm, or eating disorders, it has now rolled out issue-specific guides to support those who wish to share their experiences with the aforementioned issues in hopes of raising awareness.


A guide for suicide prevention will live within TikTok’s online Safety Center and was developed alongside the International Association for Suicide Prevention, the Crisis Text Line, Live For Tomorrow, and Samaritans of Singapore.

Additionally, when users search for certain words or phrases like ‘#suicide’, TikTok will now direct them to local support resources within the app, such as the Crisis Text Line helpline. With respect to eating disorders, TikTok earlier this year rolled out comparable tools when users search for pertinent terms, and the company says it also began running public service announcements alongside popular hashtags like #WhatIEatInADay.


A new Safety Center guide on eating disorders has been developed alongside the National Eating Disorders Association, the National Eating Disorder Information Centre, the Butterfly Foundation, and Bodywhys.


Finally, TikTok will start to cover search results when users are looking for terms that could surface “graphic or distressing” content, Wadhwa explains, such as ‘scary makeup’. In these instances, users must tap ‘show results’ in order to see the ensuing content. This kind of graphic content is already ineligible to appear on TikTok’s main ‘For You’ page, the company says, and similar warnings already exist atop individual videos that could be distressing.