TikTok is warning users to be on the lookout for videos of a man killing himself that are spreading on the social media platform.
The suicide video has been circulating on the app since at least Sunday night, TikTok spokesperson Hilary McQuaide told BuzzFeed News.
"Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide," she said.
"We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who've reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family."
The video comes from a Facebook Live broadcast in which a Mississippi man killed himself last week.
Trolls are also inserting sections from the video into other seemingly harmless clips in an effort to trick people into watching it.
Some TikTok users have been posting videos warning others about the footage, sharing a screenshot from it (a bearded man sitting at a desk) so viewers know what to look out for.
Unlike other apps where users must subscribe to or befriend others to see their content, TikTok users frequently encounter videos from people they do not follow via their For You pages.
TikTok's efforts to remove the video were first reported Sunday by The Verge.
This is by no means the first suicide to be broadcast on Facebook. In 2017, BuzzFeed News found at least 45 instances of violence — suicides, shootings, murders, torture, and child abuse — that had been streamed via Facebook Live since the feature launched in December 2015.
Facebook now uses artificial intelligence to identify posts from users indicating thoughts of suicide or self-harm.
In the past, websites like Reddit have also come under fire for not acting quickly enough to remove videos of suicide or other violent acts.