YouTube, Facebook, Twitter, Instagram, Periscope, and their peers have built communications tools that have empowered everyday people to broadcast to millions of others, all around the world, in an instant. Yet that very ability comes with extreme consequences when evil people decide to harness those same tools to increase the notoriety they seek for their own horrible actions. That happened today, in dramatic fashion.
Not long after gunning down a TV reporter and cameraman in Virginia this morning, the suspected shooter, Vester Lee Flanagan II, took to Twitter to promote his first-person video of the murders: "I filmed the shooting see Facebook," he tweeted. Moments later, he posted the videos on Twitter, too.
It didn't take long for both Twitter and Facebook to remove the videos and suspend Flanagan's accounts. But by the time they did, many people reported having seen the videos unwittingly as the clips spread across both platforms via their sharing mechanisms.
A fact of life on the internet is that content will spread no matter where it's posted. The videos of this shooting, if posted on a random website, or even mailed to a news organization, would likely have been viewed thousands of times — even if they were kept off social media completely.
But autoplay video, a relatively new feature for both Twitter and Facebook, turned both social platforms into attractive conduits for a person looking to force others to witness his acts of violence. Today's events put both companies in awkward positions, raising questions about how they'll treat a feature that makes their products more dynamic and is loved by the advertisers who pay their bills.
A Twitter spokesperson declined to comment but pointed BuzzFeed News to the company's media policy, which says video marked as "sensitive" does not autoplay. Facebook, too, declined to say more than this: "We have removed a profile and a Page for violating our Community Standards."
Though it's difficult to draw lines through the behavior of individuals who carry out these attacks, it's established that many killers study the actions of those who came before them and, in many cases, emulate them. Today's suspected killer was effective in spreading the video of his violent act through social channels — and, indeed, in forcing others to experience the carnage — making those channels a more attractive distribution vehicle for the inevitable killers who will come next.
Some will ask whether the social platforms are doing enough by waiting for people to flag sensitive content before it is removed, but there's likely no way for them to act more quickly as long as they keep autoplay. And since the prospect of autoplay going away is minimal — and there are good arguments for keeping it — we're likely left with a situation in which depraved people will exploit the feature in ways similar to what we saw today.