Facebook Is Testing Auto-Responses For Live Video And Some Of The Suggestions Are Offending People

"So I’m just noticing that Facebook has a thoughts and prayers autoresponder on our Chicago Hospital shooting livestream and I have thoughts."

Facebook appears to be testing a new tool that prompts users to comment on live video streams — including those involving sensitive situations like shootings and sexual assault — using suggested text and emojis.

On Monday, a handful of Facebook users noticed that the social media platform was offering them preset responses for live videos about a series of news stories. On one stream from MSNBC about an ongoing, officer-involved shooting at a Chicago hospital, NBCUniversal contractor Stephanie Haberman noticed Facebook was prompting her to comment with phrases like “this is so sad” and “so sorry,” along with emoji including the praying hands.

“So I’m just noticing that Facebook has a thoughts and prayers autoresponder on our Chicago Hospital shooting livestream and I have thoughts,” Haberman tweeted along with photos of the suggested responses from Facebook. She declined to comment for this story.

While autoreply prompts are not an entirely new concept for Silicon Valley products — Google’s Gmail recently unveiled a pre-populated response tool called “Smart Reply” and Instagram sometimes suggests emoji responses — this appears to be the first time Facebook has tested the tool on live video, where content can be sensitive, unpredictable, and sometimes violent. And while Facebook seems to be aiming to improve engagement on live video, critics have called the prompts insensitive and further evidence that the company has not thought through the human impact or consequences of its products.


Following publication of this story, a Facebook spokesperson confirmed the company had been testing a suggested-comment feature on live videos. "Clearly this wasn’t implemented properly and we have disabled this feature for now,” the spokesperson said in a Tuesday email to BuzzFeed News.

BuzzFeed News examined other Facebook livestreams on Monday and found that the social media platform was testing prompted responses on a variety of videos, including ones from local news outlets, the shopping network QVC, and gamers. On one video from Phoenix’s Fox 10 station about a sexual assault and possible shooting in a Catholic supply store, Facebook’s algorithm suggested that the user comment with “respect” or “take care.” On a different stream about the Chicago hospital shooting from NBC News, the suggested responses included a crying tears of joy emoji and another making a kissing face.

In testing the product, a BuzzFeed News reporter only had to click a suggested response once for it to appear in the comment feed of a given live video. Once a response was selected, the prompted-comment menu disappeared. It’s unclear when Facebook rolled out the prompted response tool on live videos, or how widely available it is.

“Facebook has bigger things to worry about right now than rolling out response prompts on live video,” Caroline Sinders, principal designer at Convocation Design+Research, told BuzzFeed News. “And given that it’s suggesting inappropriate responses, I would say it’s probably best to turn it off now or allow users to turn it off.”

Sinders, a former fellow at the BuzzFeed Open Lab, explained that offering autoreplies on live video can be especially difficult given that current machine learning technology has a hard time “sussing out context in video and audio, just as it does with text.” For example, since it debuted live video in December 2015, Facebook has struggled with using algorithms to filter out violent content from users’ feeds.

Facebook’s prompted reply tool also appeared on live video for the QVC shopping channel; it suggested that users comment “pretty” and “cute” as two hosts showed off a dress. On a livestream of a gamer playing Battlefield V, the feature suggested that viewers greet others with “yo” and “hey again!”

The most frequent blunders, however, seemed to happen on live news segments. During an ABC7 stream of a police pursuit in Los Angeles, Facebook’s algorithm suggested some questionable responses to users, including “Go” and “Agree.”

A source close to NBCUniversal said the company had never seen the prompts before and that its news outlets had not opted in to them.

“This wins for most dystopian thing I've seen all day (and I live in the smoke-drenched Bay Area where everyone is wearing masks, so that's saying a lot),” one person tweeted in response to seeing screenshots of prompted responses on Twitter.

UPDATE

This story has been updated with comment from Facebook.
