This Survey Shows Americans Can't Agree On What Exactly "News" Is On Facebook
A new survey from BuzzFeed News and Ipsos Public Affairs also found that 54% of American adults trust news they see on Facebook “only a little” or “not at all.”
Facebook is an increasingly important source of news for American adults, but they can’t seem to agree on what exactly qualifies as “news” on the platform, and many remain skeptical of it as a source of trustworthy information, according to a new survey from BuzzFeed News and Ipsos Public Affairs.
The findings, based on a survey of nearly 3,000 American adults between March 23 and 28, suggest that news on Facebook is an area rife with confusion and contradictions for users. When combined with other recent data, it also highlights the differences between what people say and what they actually do when it comes to consuming and trusting news on Facebook.
Overall, 48% of respondents said Facebook was a major or minor source of news for them. Another 20% said it "rarely" was. The rest either said Facebook was never a source or they weren't familiar with the platform. The survey found that more than half of those who use Facebook as a news source — 54% — said they trust news on the platform “only a little” or “not at all.”
View a summary of the results here.
What content do people say is "news" on Facebook?
The survey asked those who do use Facebook as a source of news to identify which types of content on Facebook they “consider news.” Seventy percent said they considered “content from traditional media sources (i.e. CNN, New York Times, etc.) shared on their pages” to be news — the highest percentage of any option. This also means nearly a third of those surveyed don’t consider news from actual news organizations to be news when it appears on that outlet’s Facebook page.
A total of 51% of respondents said that content from a traditional outlet shared by one of their friends was news.
“When asked what they consider to be ‘news’ on Facebook, most people focus on traditional outlets like CNN or the New York Times,” said Ipsos researcher Chris Jackson. “However, there are some clear differences in perception when it comes to stories published on Facebook by traditional media and traditional news stories shared by friends.”
Just 31% of people said “content from non-traditional news sources (i.e. BuzzFeed, VICE, Occupy Democrats, Breitbart, etc.)” shared on the outlet’s own Facebook page is news, and 26% said “Status updates from my Facebook friends” is news. Non-traditional news outlets put significant effort into spreading their content on Facebook, yet the vast majority of American adults surveyed don’t consider it to be news when it appears on that platform.
The bottom line is that while almost half of the American adults surveyed said Facebook is a major or minor source of news for them, there is nothing close to agreement about what “news” actually is on the platform.
This question also saw a divergence in the responses from Republicans and Democrats. Only 61% of Republicans said they consider content from traditional sources shared on the outlet’s Facebook page to be news. That compared to 77% of Democrats.
Deciding what news to trust on Facebook
Respondents were also asked to indicate how important various factors are when determining the trustworthiness of news on Facebook. Eighty-three percent said the news source was very or somewhat important, the highest response. That compared to 71% who said that their familiarity with the specific news story was very or somewhat important, and 63% who placed that degree of importance on the person who shared it.
It’s important to view these responses in context given the findings of a recent study from the Media Insight Project. That study created an experiment to test user behavior and trust on Facebook and found that the sharer of a given story mattered more than the news source.
“Whether readers trust the sharer, indeed, matters more than who produces the article — or even whether the article is produced by a real news organization or a fictional one,” according to the study.
Tom Rosenstiel, executive director of the American Press Institute, which helped run the experiment, says Facebook users say and do two different things when it comes to evaluating the trustworthiness of news on the platform.
“People often say what they think they believe in surveys, or what is socially responsible,” he told BuzzFeed News. “Experiments test real behavior if done right. So our experiment found people were deluding themselves.”
Why people don’t read or trust news on Facebook
The survey also sought to understand why some people don’t use Facebook as a news source. Of the 1,377 respondents who said they rarely or never used the platform for news, 41% said they “mostly use Facebook to keep up with friends and family,” the top response. The next most popular answer, at 33%, was “I prefer other news sources.” A third of respondents said they “don’t trust news on Facebook.” This suggests many people still consider Facebook to be more suited for personal connections and communications than news, while others have trust issues with the information they get on the platform.
The survey also asked those who trust news on Facebook “only a little” or “not at all” to indicate why. Two-thirds said one reason was that “anyone can post content that looks like news on Facebook” — the most popular response. Forty-four percent said they don’t trust news on social media in general.
And in a sign of how much concerns about misinformation on Facebook resonate with American adults, 42% selected “Facebook doesn’t do a good job of removing fake news.”
The survey also revealed a notable gap between Democrats and Republicans when it came to concerns about censorship on Facebook. Twenty percent of Republicans said they don’t trust news on the platform because “Facebook censors some news.” But only 8% of Democrats selected that option.
This could be a result of the scandal from last year when a former curator for Facebook’s Trending product claimed that conservative news and sources had been suppressed from the Trending list. The allegation was never fully proven, but it subjected Facebook to significant backlash from conservative media leaders.
Respondents were also asked if they mistrust news on Facebook because of the role an algorithm plays in choosing content, or because Facebook does not have human editors. Neither appeared to be a significant concern. Only 15% of respondents said they don’t trust news on Facebook due to the role of an algorithm, and even fewer, 11%, cited concerns about the lack of human editors.