WhatsApp Fueled A Global Misinformation Crisis. Now, It’s Stuck In One.

A new privacy policy will be delayed by three months after widespread confusion about what it would mean.

Hours after WhatsApp announced a new privacy policy to the nearly 2 billion people around the world who use it, the rumors flew thick and fast.

“Don’t accept WhatsApp’s new policy,” said one of the messages that went viral on the platform. “Once you do, your WhatsApp account will be linked to your Facebook account and Zuckerberg can see all your chats.”

“In a few months, WhatsApp will launch a new version that will show you ads based on your chats,” said another one. “Don’t accept the new policy!”

Thousands of similar messages went viral on WhatsApp, the instant messaging app owned by Facebook, in the days that followed. Egged on by celebrities like Tesla CEO Elon Musk and whistleblower Edward Snowden, millions of people rushed to download WhatsApp alternatives like Signal and Telegram.

There was just one problem: the 4,000-word policy made clear that the changes applied only when people used WhatsApp to chat with businesses, not to their private conversations with friends and family.

No, the new terms would not let Facebook read your WhatsApp chats, the company explained to anyone who asked. Top executives posted long threads to Twitter and gave interviews to large publications in India, the company’s largest market. WhatsApp spent millions buying front-page ads in major newspapers and released graphics debunking the rumors on its website with a large “Share to WhatsApp” button, hoping to inject some truth into the stream of misinformation coursing through its platform. The company also encouraged Facebook employees to share these infographics, according to posts to its internal message board Workplace.

"There has been a great deal of misinformation and confusion so we’re working to provide accurate information about how WhatsApp protects people’s personal conversations," a WhatsApp spokesperson told BuzzFeed News. "We’re using our Status feature to communicate directly with people in WhatsApp, as well as posting accurate information to social media and our website in dozens of languages. Of course we’ve also made these resources available to people who work at our company so they can answer questions directly to friends and family if they wish."

None of it worked.

“There’s been a lot of misinformation causing concern and we want to help everyone understand our principles and the facts,” WhatsApp wrote in a blog post last week announcing that the company would delay the new privacy policy by three months. “We’re also going to do a lot more to clear up the misinformation around how privacy and security works on WhatsApp,” it wrote.

Thank you to everyone who’s reached out. We're still working to counter any confusion by communicating directly with @WhatsApp users. No one will have their account suspended or deleted on Feb 8 and we’ll be moving back our business plans until after May - https://t.co/H3DeSS0QfO

For years, rumors and hoaxes spreading through WhatsApp have fueled a misinformation crisis in some of the world’s most populous countries, such as Brazil and India, where the app is the primary way most people talk with each other. Now, that crisis has reached the company itself.

“Trust in platforms is [at a] rock bottom,” Claire Wardle, cofounder and director of First Draft, a nonprofit organization that researches misinformation, told BuzzFeed News. “We’ve had years of people becoming increasingly concerned about the power of technology companies, particularly an awareness of how much data they are collecting on us. So when privacy policies are changed, people are rightly concerned about what that means.”

Wardle said people are concerned that WhatsApp would connect their behavior on the app with the data from their Facebook accounts.

“Facebook and WhatsApp have a huge trust deficit,” said Pratik Sinha, founder of Alt News, a fact-checking platform in India. “Once you have that, any kind of misinformation attributed to you is consumed readily.”

What doesn’t help, both Sinha and Wardle added, is the lack of understanding among regular people of how technology and privacy work. “Confusion is where misinformation thrives,” said Wardle, “so people saw the policy changes, leapt to conclusions, and unsurprisingly, many people believed the rumor.”

These patterns of misinformation that have thrived on WhatsApp for years have often led to harm. In 2013, a video that allegedly showed two young men being lynched went viral in Muzaffarnagar, a city in northern India, inciting riots between Hindu and Muslim communities in which dozens of people died. A police investigation found that the video was more than two years old and hadn’t even been shot in India. In Brazil, fake news flooded the platform and was used to boost the far-right candidate Jair Bolsonaro, who won the country’s 2018 presidential election.

But the company didn’t address its misinformation problem seriously until 2018, when rumors about child kidnappers that swept through the platform led to a series of violent lynchings across India. In a statement released at the time, India’s IT ministry warned WhatsApp of legal action and said the company would be “treated as abettors” if it didn’t solve the problem, sending WhatsApp into crisis mode. It flew top executives from the company’s Menlo Park, California, headquarters to New Delhi to meet with government officials and journalists, and ran high-profile awareness campaigns around misinformation.

It also built new features into the app to directly counter misinformation for the first time, such as labeling forwarded messages and restricting the number of people or groups a message could be forwarded to, in order to slow the spread of viral content. In August last year, it also started letting people in a handful of countries upload the text of a message to Google to check whether a forward was fake. The feature isn’t available to WhatsApp users in India yet.

Since 2019, the company has also been working on a tool that would let users search images they receive in the app with a single tap, a move that would help people fact-check more easily. But nearly two years later, there is no sign of the feature, although a text-based version is available in over a dozen countries, which so far do not include India.

“We are still working on the search tool feature,” a WhatsApp spokesperson told BuzzFeed News.

WhatsApp said the company wanted to provide more clarity around its new privacy policy. “We wish to reinforce that this update does not expand our ability to share data with Facebook. Our aim is to provide transparency and new options available to engage with businesses so they can serve their customers and grow,” the spokesperson said. “WhatsApp will always protect personal messages with end-to-end encryption so that neither WhatsApp nor Facebook can see them. We are working to address misinformation and remain available to answer any questions.”

This week, the company put a Status message, WhatsApp’s equivalent of a Facebook story, at the top of people’s Status section. Tapping on the Status revealed a series of messages from the company debunking the rumors.

“WhatsApp doesn’t share your contacts with Facebook,” the first one said. Two more Status updates clarified that WhatsApp can’t see people’s location and can’t read or listen to encrypted personal conversations. “We are committed to your privacy,” the last message said.

On Thursday, employees had multiple questions for Facebook CEO Mark Zuckerberg ahead of a weekly Q&A, according to internal communications viewed by BuzzFeed News. Some wanted to know whether the growing move to Signal and Telegram was affecting WhatsApp's usage and growth metrics. Others wanted the CEO to address whether or not Facebook used any metadata from WhatsApp to serve ads.

"Do you think we could have done a better job of clearly explaining [the new privacy policy] to users?" someone asked.

"Public is enraged @ WhatsApp PrivPolicy changes," another person commented. "Distrust in FB is so high we should be more careful about this."

Zuckerberg responded by saying that he did not think the company had handled the changes well.

“The short answer is no, I don't think we handled this as well as we should have,” he said. “And I think the team has already engaged in everything that's— and has a number of lessons in order to make sure that we do a better job going forward, not just on WhatsApp TOS's. But you know, we have we have other TOS updates for different apps and services. And we need to make sure we do better on those too. So that way, we minimize the amount of misinformation that gets that gets created — and the amount of — and minimize the amount of confusion that gets created.”

Ryan Mac contributed reporting.


