Safari’s “Siri Suggested” Search Results Highlighted Conspiracy Sites And Fake News
The Siri Suggested recommendation feature inside Safari promoted Pizzagate videos, Holocaust denier articles, and debunked race science posts.
Apple’s Safari, one of the internet’s most popular web browsers, has been surfacing debunked conspiracies, shock videos, and false information via its “Siri Suggested Websites” feature. Such results raise questions about the company’s ability to monitor for low-quality information, and provide another example of the problems platforms run into when relying on algorithms to police the internet.
As of yesterday, if you typed “Pizzagate” into Apple’s Safari, the browser’s “Siri Suggested Website” prominently offered users a link to a YouTube video with the title “PIZZAGATE, BIGGEST SCANDAL EVER!!!” by conspiracy theorist David Seaman (the video doesn’t play, since Seaman’s channel was taken down for violating YouTube’s terms of service). The search results appeared on multiple versions of Safari.
Apple removed all examples of the questionable Siri Suggested sites provided to it by BuzzFeed News. In a statement, the company said:
“Siri Suggested Websites come from content on the web and we provide curation to help avoid inappropriate sites. We also remove any inappropriate suggestions whenever we become aware of them, as we have with these. We will continue to work to provide high-quality results and users can email results they feel are inappropriate to email@example.com.”
Safari isn’t the only browser to try to anticipate its users’ searches; Google has long delivered autocomplete search suggestions, which have occasionally been gamed or surfaced inappropriate content. However, Safari’s Siri Suggested Website feature goes one step further, autocompleting and suggesting a site for users to visit. Frequently, Siri Suggested dials up a Wikipedia page (as it does when you search for Apple CEO Tim Cook).
But when BuzzFeed News entered incomplete search terms that might suggest contentious or conspiratorial topics, the search algorithms directed us toward low-quality websites, message boards, or YouTube conspiracy videos rather than reliable information or debunks of those topics. Google, by contrast, did not feature such unreliable pages in its top search results for the same terms.
Those suggested results matter since Safari is one of the internet’s most popular web browsers — some estimates suggest it has captured over 10% of the browser market share.
Other searches for conspiracies or popular fake news tropes returned similarly low-quality results.
The browser also surfaced the increasingly popular QAnon conspiracy theory. Typing “QAnon is real” into the search bar delivered an autocomplete for a YouTube video with the title “The Calm Before the Storm – QAnon Is the Real Deal.”
Google’s search results for the same phrase surfaced a number of articles debunking the conspiracy theory.
Safari’s autocomplete suggestion for “Hillary Clinton murder” surfaced a shoddy webpage about an alleged FBI cover-up related to the death of former deputy White House counsel Vince Foster. Google’s top result was a fact-check.
A search for “the Holocaust didn’t happen” (a well-known and debunked conspiracy theory) returned a link to a Holocaust denier page on the website 666ismoney.com.
Back in 2016, Google faced criticism for algorithmically promoting search results for Holocaust denier sites. Those results have since been fixed.
BuzzFeed News found a number of other examples of Siri Suggested Websites surfacing debunked or conspiratorial information on topics including race science and vaccines.
For the same search phrases, Google offered users resources from government sources, like the CDC, in its results.
The list goes on.
Safari’s autocomplete for “whites are smarter th” delivered an answers.com user-generated page arguing that “God made white people, blacks came from monkeys.”
The Siri Suggested Websites feature also surfaced a link to Alex Jones’ Infowars site while autocompleting a search about Hillary Clinton. The linked page claims to reveal the “real reason Hillary is attacking the alt-right.”
The Siri Suggested problem seems to stem from what researchers call a “data void,” which is what happens when a term doesn’t have “natural informative results” and manipulators seize upon it. Many of the sites surfaced by the Siri Suggested feature came from conspiracy or junk sites hastily assembled to fill that void.
This post was updated with additional comment from Apple.