Google and Microsoft have agreed to block images of child abuse through their search engines.
Google and Microsoft's Bing account for 95% of all search traffic, and the two companies have worked together to create new algorithms that block search results for child abuse material. The BBC reports that at least 100,000 search terms that previously generated illegal images and material will now produce zero search results, and will also trigger a warning that images of child abuse are illegal.
Google's Executive Chairman Eric Schmidt wrote about the changes in The Daily Mail, saying that Google has over 200 people working on the problem and that Microsoft deserves "a lot of credit" for developing and sharing its picture detection technology.
We've fine tuned Google Search to prevent links to child sexual abuse material from appearing in our results. While no algorithm is perfect – and Google cannot prevent paedophiles adding new images to the web – these changes have cleaned up the results for over 100,000 queries that might be related to the sexual abuse of kids.
Schmidt added that YouTube engineers have also created a new technology that will identify child abuse videos, which they hope to make available to other internet companies and child safety organisations in the new year.
U.K. Prime Minister David Cameron called for the two companies to block such results earlier this year, saying that he "did not accept" the companies' argument that such search results could not be blocked.
Cameron has praised Google and Microsoft's announcement, saying that both companies had "come a long way."
The new restrictions will apply to English-speaking countries, and will expand to more than 150 other languages over the next six months.
Cate Sevilla is the UK managing editor for BuzzFeed and is based in London.