At the Occupy City Hall demonstration in New York in early July, protesters posted signs demanding that photographers receive consent from the people they took pictures of.
Kim Vivar, an organizer for NYC Shut It Down, which has been putting together protests for racial justice since 2015, told BuzzFeed News, "Any protest that we organize, we would rather that mainstream media is not there, to be perfectly honest."
Organizers of other protests have also been wary of the news media. As demonstrations against police brutality have spread across the country, so too have calls to protect the visual identity of protesters from police and the news media, either by not taking photographs in which people are identifiable or by blurring their faces after the photo is taken.
"Please, don’t post peoples faces onto the internet where white supremacist and the NYPD/CIA/FBI alike will use those images violently against those people," a reader recently wrote BuzzFeed News.
It's a new kind of request, one that I have never seen in my 10 years as a photo editor, despite covering demonstrations at Standing Rock, the Women's March, and protests against the pandemic lockdowns.
BuzzFeed News' policy is to show unaltered images, with exceptions in certain cases to protect the identity of minors.
Other news organizations are receiving similar requests — and denying them as well.
"Blurring images is a form of photo manipulation that makes them less true, and is generally an unacceptable practice for documentary photography," wrote Kelly McBride, the public editor at NPR, in a statement on June 18.
Two days later, Brent Lewis, a photo editor at the New York Times, wrote in Wired that he opposed hiding protesters' faces: "If you’re taking photos of Black bodies, it’s crucial to know the history of the image when it comes to Black uprisings," he said. "Knowing that ensures you know that by hiding Black bodies, you aren’t avoiding the problem, you’re part of it."
But the requests have continued.
Noah Morrison, a cofounder of ICP Center Blackness Now, an activist group calling on photographers to prioritize Black life over their image-making, told BuzzFeed News that "posting images of Black protesters, especially by non-Black photographers, while not working to understand how the spread of these images across social media, particularly Instagram, has aided and continues to aid federal and municipal law enforcement agencies in tracking down many of these protesters, is unacceptable.”
There are two concerns about faces being photographed and shared: one about how the images of violence play into stereotypes and the other about how police might use them for surveillance or to make arrests.
As professor and visual journalist Tara Pixley wrote in Nieman Reports on July 13, "The conflicting stakes of the work done by photojournalists has never been higher: the importance of expansive and thorough reporting on these protests is both integral to the widespread recognition of the Black Lives Matter movement and as a site of potential state subjugation. Each image photographed, published, and circulated has the capacity to both inform a viewing public and inform the police."
Black communities are more heavily policed than white ones, according to public opinion surveys and academic studies. Some demonstrators fear showing their faces, in part stemming from rumors following the deaths of six activists in Ferguson, Missouri — three by suicide, one by fentanyl overdose, and two found dead in torched cars.
In 2015, Baltimore police used facial recognition technology to identify protesters with outstanding bench warrants and arrest them directly from the crowd.
"The thing is it's not as if we’re all a bunch of celebrities that signed up for this public life. We are people fighting for our lives, and we are fighting for our lives because we are not safe," Vivar said.
Vivar and other activists have asked social media users and the news media to blur protesters' faces, hoping it would make it harder for the technology police use to identify them. Experts said that while that technique may help, blurring doesn't totally solve the problem. Further complicating matters is the fact that facial recognition is regulated at the local level. Some cities, like San Francisco, have banned the use of facial recognition technology. Others, like Detroit, promote it. Congress has yet to pass any legislation related to facial recognition.
"One of the most important things the US really needs is a robust federal data law," Allie Funk, a senior research analyst for technology and democracy at Freedom House, told BuzzFeed News. "So many of these issues around surveillance, facial recognition, and how police are using these tools, we either don't know or what we do know is really scary."
In general, facial recognition technology works by running individual photos against large databases of images. The databases police use might be made up of driver's license photos or booking photos of people who have already been arrested. The facial recognition software produces a list of potential matches. Depending on the service used, the quality of the matches can be very low, especially for women and people of color.
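The matching step described above can be sketched as a nearest-neighbor search over face embeddings. This is an illustrative simplification, not any real vendor's system: the identity labels, the tiny three-dimensional vectors, and the similarity threshold below are all made up for the example.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank_matches(probe, database, threshold=0.8):
    """Rank database entries by similarity to a probe photo's embedding.

    `database` maps an identity label to a hypothetical embedding vector,
    standing in for, say, driver's license or booking photos.
    """
    scored = [(name, cosine_similarity(probe, emb))
              for name, emb in database.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    # Only candidates above the threshold are reported as potential matches;
    # a lower threshold yields more false positives, a higher one more misses.
    return [(name, score) for name, score in scored if score >= threshold]

# Toy example with made-up 3-dimensional embeddings.
db = {
    "person_a": [0.9, 0.1, 0.2],
    "person_b": [0.1, 0.8, 0.3],
    "person_c": [0.2, 0.2, 0.9],
}
probe = [0.85, 0.15, 0.25]
print(rank_matches(probe, db))  # ranks person_a as the closest match
```

Real systems use high-dimensional embeddings learned by neural networks, but the failure mode is the same in miniature: a probe that merely resembles a database entry can clear the threshold and surface as a "match."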
Jake Laperruque, senior counsel at the Constitution Project at the Project on Government Oversight, told BuzzFeed News that facial recognition technology is dangerous both when it works and when it doesn't.
"There are higher rates of misidentification for people of color. But even if we had facial recognition systems that had perfect matching capability and got the exact right person 100% of the time, it's still a really dangerous tool," he said.
But even if all the images were blurred, that would slow police identification efforts, not stop them completely — police could still use their own cameras, review closed-circuit television footage, or appeal to the public for their images. "With some of these free apps that just blur people's faces, the blur can be reversed," Jonathan Albright, research director at the Tow Center for Digital Journalism, told BuzzFeed News.
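One reason a naive blur can be "reversed," as Albright suggests, is that a deterministic blur can be matched rather than decoded: if the pool of candidate faces is small, an attacker can apply the same blur to each candidate and compare the results. A minimal sketch, using toy one-dimensional grayscale strips in place of real face images:

```python
def pixelate(pixels, block=2):
    """Pixelate a 1-D grayscale strip by averaging fixed-size blocks."""
    out = []
    for i in range(0, len(pixels), block):
        chunk = pixels[i:i + block]
        avg = sum(chunk) // len(chunk)
        out.extend([avg] * len(chunk))
    return out

def identify(blurred, candidates, block=2):
    """Re-identify a pixelated face by blurring each known candidate
    the same way and comparing — a matching attack, not an actual
    inversion of the blur."""
    for name, face in candidates.items():
        if pixelate(face, block) == blurred:
            return name
    return None

# Toy "faces": short strips of grayscale values (hypothetical data).
candidates = {
    "alice": [10, 20, 200, 210, 50, 60],
    "bob":   [90, 80, 30, 40, 220, 230],
}
published = pixelate(candidates["alice"])  # the "anonymized" image
print(identify(published, candidates))     # recovers "alice"
```

The same logic scales up: given a roster of suspects and knowledge of the blurring algorithm, the blur becomes a fingerprint rather than a shield, which is why experts recommend opaque boxes or cropping over reversible filters.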
"The published work of journalists is, at this point, a small drop in the bucket compared with all the rest of that stuff," Daniel Kahn Gillmor, a senior staff technologist at the American Civil Liberties Union, told BuzzFeed News. "And it has a whole set of additional concerns around free press, and actually telling the story of what’s going on with the protests."
As a photo editor, here's what I think: When protesters say "no photos," there are two considerations at play: The first is how those images portray their subjects. The second is how the images might be exploited by police, whether by facial recognition or not.
In some ways, it's no surprise that journalists bear the brunt of these requests: We make ourselves accessible in a way the police, companies that make facial recognition technology, and Facebook, which owns Instagram, don't. Yelling at journalists is a lot easier than yelling at those institutions.
And as the protests continue, and as norms and laws evolve, we're listening.
Emerson Sykes, an attorney at the American Civil Liberties Union, summed it up this way: "As much as people might be wishing that certain photographs weren’t published in certain outlets, I think everyone acknowledges that we wouldn’t be where we are, we wouldn’t have made the progress that we’ve made, without the videos, without photos, without social media, and the spread of information."
Additional reporting by Caroline Haskins.