Millions of people in more than 100,000 US communities use Nextdoor, the app that allows neighbors to chat about everything from yard sales to bike theft, but which has also struggled with a reputation as a hub for racial profiling.
Last August, Nextdoor proudly announced a solution to this problem: an algorithmic form that prevents people from racially profiling, that is, making posts about crime and safety that focus on an individual’s race and nothing else. But almost nine months later, Nextdoor still hasn’t patched holes in its anti-racial-profiling system. In fact, until earlier this month, the company hadn’t even started rolling out the feature on mobile. As a result, racial profiling — which has the potential to put real neighbors in danger — continues to be a problem on Nextdoor.
Laurie Bertram Roberts, who lives with her seven kids in a majority-white neighborhood in Jackson, Mississippi, noticed two months ago that one of her neighbors had posted on Nextdoor under the heading “Beware.” The message warned that two black men were going door to door in the neighborhood asking if they could cut grass for money. “May be harmless,” the message poster said. “Just be wary of letting them inside!” An hour after the post went up on Nextdoor, another neighbor who saw the men walking up someone’s driveway called the police.
The incident incensed Roberts, who is black, and she replied to the post. “You just painted every pair of black males in [the neighborhood] as suspects, including my son and his friends who may be on their way to the store minding their business,” she wrote, according to screenshots reviewed by BuzzFeed News. “We have a right to walk around without being deemed suspects because 2 dudes scared you.”
A member of the Jackson Police Department wasn’t able to locate any additional information about this incident, but said going door to door looking for work isn’t a criminal activity.
Nextdoor has been lauded by local officials in Oakland for its efforts to stop racial profiling. It even won an award from the Bay Area chapter of 100 Black Men, a national civic organization for professional black men. “We created the company because we believe in bringing people together,” said CEO Nirav Tolia in a CBS This Morning interview last year. “In terms of racism, it’s one of the most divisive things in our community today. We want to be part of that solution.”
But a BuzzFeed News review of various local Nextdoor groups suggests racial profiling remains common on the platform.
For example, here’s a post that a user near St. Louis saw in January.
Here’s another one, in the Bay Area in December:
And here’s a third example from Florida, shared on Twitter in February. Nextdoor’s CEO responded to this post, saying in a tweet that the “work in this area will never be done.”
Part of the reason the problem persists is that racism is pervasive and pernicious, and no piece of software is going to stop it. But while Nextdoor’s algorithmic solution certainly forces users to stop and think about the role race plays in their analysis of a suspicious situation, it was deployed with significant weak points that have made it less effective than it originally seemed.
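To make the mechanism concrete: Nextdoor has not published its code, but based on the article’s description — a form that blocks crime and safety posts focusing on a person’s race “and nothing else” — the check might resemble the following toy sketch. The word lists, the two-descriptor threshold, and the function name are all illustrative assumptions, not Nextdoor’s actual implementation.

```python
# Toy illustration (NOT Nextdoor's real code) of the kind of check the article
# describes: a post that mentions someone's race must also contain other
# identifying details before it can be published.

RACE_TERMS = {"black", "white", "asian", "hispanic", "latino"}
OTHER_DESCRIPTORS = {"jacket", "hat", "jeans", "shirt", "tall", "short",
                     "beard", "glasses", "backpack", "hoodie"}

def passes_profiling_check(post_text: str) -> bool:
    """Return False when a post mentions race and nothing else descriptive."""
    words = {w.strip(".,!?:").lower() for w in post_text.split()}
    mentions_race = bool(words & RACE_TERMS)
    other_details = len(words & OTHER_DESCRIPTORS)
    # If race is mentioned, require at least two additional descriptors
    # before the post can go through (the threshold here is a guess).
    return (not mentions_race) or other_details >= 2
```

Under this sketch, a post like the one Roberts saw ("two black men going door to door") would be blocked, while a description built on clothing or physical details would pass — which also makes visible the gaps the article identifies, since comments and Urgent Alerts bypass any such form entirely.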
The features Nextdoor built to prevent racial profiling only rolled out in the app on Android phones earlier this month, and they still aren’t in use on iOS. (The company said it plans to roll out the algorithm on iOS on May 24.) And people can still post whatever they want in comments and Urgent Alerts, which are short, time-sensitive messages distributed immediately by text message or email to the whole neighborhood.
In an email statement to BuzzFeed News, a Nextdoor spokesperson said all the instances of racial profiling cited here either were “not alerted” to Nextdoor or were flagged by a neighbor and “handled appropriately by our support team,” which may or may not involve removing a post. The spokesperson said the company does not comment on individual members or posts.
Nextdoor said the “vast majority of the instances of racial profiling have been eliminated due to the actions we have taken.” A company spokesperson declined to explain the rationale for or methodology behind that conclusion. Nextdoor also declined to share what percentage of content posted to its site comes through mobile apps.
Nextdoor acknowledged back in November that racial profiling was still possible in Crime & Safety posts on mobile apps, and in urgent alerts and replies, but more than six months later, the company is still searching for a way to prevent it. “With the support of our partners, community groups, and experts in the field, we will continue to address this issue and specific instances as they come up,” Nextdoor’s statement pledges.
Individual users aren’t the only ones who have noticed that racial profiling persists on Nextdoor. Shikira Porter is a representative for Neighbors for Racial Justice, one of two community groups that have been working with the company to address racism on its platform. (Nextdoor also brought on Debo Adegbile, formerly of the NAACP Legal Defense Fund, and Grande Lum of the Divided Community Project as national advisers.)
Porter said she presented Nextdoor with a 15-page document of ongoing problems and possible solutions during a conference call in November, but the company took until January to make any changes. “They tweaked one thing out of all the things we listed were problematic,” she said. The result is that Nextdoor’s system still has “all of these major holes.”
Porter said her Oakland neighborhood Nextdoor group still sees at least one instance of racial profiling a month.
But even when Nextdoor fully deploys its racism-fighting algorithm in its app, there will always be some racial profiling the system won’t catch. To make a judgment call in those instances, Nextdoor relies on local neighborhood leads — frequent users who are nominated by their neighbors to be group moderators. Porter said Neighbors for Racial Justice stressed to Nextdoor the importance of “mandatory and comprehensive Leads training” on the definition and risks of racial profiling.
In August, during a meeting at Nextdoor’s headquarters, a spokesperson said the company was thinking about ways to scale up the racial profiling and unconscious bias trainings it had done internally to all of its national neighborhood leads. Asked for details on the program, Nextdoor’s spokesperson said some neighborhood leads in Oakland had received bias training, a program the company is working to put online “to provide to our Leads across the country.”
One of the problems with putting volunteers in charge of policing racial profiling on Nextdoor, says Jackson resident Tom Head, is that not all neighbors agree with Nextdoor’s stance on racial profiling. For example, when one of Head’s neighbors recently posted an urgent alert saying a black man was sitting in a parked car in a driveway, the neighborhood lead responded to the post by clicking "Thank," which is the Nextdoor equivalent of Facebook’s "like."
"I haven't actually seen a lead post a message where they said, 'I will not enforce the racial profiling guidelines,' but I have certainly seen leads participate in threads where [the guidelines] were being ridiculed,” said Head. "In majority-white communities in Mississippi, the idea of opposing racial profiling as a matter of policy is not necessarily a popular one."
Head added that even if a lead is personally against racial profiling, “enforcing these policies against their neighbors, coworkers, employers, and clients” can have unwanted social and financial consequences. "If you look at the people who signed up early and did the most invites and ended up as leads, it incentivizes, for example, realtors," he said. "If you're a realtor, and you're selling houses in a neighborhood, you have to maintain relationships with some of the people who might be posting these objectionable posts. It can become financially risky to offend these people by taking socially unpopular positions.”
A solution to this problem, Head suggested, would be to have Nextdoor employees intervene when a lead declines to take action when another neighbor reports a post for racial profiling. The good thing about leads being local community members, Head said, is that “they’re usually willing to do it for free.” But, he continued, it also “creates a problem in that, if you have a national policy that a local neighborhood doesn’t like, it’s very hard for the lead to enforce it.”
In August, Nextdoor said all posts flagged for racial profiling would be directly reported to a team of two dozen trained customer support staffers. But Head said that when he reported the urgent alert about the man sitting in a car to Nextdoor’s support team, "nothing came of it."
“It’s the people who are using the platform that I’m most frustrated about,” said Rebekah Goode-Peoples, a former Nextdoor user in Atlanta. “My greatest disappointment was realizing how bigoted many of my neighbors were.” Goode-Peoples deleted her Nextdoor account in September of last year, after reporting an incident in which a black female delivery worker was blamed for her own mugging because she willingly delivered food to a black neighborhood and didn’t bring a gun. Goode-Peoples never heard back from Nextdoor about her complaint.
"We remain completely committed to eliminating racial profiling on Nextdoor," a company spokesperson told BuzzFeed News.
Laurie Bertram Roberts — the mom in Jackson — said the Nextdoor post about two black men going door to door looking for work was eventually taken down, after a few people shared a screenshot on Facebook. (However, sharing Nextdoor content on other social media is against Nextdoor’s rules, because the posts are linked to people’s real names and locations.)
“I only became active on Nextdoor again after the new rules because I thought it would be better,” she said over Twitter DM. “I stay on because I really do fear the day someone sends an alert and it's my kids they're describing as suspicious.”