It is now well understood that online forums and social networks that require users to identify themselves by their real names enjoy a higher level of decorum than those that allow anonymity.
Think things get nasty on Facebook? Spend an hour or two lurking on Reddit — or, worse yet, take a spin through comments on just about any YouTube video — and even your crazy uncle’s looniest status updates start to feel wholesome by comparison.
If a real-name connection to a real-life community fosters cordiality, then it’s no wonder that the hyper-local, San Francisco-based platform Nextdoor doesn’t have the kind of trolling issues that made 4chan the go-to website for pedophiles and QAnon devotees.
That’s not to say that the same debates found on other platforms aren’t also raging on the neighborhood networking website. Yes, users feud over everything from the ethics of Black Lives Matter protests to street cleaning. But nihilistic memes and dark web conspiracy theories aren’t the real problem on Nextdoor.
In June, everyone from Congresswoman Alexandria Ocasio-Cortez to a dedicated satirical Twitter account sounded the alarm that Nextdoor was being used by affluent whites to marginalize people of color and stigmatize the poor. For years, people have criticized the platform for enabling racial profiling, calling out posts like a 2017 warning from a Mississippi user, which alerted the neighborhood to two Black men walking around offering to cut the grass. “May be harmless,” the post reads. “Just be wary of letting them inside!”
In its latest response to concerns like these, Nextdoor says it is tweaking its content moderation processes, recruiting more Black users to positions of power, and cutting features that made it easier for users to forward Nextdoor posts to the police. However, other elements of the site make sociologists think that tribalism might simply be built in.
“People have always dealt with that one neighbor who’s always upset about stuff — that isn’t new,” says Coye Cheshire, a professor at the UC Berkeley School of Information. These conversations, he observes, used to take place in driveways and at city council meetings. By bringing them online, he argues, Nextdoor’s creators set themselves up to deal with the same problems that naturally plague communities, including racist and classist gadflies.
Research also tells us that the most active users on platforms like Nextdoor tend to be the most biased to begin with. Gemma Galdon Clavell, a Spanish policy analyst who has developed an audit for reducing bias and improving algorithmic predictions, has seen this trend when conducting research on safety-focused WhatsApp and Facebook groups. “There’s a tendency of the [active] group to see themselves as the only legitimate users of the space, and everyone who’s not part of the group immediately becomes suspect,” she says.
This dynamic fuels a kind of behavior familiar to anyone who has ever dealt with a nosy, entitled neighbor, and it tends to walk hand in hand with a specific kind of coded language, used to dismiss those who don’t live up to the in-group’s standards.
“Someone on there found out where I worked,” says Sierra Rivera, a rising sophomore at SFSU and lifelong Mission District resident, of another Nextdoor user who she claims doxed her on account of her personal opinions. Rivera doesn’t shy away from sharing her political views online, and has made both friends and enemies on the platform.
In this instance, a user posted the name of the chain ice cream shop where she works (albeit the wrong location) and called on others to visit the store and complain to her manager about what she had said online. Rivera’s account was suspended shortly thereafter, and she believes it was because she alone was blamed for the rising tensions. In the thread, users questioned her “education” and speculated about whether she pays “property taxes.” They even brought race into the conversation, she says.
“They would refer to me as a Latina,” Rivera says. “How does that relate to this conversation?”
Police also have a notably visible presence on Nextdoor: posts from public agencies are highlighted and often shown at the top of the news feed. Additionally, until two weeks ago, police departments could enable a “forward to police” button, which let post authors send photos, videos, and other evidence to their local department with a click. Though the feature was never active in San Francisco, SFPD Public Information Officer Michael Andraychak says the department does send posts it is “tagged” in or messaged about to investigators for follow-up. Otherwise, he says, SFPD doesn’t use the platform for much more than sharing its own posts.
Edwin Lin, a social media and virtual communities expert in the Sociology department at UC Berkeley, notes that a Twitter user could just as easily send a direct message to SFPD as one on Nextdoor. Additionally, whereas police departments cannot freely scroll through content on Nextdoor, they can do so on most other platforms. The prioritization of police posts on Nextdoor, and the recently deleted “forward to police” feature, however, make him think the app envisioned a different kind of police involvement at its inception.
“They were probably geared toward issues of safety, and therefore saw the police as a partner to what their app was trying to do. Making those connections [with the police] would change who would be more likely to get involved in the app, even in the beginning,” he says. He adds that a growing cultural understanding that Nextdoor is the “Karen” platform, one that built police-facing features from the start, will continue to attract people who are fond of the police while repelling others.
More important than police activity on the app, however, is Nextdoor’s algorithm. Like all social media platforms, Nextdoor is designed to increase the time users spend on the website by notifying users of content it thinks they want to see. However, when people use a platform for everything from reporting burglaries to sharing their latest batch of homemade pickles, it’s easy to guess which posts will gain the most attention.
The most controversial crime-related posts get the most engagement. In turn, these posts are featured most prominently in users’ notifications, because the algorithm has learned that they attract lots of likes, comments, and clicks.
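Nextdoor has not published how its feed ranking works, but the feedback loop described above can be sketched as a simple engagement-weighted score. Everything below, including the field names and weights, is a hypothetical illustration, not Nextdoor’s actual design:

```python
# Hypothetical sketch of an engagement-driven feed ranker.
# Nextdoor's real algorithm is not public; the weights and field
# names here are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    comments: int
    clicks: int

def engagement_score(post: Post) -> float:
    # Assume comments signal the strongest engagement, so weight them highest.
    return post.likes * 1.0 + post.comments * 3.0 + post.clicks * 0.5

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest-scoring posts surface first, so a heated crime thread
    # will outrank a quiet post about homemade pickles.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Homemade pickles, come get some!", likes=4, comments=1, clicks=10),
    Post("Suspicious person on 24th St??", likes=30, comments=45, clicks=200),
])
print(feed[0].title)  # prints "Suspicious person on 24th St??"
```

Under any scoring rule shaped like this one, whichever category of post reliably provokes reactions, in this case alarmed crime posts, dominates the top of the feed and the notifications built from it.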
“The problem is the same as other social media: that you can accidentally end up focused on these bubbles of information,” says Cheshire. “Even if you know that, it’s hard to not fall victim to ‘oh, everyone in my community is concerned about this issue.’”
Finally, Nextdoor moderates content through volunteer community leads, an arrangement that can, in the worst cases, exacerbate power dynamics in a neighborhood. These leads vote to take down content they believe violates community guidelines, verify new users, and promote other users to the status of lead.
Nextdoor reserves the right to appoint leads as it sees fit, and, according to its website, generally chooses people who hold “real-world leadership positions within the neighborhood,” like homeowner association leaders and Neighborhood Watch captains. These organizations are predominantly white and upper-class, and in the worst cases their members may perpetuate racial bias on the platform.
Regardless, leads are overwhelmed with moderation requests in times of tension, when fights lead users to report one another en masse.
“There’s still comments sitting with 30-40 flags,” says Joyce Book, one of Nextdoor’s founding volunteer leads in Potrero Hill. Book says she’s been overwhelmed ever since the start of COVID-19. When leads can’t get to moderating posts, she suspects posts with an excess of reports might get taken down by the Nextdoor system. “It comes down to an algorithm I have nothing to do with,” she says.
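Book’s guess, that heavily reported posts are removed automatically once flags pile up, would amount to a simple threshold rule. Nextdoor has not disclosed how its system actually works; the sketch below is an assumption for illustration, with a made-up cutoff:

```python
# Hypothetical flag-threshold auto-removal, illustrating Book's guess.
# Nextdoor's real moderation logic is not public; the threshold and
# behavior here are assumptions, not documented platform values.

FLAG_THRESHOLD = 30  # assumed cutoff, not a real Nextdoor number

def auto_moderate(posts: list[dict]) -> list[dict]:
    """Hide any unreviewed post whose flag count meets the threshold,
    regardless of whether a human lead ever looked at it."""
    for post in posts:
        if post["flags"] >= FLAG_THRESHOLD and not post["reviewed_by_lead"]:
            post["visible"] = False  # removed by the system, not a person
    return posts

queue = auto_moderate([
    {"text": "March info", "flags": 35, "reviewed_by_lead": False, "visible": True},
    {"text": "Lost cat",   "flags": 2,  "reviewed_by_lead": False, "visible": True},
])
```

The weakness of any rule like this is that flag volume substitutes for judgment: a coordinated group reporting a post en masse could take it down before an overwhelmed lead ever reviews it.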
In Nextdoor’s communities in the Mission, multiple posts supportive of the Black Lives Matter protests have been removed in recent weeks. Similar incidents have been reported in San Diego and Atlanta, despite Nextdoor emailing leads in early June to explicitly state that “conversations related to racial inequality and Black Lives Matter are allowed on Nextdoor.”
Nextdoor is making moves to address racial profiling among moderators. Nextdoor CEO Sarah Friar told NPR that censorship of Black Lives Matter posts “was really our fault,” saying that many leads were following outdated advice to remove posts about “national conversations,” and that it was Nextdoor’s responsibility to clarify the issue early on. She says Nextdoor will soon enroll moderators in unconscious-bias training, launch a campaign to recruit Black moderators, and improve automated tools such as AI that detects racist and “coded racist” content. However, overwhelmed moderators, racism directed toward non-Black people of color, and the prioritization of police posts remain unaddressed at a company led predominantly by white executives.
Ultimately, the Nextdoor platform may not survive if only a small segment of the real-life neighborhood feels welcome and respected enough to stay online. To fix the problem, Clavell says the company should start by publicly auditing its platform on an ongoing basis and tracking its psychological impact on users.
“You want to know those things before they become a problem,” she says, referencing Zoom’s admission that it hadn’t thought through security concerns before launching. “You want to avoid that situation where you have to make all these improvements on-the-go. Zoom managed to do that, but other companies collapsed.” Nextdoor appears to be making the same gamble.
“Giving people some kind of agency and control over their local environments — that’s a good thing,” says Cheshire. The stated mission of Nextdoor to “cultivate a kinder world” is admirable, and a platform meant to connect people to neighborhood information, goods, and services can be extremely useful. “The downside, of course, is that some of the most hateful things I’ve ever seen said have happened on Nextdoor,” he says.