Nextdoor launches anti-racist screens for community posts

The new feature alerts users if they've used "offensive or hurtful" language before they can post.
By Chase DiBenedetto
Users are prompted to edit their comments before posting offensive language in the Nextdoor app. Credit: Nextdoor

Community app Nextdoor launched its latest feature in a series of anti-racist initiatives, this time prompting users to think before they post "offensive or hurtful" language in their neighborhood forums. The new "anti-racism notification" specifically alerts users if they are trying to post discriminatory or overtly racist language, including posts with the phrases "All Lives Matter" and "Blue Lives Matter."

Nextdoor, founded in 2008, is a community networking app that connects users with others living in their area to share news and resources and to engage with one another in community forums. Though designed to encourage discussion, the app has drawn ample criticism in past years for failing to protect its users from offensive language, prompting the company to make large-scale commitments to creating safer, kinder forums.

In 2019, Nextdoor unveiled the Kindness Reminder — a feature that prompted commenters to review Community Guidelines and edit their posts before they went live if the posts were found to contain phrases frequently reported by other users in the past.

Earlier this year, the company committed to supporting the Black Lives Matter movement and its Black users through updated Community Guidelines, as well as an anti-racism resources hub. Nextdoor also pledged to circulate resources with community leaders on the app to facilitate more inclusive dialogue.

The new "anti-racism notification" is an expansion of the Kindness Reminder technology and the app's anti-racism initiatives. The company explained in an email to Mashable that the tool was created with oversight from "activists, academics, and experts to help understand how to combat incivility in neighborhood conversations."


If a user's comment is flagged for offending language, a pop-up notification pauses the user's post and prompts them to reconsider — the user can choose to edit their response immediately or continue posting if they feel the comment doesn't violate community guidelines. Alongside the notification's release, the app also published a series of blog posts explaining how to talk to your neighbors about race and expanded its anti-racism hub to include more information about implicit racism through unconscious biases and white privilege, and even how to engage with conversations about the Derek Chauvin trial.
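Nextdoor has not published how its detection works, but the flow the article describes — a draft is checked against flagged language, the post is paused, and the user either edits or confirms anyway — can be sketched roughly. Everything below is an illustrative assumption, not Nextdoor's actual code; the phrase list and function names are hypothetical:

```python
# Illustrative sketch only: Nextdoor's real system is not public.
# It models the described flow: flag a draft, pause it, let the
# user edit or choose to post anyway.

# Hypothetical blocklist; the two entries come from the article.
FLAGGED_PHRASES = {
    "all lives matter",
    "blue lives matter",
}

def needs_kindness_prompt(draft: str) -> bool:
    """Return True if the draft contains a flagged phrase (case-insensitive)."""
    text = draft.lower()
    return any(phrase in text for phrase in FLAGGED_PHRASES)

def submit_post(draft: str, user_confirms_anyway) -> bool:
    """Pause flagged drafts behind a prompt; the user may still post.

    user_confirms_anyway is a callback standing in for the pop-up:
    it returns True if the user chooses to post unchanged.
    """
    if needs_kindness_prompt(draft):
        return user_confirms_anyway(draft)
    return True  # unflagged drafts post immediately
```

In practice a production system would use a learned classifier rather than literal substring matching, but the user-facing behavior — pause, prompt, optional override — is the same shape.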

According to Nextdoor, the Kindness Reminder led to a "30% reduction in uncivil content" posted on the app after its launch in 2019. The company hopes that a similar anti-racist tool will result in a steady decline of racist and discriminatory posts.

But, fundamentally, the new feature is a test of Nextdoor's commitment to changing its long, problematic history of enabling spaces for "casual racial profiling" and other forms of covert racism, including collaborations with local police departments to facilitate crime reporting despite user concerns. In 2020, the CEO of Nextdoor said that racism was no longer tolerated on the app, but users and critics alike contended the app wasn't living up to its big commitments. Since then, the company has continued expanding on its pledge to weed out racism in its forums.

The anti-racism notification attempts to address the issue of underreported (and then frequently unaddressed) racism by preventing such comments in the first place. And it could be a step in the right direction. Or it could be a continuation of the trend users saw last year — commitments that didn't fundamentally change the way the app is used or address the ways racism is perpetuated on a systemic level.

With AI providing only a prompt to discourage posting, there's concern that posts will be missed, that offending users will simply choose to ignore the prompts and post anyway, or that they'll just take their racism elsewhere. And, as in previous years, community leaders will be a determining factor in how these features effect change. Known as "Neighborhood Leads and Community Reviewers" in the app, these users facilitate conversations and respond to other users far more quickly than any AI. As Nextdoor explains, these users ensure neighborhood guidelines are being followed. Leads welcome new members, moderate conversations, vote to remove comments, and promote other users to leadership roles. Only Nextdoor can remove offending members, however.

Ahead of the notification release, Nextdoor launched new trainings for Leads in collaboration with consulting group The New Quo. The training focuses on "inclusive moderation" strategies and explicitly includes implicit bias training. Initiatives like this get more at the heart of the issue: community accountability led by users and supported, at every step, by the company.

Whether these prompts will truly discourage offending posts remains to be seen.



