What gets us into trouble
is not what we don’t know.
It’s what we know for sure
that just ain’t so. (Attributed (probably wrongly) to Mark Twain)
A funny thing happened to me on the way to this blog: I learned that what I thought I knew about moderation of online content and Section 230 of the Communications Decency Act just ain’t so.
I set out to write a note about Nextdoor, the neighborhood news platform that has become infamous for its “Karen” problem. For the uninitiated, Nextdoor is a platform on which neighbors can connect for everything from yard sales and plumber recommendations to reports of suspicious goings-on. (Full disclosure: we are on Nextdoor here at our lake house. Favorite topics range from “what’s that funny smell?” to “if you are missing a boat, it’s in our cove.”)
It’s the neighborhood watch aspects that have gotten Nextdoor in trouble. Neighbors can and do post pictures of people they find suspicious in their enclaves, including recordings from their smart front door cameras. Nextdoor has a cozy relationship with local police departments. And the neighborhood sites are moderated by volunteer “Leads” who are not Nextdoor employees, representatives, or agents. Nextdoor makes sure to disclose that it does not “interview, run background checks on, monitor, supervise, or control Nextdoor members, including those who are granted Lead status.”
The Verge published an article last June reporting on what it referred to as Nextdoor’s calcified reputation as a “snitch app.” In response to calls for reform, not least from AOC, Nextdoor recently rolled out what it has called an “anti-racism” notification that uses AI to detect certain phrases, such as “All Lives Matter” or “Blue Lives Matter,” and prompts authors to consider editing their posts or comments before they go live. Nextdoor won’t prevent you from publishing if you ignore the notification, but it says that the notification is meant to alert its users when what they write is likely to violate its anti-discrimination policies. Among other things, Nextdoor explicitly supports the Black Lives Matter movement and prohibits All Lives Matter and Blue Lives Matter content “when used to undermine racial equality or the [BLM] movement.”
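To make the mechanics concrete, the flow Nextdoor describes — detect a phrase, prompt the author, publish anyway if the prompt is ignored — can be sketched in a few lines. Everything below, from the function names to the phrase list, is a hypothetical illustration; Nextdoor has not published its actual implementation, which it says uses AI rather than a simple keyword list:

```python
# Hypothetical sketch of a pre-publication notification check.
# The phrase list and matching logic are illustrative assumptions,
# not Nextdoor's real (AI-based, unpublished) system.
FLAGGED_PHRASES = ["all lives matter", "blue lives matter"]

def needs_notification(draft: str) -> bool:
    """Return True if the draft contains a phrase that should trigger
    a 'consider editing' prompt before the post goes live."""
    text = draft.lower()
    return any(phrase in text for phrase in FLAGGED_PHRASES)

def publish(draft: str) -> str:
    # The check only prompts; it never blocks. The author can ignore
    # the reminder and the post is published regardless.
    if needs_notification(draft):
        print("Reminder: this post may violate anti-discrimination policies.")
    return draft
```

The key design point for the Section 230 discussion that follows is in the last line: the platform nudges, but the author still publishes.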
When I read about Nextdoor’s notifications, I immediately thought, “well, now that Nextdoor has decided to moderate content, it’s given up its protection under Section 230.” I had fallen victim to something called the “illusion of explanatory depth,” the conceit that you understand something in much greater detail, depth, and coherence than you really do. Looking only to confirm what I thought I already knew, I confidently searched for “Section 230” and “kindness reminder” and ran smack-dab into this acerbic but elegant riposte. Appropriately titled Hello! You’ve Been Referred Here Because You’re Wrong About Section 230 Of The Communications Decency Act, it made for humiliating and illuminating reading.
Far from stripping providers and users of “interactive computer services” (basically anyone who provides or enables computer access by multiple users to a computer server, e.g. Nextdoor, Facebook, et al.) of protection when they attempt to moderate content, Section 230 in fact protects them when they choose to moderate. Section 230, I was humbled to learn, was enacted to overrule a 1995 court case, Stratton Oakmont, Inc. v. Prodigy Services Co., that found that when a platform exercises editorial control over content, it opens itself to liability as a publisher. Exactly the trouble I wrongly thought Nextdoor was courting with its notifications. As the author of “Hello” put it:
. . . you can’t “lose” your Section 230 protections, especially not over your moderation choices (again, the law explicitly says that you cannot face liability for moderation choices, so stop trying to make it happen). If content is produced by someone else, the site is protected from lawsuit, thanks to Section 230. If the content is produced by the site, it is not. Moderating the content is not producing content, and so the mere act of moderation, whether neutral or not, does not make you lose 230 protections. That’s just not how it works.
So, I had it exactly backwards and, going by the Hello article, I am not the only one. We thought it was important not just to report on Nextdoor’s efforts to moderate content but to help correct what appears to be a widely held misunderstanding of Section 230.