Six Principles for Misunderstanding Free Speech and Section 230


Earlier this month, Heartland Institute President James Taylor published a paper titled, “Six Principles for State Legislators Seeking to Protect Free Speech on Social Media Platforms.” Each principle omits important legal facts, betrays a confusion about the current market, or reveals a misunderstanding of the history behind the law at issue: Section 230 of the Communications Decency Act, which protects interactive computer services (including social media platforms, comments sections, and websites such as Yelp) from being held civilly liable for content posted by third-party users. There are some narrow exceptions, but generally speaking the law shields Facebook, for example, from being held liable for its users’ posts.

As Parler returns to the internet via friendly web hosts and domain registrars shielded by Section 230, we continue to believe that critics of the legislation have things exactly backwards: Section 230 has been, and continues to be, a tremendous boon to free speech. By putting property rights first, it has allowed alternative providers and platforms to carry unpopular speech without fear, ensuring that the internet remains “a forum for a true diversity of political discourse.” Efforts to alter or reform Section 230 risk eroding its universal protections, advantaging large, restrictive incumbents who can more easily bear regulatory burdens and litigation costs. It behooves us to continue allowing private solutions to develop before leaping to legislative remedies that, whatever their intention, may make the environment less amenable to free expression.

Principle #1: “Big-tech companies operate and thrive in a government-corrupted market, exploit the corrupted market to their advantage, and often oppose free-market reforms. Therefore, this cartel is in no position to object to free-speech protections in the name of ‘free markets.’”

It is true that “Big Tech” companies do not operate in a free market and that they support policies that further entrench themselves. In this regard, social media companies are much like car makers, defense contractors, health insurance companies, airlines, banks, and farmers, except that it is much easier to build a new social media company than it is to build a new bank. It is telling that both Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey expressed support for amending Section 230 at a Senate hearing last year. This is no surprise to libertarians, who have warned of the dangers of regulatory capture for years. But Section 230 has not made the market for internet services unfree. Quite the opposite. One of Section 230’s greatest benefits is that it places new market entrants and incumbents on a level playing field. We all want a market where someone (or a group of people) with a better idea than Facebook can compete with Facebook without having to pay the enormous costs associated with preparing for legal battles over user content. The Facebooks, Googles, and Amazons of the world will be able to comply with whatever Section 230 changes Congress passes. The same cannot be said of the much smaller competitors hoping to knock these companies off their perch.

A stubborn feature of ongoing debates about social media companies is the mischaracterization of the market they operate in. Facebook, Twitter, Google, and YouTube are sometimes incorrectly described as “monopolies” when they are in fact competitors. Facebook does not sell social media, and Google does not sell search. Rather, Facebook and Google compete with each other for (among other things) digital advertising dollars. They also compete with a host of other advertising media, from newspapers and billboards to radio, cable, and broadcast television.

Principle #2: “Shutting down an entire platform or blocking a particular user because of concerns about vague and amorphous ‘community standards’ or anything other than sexually obscene, excessively violent, or indisputably criminal content is not in line with an originalist understanding of federal law. States must act to protect their residents’ speech rights when the federal government fails to do so.”

This principle is factually inaccurate. Section 230’s authors, Sen. Ron Wyden (D-OR) and former Rep. Chris Cox (R-CA), have not been shy about sharing why they wrote the law. In comments to the FCC last year, Wyden and Cox wrote: “Section 230 itself states the congressional purpose of ensuring that the internet remains ‘a global forum for a true diversity of political discourse.’ In our view as the law’s authors, this requires that government allow a thousand flowers to bloom—not that a single website has to represent every conceivable point of view.”

Wyden and Cox intended for interactive computer services to develop and implement their own content moderation rules based on their values. Section 230 protects a diverse internet, not just “Big Tech.” Attempts to mandate that all platforms serve all users, or adopt similar rules, would limit the tremendous variety of niche services and websites that internet users currently enjoy.

In Principle #2 Taylor repeats the false “publisher” v. “platform” distinction, which does not exist in Section 230. Indeed, traditional publishers, such as The Wall Street Journal, enjoy Section 230 protections because their comments sections are an “interactive computer service” under the law.

Principle #3: Free-speech rights should outweigh corrupt market protections

As mentioned above, we believe that Section 230 enables a competitive social media market. Social media is much larger than Silicon Valley’s household-name giants. To portray Section 230 as a protection for market incumbents is to miss the law’s valuable pro-competition implications.

Furthermore, provider competition aside, the greatest beneficiaries of Section 230 are internet users. Without Section 230, speech tools, from heavily moderated products like Facebook to relatively hands-off platforms like WordPress, would have to be designed to preclude misuse, limiting their potential. Users banned from one or several platforms may still make use of a plethora of publishing tools unavailable to speakers in the past. In this sense, Section 230 functions similarly to the Protection of Lawful Commerce in Arms Act, which prevents firearms manufacturers from being held liable for misuse of their products. As one of us wrote in a recent Cato paper, “While clearly intended to safeguard the rights of individual citizens, the PLCAA, like Section 230, offers protection only to firms. Nevertheless, in both cases the legislations’ benefits accrue to individuals because the effective exercise of their rights hinges on the availability of certain products.”

Principle #4: Shutting down an entire platform because of concerns about criminal activity conducted by a small percentage of a platform’s users is an overly intrusive, harmful, and unnecessary action.

This principle overlooks the fact that Section 230 contains an exception for violations of federal criminal law. If a user posts content on a social media site that violates federal criminal law, the social media site can be held liable.

In the specific instance of Amazon severing its relationship with Parler and Apple and Google removing Parler from their app stores, there were concerns not only about the service being used to orchestrate serious crimes, but also worries associated with racist rhetoric. In a free market, private businesses should remain free to disassociate with content they find objectionable.

However one feels about Amazon’s actions, it is important to recognize that Section 230 is vital to Parler, both for its own operations, and to protect its web hosts. The statute protects free association and dissociation equally. Parler has moved its domain to Epik, a firm that has increasingly served as a host of last resort for the right. Without Section 230, Epik would face constant, costly lawsuits over its decision to host Parler, Gab, and a host of other deplatformed sites. Indeed, while centralized infrastructural choke points are a cause for concern no matter who controls them, changes to Section 230 would serve only to strangle alternatives in the crib.

Principle #5: Section 230 does not protect internet social media platforms from blocking anything other than activity that falls under the narrow categories of sexually obscene, harassing, and/or excessively violent material.

This is false and at odds with the original intent of the law, the justifications for removal explicitly listed in Section 230(c)(2)(A), and the holdings of numerous courts that have considered Section 230 suits.

Principle #6: Banning a particular user for anything other than repeatedly posting sexually obscene, harassing, or excessively violent material exceeds Section 230’s protections and should be subject to legislative action and civil causes of action.

This principle, similar to principle #5, is mistaken. Section 230 was not written on the understanding that only repeat posters of a narrow category of content would have their accounts removed. Claiming otherwise is contrary to any plain reading of the statute, the stated intentions of the law, and the findings of courts across the country that have considered must-carry suits.

Conclusion

It is the First Amendment, not Section 230, that allows social media companies to disassociate with users and moderate content. Although often discussed in debates about the state of social media, Section 230 is a liability shield, not a law that allows websites to remove content. Changes to Section 230 will do nothing to infringe on websites’ First Amendment right to remove content. They will, however, likely make Silicon Valley companies more dominant.

At the heart of freedom of speech is the freedom of association. The right of a white supremacist to write and submit an op-ed to The Washington Post is as important as the right of The Washington Post to decline to publish that op-ed. We see no contradiction in defending the right of individuals to write what they want while also defending the right of private companies to disassociate with content they consider objectionable. The freedom of speech does not entail an entitlement to a platform for that speech.

Fortunately, the Internet is much larger than Silicon Valley’s “Big Tech” giants. Thanks in large part to Section 230, there are many, many venues for Americans to write and share their beliefs. Mastodon, the InterPlanetary File System, LBRY, Bitchute, and Diaspora are only some examples of alternatives to Facebook, Twitter, and YouTube that offer users a variety of different content moderation guidelines. We fear that Section 230 changes will make it harder for these services to effectively compete with market incumbents.


