Section 230(c)(1) of the Communications Decency Act (codified at 47 U.S.C. § 230) (“Section 230”) has long been credited with enabling the boom of user-generated content on the internet, the crux of the social media that has driven the online environment for decades. Section 230 grants immunity to companies that provide platforms for user content, essentially providing that those companies cannot be held liable for the content their users publish. But the bounds of that immunity have been repeatedly challenged on multiple fronts, and lines are starting to be drawn. The protection has been hotly debated in Congress in recent years, and the Ninth Circuit recently joined the fray by rejecting Snap, Inc.’s claim that it is immune from claims that one of the filters offered on its Snapchat platform caused the deaths of three of its users.
Surge in Scrutiny by Law and Policymakers
The U.S. Congress and the Department of Justice (“DOJ”) have both embarked on reviews of Section 230 in recent years, largely in response to information and potential disinformation posted on (or removed from) social media sites concerning recent elections and the coronavirus pandemic. In June 2020, the DOJ published a report on Section 230 recommending the replacement of “vague terminology” and “catch-all” language in order to encourage transparency and accountability among online platforms, rather than enabling them to “hide behind blanket Section 230 protections.” Politicians on both sides of the aisle have also criticized Section 230, and seven bills to diminish or repeal it have been introduced during the 117th Congress alone.1
That push has continued. On April 27, 2021, the Senate Judiciary Subcommittee on Privacy, Technology and the Law held a hearing with executives from major online platforms and discussed Section 230. Senator Chuck Grassley (R-IA) — echoing statements made by German Chancellor Angela Merkel and other world leaders — pointed to the powers bestowed on online platforms by Section 230, stating that “[t]his immunity combined with monopoly allows them to censor, block, and ban whatever they want. We must look at the power and control that a handful of companies have over speech, and their silencing voices with which they disagree.” Company executives were asked about their policies and efforts to monitor content posted on their sites, and at one point, Senator John N. Kennedy (R-LA) stated that he and other legislators are working on bills that would remove Section 230 immunity specifically from social platforms that use a practice called “optimizing for engagement” — algorithms that present “hot button” content to users based on their prior engagement with the site. While the central focus of the hearing was not Section 230, the subcommittee made it clear that there is a push to change the law.
U.S. Representatives Frank Pallone Jr. (D-NJ) and Cathy McMorris Rodgers (R-WA) also signaled potential Section 230 modifications in a recent letter to Katherine Tai, the U.S. Trade Representative responsible for crafting the trade deals the United States enters into with other nations. The May 3, 2021 letter requested that Tai refrain from incorporating language into future trade deals that requires the party nations to implement Section 230-type immunity “while the Congress is seriously considering modifications to this statute.” Such trade deal obligations could be a roadblock to Congress making substantive changes to the law, and would expand the scope of the immunity to other jurisdictions. The letter references the 2018 U.S.-Mexico-Canada Agreement, which incorporated language requiring the trading partners to adopt online platform immunity provisions. Reps. Pallone and Rodgers’s letter states that “the effects of Section 230 and the appropriate role of such a liability shield have become the subject of much debate in recent years. . . . [W]e find it inappropriate for the United States to export language mirroring Section 230 while such serious policy discussions are ongoing.”
Judicially Created Limits
The courts have also entered the fray. Most recently, on May 4, 2021, the Ninth Circuit reversed and remanded a district court’s decision in Lemmon v. Snap, Inc., holding that Snap, Inc. (“Snap”) was not necessarily shielded from liability by Section 230 in a case in which three young men were killed while driving at high speeds and using a “Speed Filter” on the widely popular social media app, Snapchat.
The parents sued Snap, alleging that its “Speed Filter” was negligently designed and caused their sons’ deaths by encouraging them to drive at dangerous speeds. The lower court had found Snap immune from liability under Section 230, which states that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The Ninth Circuit found that because the parents’ claims focused on Snap’s negligent design of the feature, rather than on its role as a “publisher or speaker” of the content its users were creating, those claims were not subject to Section 230 immunity. The Lemmon court restated the test for whether an online platform enjoys such immunity, holding that immunity applies only if the platform is “(1) a provider or user of an interactive computer service (2) whom a plaintiff seeks to treat, under a state law cause of action, as a publisher or speaker (3) of information provided by another information content provider.” Here, however, it was not the act of publishing user-generated content that was at issue, but Snap’s product design, which was foreseeably dangerous and invited improper content. The court noted that Snapchat has an award system based on the “snaps” that users send, and that Snap allegedly knew or should have known that the Speed Filter incentivized young drivers to drive at dangerous speeds.
While the “negligent product design” theory is in many ways unique to Lemmon, this is not the first time a court has denied Section 230 immunity where a company went beyond the role of “publisher” and instead invited foreseeably improper content. In a well-known lawsuit2 between Quiznos and Subway, Quiznos created a “Quiznos v. Subway TV Ad Challenge” that solicited consumers to post online videos about “why [they] think Quiznos is better.” Subway sued Quiznos, arguing in part that the promotion invited content that was false or misleading about Subway’s sandwiches. Quiznos invoked Section 230 immunity in a motion for summary judgment, but the Connecticut district court held that the protection would not apply if Quiznos had acted as more than merely a publisher of user-generated content and was instead “actively responsible for the creation and development of disparaging representations about Subway.” The court determined that this was a question for the jury (though the question was never answered, because the case settled out of court soon after).
Section 230 does not pertain only to the mega social media companies. These days, nearly every company is involved in social media and/or user-generated content. Changes to Section 230, whether driven by legislative revisions or evolving judicial standards, will impact not only online platforms but every company that relies on those channels to communicate and interact with its customers.
In light of the continued scrutiny of Section 230, businesses that host third-party content should consider:
- What obligations must you meet to monitor content created by your customers?
- What options and rights do you have to remove harmful or offensive content, or prevent your ads from appearing on social media platforms alongside such material?
- Are you employing a product feature or promotion that crosses the line by inviting improper or foreseeably dangerous content?
As lawmakers continue to push to change Section 230, and as courts potentially endorse plaintiff or government enforcement theories that sidestep the immunity Section 230 provides, businesses should be diligent in monitoring, and getting out in front of, any changes to the law.
1See, e.g., H.R. 83, 117th Cong. (2021); H.R. 285, 117th Cong. (2021); S. 27, 117th Cong. (2021).
2Doctor’s Assocs., Inc. v. QIP Holder LLC, 2010 WL 669870 (D. Conn. Feb. 19, 2010).