Insights Into Texas’ Content Moderation Law – Social Media

On May 31, the US Supreme Court reinstated an injunction first entered in
December 2021 against HB 20, the Texas law prohibiting social media
companies from moderating certain content based on viewpoint. The
decision came just over a week after the Eleventh Circuit
upheld an injunction barring enforcement of Florida’s law
banning social media platforms from deplatforming political candidates.

The Supreme Court’s 5-4 decision stemmed from NetChoice LLC
and the Computer and Communications Industry Association’s
appeal of the Fifth Circuit decision staying the original
injunction issued by the District Court. They argued that,
rather than promoting free speech, Texas’ law compels
social media platforms to host speech they disagree with.
Justice Alito, in his dissent, discussed social media’s
transformative impact on communication and stated that while he
does not have “a definitive view on the novel legal questions
that arise from Texas’ decision to address the ‘changing
social and economic’ conditions it perceives,” he believes
the District Court’s original injunction was a
“significant intrusion on state sovereignty.”

So, for now, Texas cannot enforce this statute. Will it if the
injunction is eventually lifted? The Texas Attorney General is empowered to
bring an action to enjoin violations and to collect attorneys’
fees and investigative costs. We know that office is particularly
focused on Big Tech issues, having sued multiple targets on issues
ranging from antitrust, to privacy, to deceptive advertising. The
reinstatement of the injunction presents an opportunity to pause
and take a closer look at HB 20 and what future enforcement could
look like in Texas and other states.

General Provisions for Social Media Platforms

The new statutes include more than just the controversial
censorship chapter. A social media platform is defined as “an
Internet website or application that is open to the public, allows
a user to create an account, and enables users to communicate with
other users for the primary purpose of posting information,
comments, messages or images.” The exclusions from the
definition include ISPs, email services, and direct news sources
with comment features. Further, the statutes apply only to
platforms with more than 50 million monthly active users in
the United States, indicating a focus on “Big
Tech” only.

These platforms are required to disclose, on a publicly available
website, accurate information about how they curate, target, place,
moderate, and promote content, including their own; how they use
algorithms to determine results on the platform; and how they
provide users with performance data.

Further, the law requires platforms to publish an easily
accessible acceptable use policy that informs users about the
types of content allowed, explains how the platform ensures
compliance with the policy, and describes the means of notifying
the platform of violations. A platform must make a good faith
effort to evaluate complaints of illegal activity within 48
hours of receipt, subject to reasonable exceptions.

Platforms also must report every six months on the total number of
instances in which they were alerted to policy violations by users,
employees, or automated tools, and how many times they took
action, including content removal, demonetization, deprioritization,
or the addition of an assessment; and account suspension, removal,
or other action consistent with their acceptable use policy.

When a platform takes an action under its policy, it must
notify the user who provided the content and explain the reason,
allow an appeal, and provide written notice of the outcome of the
appeal. The exceptions are when the platform is unable to contact the
user after taking reasonable steps, or when it knows the content is
part of an ongoing law enforcement investigation.

Discourse on Social Media Platforms

Censorship under this statute includes any action to alter,
remove, or deny access to, or otherwise discriminate against,
expression. Expression includes any perceivable communication.
Social media platforms are prohibited from censoring a user, a
user’s expression, or a user’s ability to receive an expression
based on viewpoint or geographic location, regardless of the medium
in which the viewpoint is expressed. This applies to users who reside
in Texas, do business in Texas, or share or receive expressions in
this state.

Platforms are allowed to censor expression that is specifically
authorized for removal by federal law, that is flagged at the
request of an organization working to prevent exploitation, that
directly incites criminal activity or contains specific threats of
violence, or that is otherwise unlawful. Additionally, users
can censor expression on their own pages. Users may also seek
injunctive relief, and the statute specifically notes that users may
bring an action regardless of whether a court has enjoined the AG or
has declared the chapter unconstitutional. A court can hold the
platform in contempt if it fails to comply with a court order and
can use “all lawful measures” to secure compliance,
including “daily penalties sufficient to secure immediate
compliance.” Platforms cannot enforce a contractual waiver of the
protections of this statute.

Bottom Line?

Carefully examine what you tell users about your moderation
practices, and how those practices are applied. While the
constitutionality of HB 20 and similar laws is a decision for
another day, the increased scrutiny of content moderation practices
at platforms both large and small is already forcing
platforms to write clearer acceptable use policies and publicly
clarify their practices. Such representations may ultimately become
the focus of state UDAP investigations, something we’ve
already seen in Texas and Indiana. As with any other public-facing
policy, establishing clear and neutral rules of the road and
following those standards will help minimize AG attention to your
practices. And even if HB 20 doesn’t impact your business,
following this and similar cases will help to shed light on
how far legislatures can go in regulating the practices of
platforms going forward.

The content of this article is intended to provide a general
guide to the subject matter. Specialist advice should be sought
about your specific circumstances.
