The Australian High Court ruled in September 2021 that media companies like newspapers and TV stations were liable for defamatory third-party comments on their public Facebook pages. The impact of the decision has been felt far and wide. Australian politicians turned off the commenting function on their own Facebook pages to avoid liability, while the Australian government began drafting legislation to make social media companies themselves legally responsible for harmful user posts.
The ruling stemmed from a defamation suit brought in 2017 by a former youth detention center inmate against major Australian media outlets over user comments on the outlets' Facebook pages. The problem began when one TV station broadcast images of the man in a restraint chair with a hood covering his face while he was a detainee in 2016. Several other news outlets then published the images and put them on their Facebook pages. The scenes sparked a furious debate in Australia about conditions at its youth detention centers. But they also unleashed a flood of online abuse against the man.
For example, the comments on a news piece about the man posted by one media firm almost uniformly blamed him for his treatment, and some were outright lies, including that he had raped an elderly woman. The man filed a damages suit against three Australian media giants for defamation for not deleting these comments from their Facebook pages.
In 2019, the New South Wales Supreme Court ruled that liability for the defamatory comments lay with the media companies that posted the news stories to begin with. The Court of Appeal upheld the decision, prompting the media firms to turn to the High Court, Australia’s most senior bench.
The companies stressed throughout that, to take responsibility for the comments as news media, they would have had to know the content and tone of those comments in advance, and that they therefore could not be liable for comments posted to their Facebook pages by third parties. Facebook now allows page owners to turn commenting on and off, but that function did not exist in 2016.
The September 2021 ruling, however, stated that “the Court of Appeal was correct to hold that the acts of the appellants (the media companies) in facilitating, encouraging and thereby assisting the posting of comments by the third-party Facebook users rendered them publishers of those comments.” A later passage declares, “Having taken action to secure the commercial benefit of (Facebook’s) functionality, the appellants bear the legal consequences.”
In response, U.S.-based news channel CNN made its public Facebook page inaccessible from Australia. Australia’s Guardian newspaper disabled commenting on nearly all its posted content, even its cooking articles.
Politicians active on Facebook also shut down comments on their public pages. Peter Gutwein, the premier of the Australian state of Tasmania, was one of them. In a Sept. 24, 2021 Facebook post, Gutwein wrote, “We know social media is a 24/7 medium, however, our moderation capabilities are not. As a result, there will be some changes to how users can interact with this page going forward.”
In late November, Prime Minister Scott Morrison revealed that his government was working on legislation that would hold social media companies responsible for defamatory content posted by anonymous users, if the firms did not reveal those users’ identities. The draft bill also stipulates that the owner of a Facebook page cannot be held responsible for user comments on that page. However, how to enforce the law against U.S.-based Facebook remains a problem.
John Middleton, a professor of comparative media law at Hitotsubashi University in Tokyo, noted that the High Court decision called on media outlets not to simply let problematic comments sit on their social media pages. This, Middleton said, could prove tough for smaller media outlets without the resources to employ enough staff to monitor every user comment, but the court's groundbreaking judgment could help defamation victims.
David Rolph, a libel expert at the University of Sydney Law School, pointed out that while the internet is borderless, laws differ by country, and those laws have failed to keep pace with the legal challenges of the online world. He added that first and foremost, there needs to be a framework to identify the person behind any online comment and to make them legally responsible for anything they post. The internet cannot be allowed to remain a lawless space, Rolph said.
How to deal with malicious social media posts has also become a societal problem in Japan. In response, revisions to Japan's Provider Liability Limitation Act that will make it easier to pursue court action to reveal an anonymous user's identity will enter into force by autumn 2022. The government is also considering imposing prison terms for criminal defamation.
Yahoo Japan, the country's largest internet portal site, announced in October last year that it would hide the user comment sections on Yahoo News articles if problematic posts exceeded a certain threshold — one example of private firms taking the lead on anti-trolling measures.
Kenta Yamada, a Senshu University expert on free speech and the law, told the Mainichi Shimbun, “As forward-looking efforts to help victims, all this is praiseworthy. But if laws are tightened at the same time as private companies move to self-regulate their platforms, this could also very quickly narrow the space for free expression. The Japanese media has a history of self-regulation, and social media firms must fulfill their social responsibilities, so there needs to be a gradual accumulation of proactive efforts.”
(Japanese original by Epo Ishiyama, Asia General Bureau, Kim Suyeong, Foreign News Department, and Ken Aoshima, Tokyo City News Department)