A wave of misinformation that strongly influenced the Wellington protest has prompted renewed calls for the regulation of social media giants.
From claims that Covid-19 is a ‘plandemic’ to the belief that concrete bollards placed around Parliament were emitting electromagnetic radiation, fringe ideologies espoused during the three-week Parliament occupation spread largely online, through platforms like Facebook, Twitter and Telegram.
Stephen Judd of conspiracy debunkers FACT Aotearoa says some involved are promoting “completely different media and information universes”.
One stark example happened during the blaze on Parliament’s lawn on March 2. As the fire burned, Facebook influencer Chantelle Baker told her 96,000 followers – without any evidence – that it had been started by police.
“You realise police pushed over a generator… they set a tent on fire, so the police caused this”, Baker said to a crew she identified as ‘mainstream’ media.
“So I hope you guys get that, I hope you don’t say it was protesters when it was police that caused this fire.”
Minutes later, Baker came across a protester lighting a separate blaze and told them to stop, but continued to blame the police.
“The police set the fire and they can try and get it out,” she said.
The next day, Baker issued a retraction via yet another livestream.
“I’m happy to be wrong – it doesn’t worry me in the slightest because we’re live.”
But by then it was too late – the misinformation had spread far and wide.
“Of course the cops started the fire,” one Facebook user wrote. “Classic tactic then blame it on the people and make them look bad.”
“These were plain clothes cops, not protesters, see the live from Chanelle (sic) Baker,” another said.
‘It’s like a virus’
Dozens of others have repeated the same false claim since. Tech commentator Paul Brislen says such claims can spread incredibly quickly on social media.
“If one person shares something, it goes out to their friends; if they all share it as well, suddenly you’re reaching an audience of tens of thousands if not hundreds of thousands of people… It’s kind of like a pyramid effect… it can go absolutely crazy and that’s where going viral comes from – it’s like a virus.”
He says the algorithms of platforms like Facebook are designed to “show you more of what it is that you like”, even if it’s harmful.
“If you’ve clicked on a video, a cat video for example, a cat video page will invite you to join the page. You’re being fed more information that reinforces your beliefs whether they’re accurate or not. That’s the real danger here.”
In early 2021 a Classification Office survey found 19% of Kiwis held three or more beliefs associated with misinformation, a statistic Chief Censor David Shanks says has “almost certainly” gotten worse recently.
“We’re seeing an increase in the number of bad actors who have learned how to use digital platforms to spread their distrust of public institutions and the media. That means they create followers who really only believe what they say.”
He says the Government needs to push for tighter regulation of platforms that promote misinformation, similar to its response after the Christchurch terror attack.
“Regulating the internet, that’s something New Zealand alone can’t achieve. If you look at the Christchurch Call post the horrific attacks on March 15, New Zealand was a leader in terms of a collaborative, multinational, multi-stakeholder approach to make a meaningful difference to the level of extremism and terrorism content online.”
He says that while there has been criticism that changes haven’t come fast enough, significant moves have been made.
“The big tech platforms who are members of the Call have updated their terms of service to prohibit terrorist and violent extremist content, and have improved their detection capabilities and user reporting mechanisms, while also putting in place stricter policies around livestreaming…
“The Christchurch Call continues to grow in global membership. It now includes 55 governments, most of the world’s liberal democracies, and crucially the US has now become a member. That gives us an idea of how we might participate in a global conversation to address this issue.”
Law playing catch-up
FACT’s Judd would like to see misinformation propagators’ accounts shut down. Anti-vaccine group Voices for Freedom’s Facebook page was removed last year for spreading false information. The group created another page in March, but it was swiftly shut down again, though the group can still post on Instagram, Twitter and TikTok.
“If people who are spreading misinformation are prevented from using mainstream platforms like Facebook, they may go elsewhere,” Judd says. “But the good thing about that is that they may be harder to find, which means they have to work harder to get a platform for their ideas. So even there that can have a real effect.”
Globally, the law is catching up with some of the worst misinformation spreaders. Prominent US conspiracy theorist Alex Jones is currently dealing with multiple lawsuits after losing defamation cases over his claims the Sandy Hook school shooting was faked. Jones failed to show up to recent deposition hearings and will be fined $25,000 to $50,000 per weekday until he appears.
We could see court action here too. In February NZ-based outlet Counterspin Media promoted a so-called ‘documentary’ that questions whether the murder of 51 people during the Christchurch terror attack was staged. The documentary has been classed as objectionable and could see those sharing it imprisoned for up to 14 years.
Shanks says dealing with misinformation is a “delicate balance”.
“We can’t be cutting across people’s freedoms or reducing human rights, but the bottom line is it’s really clear some people are propagating extremist ideas and ultimately terrorist ideas.
“This problem gets worse if we don’t do something to address it. It’s a big problem, but it’s something we must tackle.”