WASHINGTON — Russia and Iran were the leading purveyors of disinformation on Facebook over the past four years, and the American public was the top target, according to a new report by Facebook summing up the social media network’s efforts to purge itself of propaganda.
Facebook says it shut down 150 networks of fake accounts between 2017 and the end of 2020 — many of them foreign disinformation efforts aimed at influencing Americans, others created in the U.S. by domestic extremists. Facebook reports having 2.85 billion users around the world.
“Influence operations are not new, but over the past several years they have burst into global public consciousness,” the report says. “These campaigns attempt to undermine trust in civic institutions and corrupt public debate by exploiting the same digital tools that have diversified the online public square and empowered critical discussions from Me Too to the Black Lives Matter movements.”
Facebook’s Nathaniel Gleicher, who heads cybersecurity policy, says the company has come a long way since Russian intelligence was able to carry out a widespread operation using fake accounts in an effort to influence the 2016 presidential election.
“I would never say we caught everything; we’ve got a pretty good and improving record of catching, in particular, the operations that are getting more attention,” he said.
The report does not cover the post-election misinformation campaign that paved the way for the Jan. 6 Capitol riot, which did not involve fake accounts. But Facebook has been criticized for being slow to address that content on its platform. Facebook banned “Stop the Steal” content 69 days after the election — well after much of the damage was done, critics say.
Facebook said in a statement, “We took a number of steps to limit content that sought to delegitimize the election, including indefinitely suspending President Trump from our platform, labeling candidates’ posts with the latest vote-counting information after President Trump prematurely declared victory, and removing violating content including the original #StopTheSteal Group.”
In terms of volume, networks originating in Russia and Iran comprised the largest share of what Facebook calls “Coordinated Inauthentic Behavior,” the report says, with 27 Russian and 23 Iranian networks shut down. Facebook doesn’t attribute the campaigns to governments, but many of the Russian and Iranian campaigns had all the hallmarks of intelligence influence operations, private researchers said.
Russia and other foreign actors are getting better at blurring the lines between foreign and domestic activity by co-opting unwitting domestic groups to amplify narratives designed to divide Americans, the report says.
For example, in July 2018, Facebook removed a network linked to the Russian Internet Research Agency, the Kremlin-linked troll farm that was so prolific with fake accounts in the 2016 election. The network was promoting locally organized events focused on hot-button issues, such as anti-fascism.
In some cases the U.S. government labeled them as such, as when it called out what it said was an Iranian effort in 2020 to pose as members of the Proud Boys and threaten Florida voters.
But many of the campaigns that appear to benefit foreign governments target their own citizens, Facebook says. Myanmar, for example, was the third most common source of coordinated propaganda campaigns as the government cracked down on the Rohingya ethnic group. Ukraine was fifth.
The United States ranked fourth as a host from which the bogus networks emanated, Facebook says, noting that it shut down three conspiracy theory networks, two PR campaigns and two foreign media campaigns.
One of the U.S. networks was operated by Rally Forge, a marketing firm whose clients included the pro-Trump group Turning Point USA, Facebook said.
Organizers hired a staff of teenagers to run fake and duplicate accounts posing as unaffiliated voters, some appearing to be left-leaning, to comment on various Facebook pages in the run-up to the 2018 midterms, Facebook said.
“This particular case raises questions about the boundaries between acceptable political discourse and abusive deception,” the Facebook report says. “While in the physical world it is not uncommon to pay people to knock on doors and advocate for a particular position, the implications and potential harm are very different when people are hired to do the same using fake accounts online.”
While serious concerns have been raised about the rise of deepfake videos, Gleicher said Facebook has not seen much use of them in influence campaigns, perhaps because they amount to expensive overkill when simple deceptive editing works just as well.
The report concludes that large-scale propaganda campaigns “are now harder to pull off, more expensive, and less likely to succeed.”
But the would-be propagandists are growing more sophisticated, the report adds, by co-opting legitimate Facebook users.
“As threat actors evade enforcement by co-opting witting and unwitting people to blur the lines between authentic domestic discourse and manipulation, it will get harder to discern what is and isn’t part of a deceptive influence campaign,” the report says. “Going forward, as more domestic campaigns push the boundaries of enforcement across platforms, we should expect policy calls to get harder.”
Graham Brookie, the director of the Digital Forensic Research Lab at the nonprofit think tank the Atlantic Council, said that Facebook’s report on online propaganda on its own site provides a frame of reference for how governments use it elsewhere.
“Facebook has invested a lot of money in building up muscle memory to consistently disclose this type of behavior,” Brookie said. “It’s a large company that has other systemic vulnerabilities that are unaccounted for, but this is one where they’ve had leadership on.”
State-sponsored disinformation “is truly a threat that reaches across the entire information landscape, and that’s not just social media platforms, it’s wherever humans connect and engage and inform each other,” Brookie said.