Facebook is ramping up its efforts to crack down on coordinated campaigns that use its site to spread fake news and misinformation to sway political opinion. In a white paper published on Thursday (27 April), the social media giant said its platform has in recent years evolved into a forum for political debate, civic engagement, the exchange of ideas and the consumption of information.

However, the company says this comes with "novel challenges" that it must address to ensure its platform "remains a safe and secure environment for authentic civic engagement".

"We believe civic engagement is about more than just voting — it's about people connecting with their representatives, getting involved, sharing their voice, and holding their governments accountable," Facebook's threat intelligence manager Jen Weedon, threat intelligence analyst William Nuland and chief security officer Alex Stamos wrote.

"We have had to expand our security focus from traditional abusive behavior, such as account hacking, malware, spam and financial scams, to include more subtle and insidious forms of misuse, including attempts to manipulate civic discourse and deceive people."

In a case study of the 2016 presidential election, Facebook revealed it found several instances of what it calls "information operations" — coordinated actions taken by governments or non-state actors in an attempt to sway political opinion and "achieve a strategic and/or political outcome".

These operations rely on various manipulative methods, including deploying fake news, misinformation and networks of fake accounts dubbed "false amplifiers".

During the US election season, Facebook said it found malicious actors using conventional media and social media to share "information stolen from other sources, such as email accounts, with the intent of harming the reputation of specific political targets".

A separate set of malicious actors also used fake accounts to push certain narratives and themes that "reinforced or expanded on some of the topics exposed from stolen data". However, it said the reach of such content was marginal compared to the overall volume of political discussion on its site.

Facebook did not name any countries, the possible actors behind the attacks, actors who sponsored the activity or who the targets were. The company also noted that its findings do "not contradict" the US Director of National Intelligence report released on 6 January.

The US intelligence report claimed Russian President Vladimir Putin was behind a complex "influence campaign" designed to hurt Hillary Clinton's bid for the White House and help Donald Trump win the election.

Facebook versus fake news

Facebook came under fire in the lead-up to and after the US presidential election in November over the rampant spread of fake news, hoaxes and misinformation on its site. Some critics argued that the dissemination of false information on its platform may have helped Donald Trump win the presidency.

Mark Zuckerberg initially dismissed the notion as "a pretty crazy idea", but later vowed to tackle fake news on the social media network. Over the past few months, Facebook has rolled out various measures to combat the issue, including partnering with third-party fact-checkers, making it easier for users to report suspected faux content and disrupting financial incentives for hoax sites, among other initiatives, tools and services.

Facebook is taking various steps to tackle these organised campaigns, whose methods it categorises as targeted data collection, content creation and false amplification.

These steps include using machine learning to identify patterns of suspicious behaviour, such as repeatedly posting the same content. In France, Facebook said these measures have helped it take action against more than 30,000 fake accounts as of 13 April.
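As a rough illustration of the kind of signal such a system might look for — this is a minimal sketch, not Facebook's actual detection pipeline, and the account names and the threshold are hypothetical — one simple heuristic is to flag accounts that post identical content many times:

```python
from collections import defaultdict

def flag_repeated_posts(posts, threshold=3):
    """Return accounts that posted the same text `threshold` or more times.

    `posts` is a list of (account, text) pairs; `threshold` is an
    arbitrary illustrative cutoff, not a real platform parameter.
    """
    counts = defaultdict(int)
    for account, text in posts:
        counts[(account, text)] += 1
    return sorted({acct for (acct, _), n in counts.items() if n >= threshold})

# Hypothetical sample feed: one account spamming identical content.
sample = [("acct_a", "Share this now!")] * 4 + [("acct_b", "Hello"), ("acct_a", "News")]
print(flag_repeated_posts(sample))  # flags acct_a
```

Real systems would combine many such behavioural signals rather than a single frequency count, but the sketch captures the basic idea of pattern-based flagging.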

It is also offering customisable security and privacy features, including multiple options for two-factor authentication. Facebook will also notify people who have been targeted by sophisticated attackers, with tailored recommendations on how to respond, and proactively alert others who may be at risk.

"These are complicated issues and our responses will constantly evolve, but we wanted to be transparent about our approach," the white paper read.