Facebook Admits Its Platform Was Used to Spread Propaganda in the 2016 Election
Much has been made of the role that “fake news” played in the election of President Trump, and continues to play in elections across Europe. Facebook, for its part, has publicly acknowledged that its platform has indeed been exploited in attempts to sway public opinion, both here and abroad, and has vowed to take measures to prevent such occurrences in the future.
As part of the effort, they are changing their approach to security, branching out from protecting against more traditional methods of intrusion to targeting “more subtle and insidious forms of misuse, including attempts to manipulate civic discourse and deceive people.” This is according to a new white paper released by Facebook last week, which details the various subtle methods used by malicious actors to manipulate public opinion.
The paper introduces the concept of information operations, which it defines as systematic, concerted efforts by organizational actors to influence public sentiment, usually to achieve a specific policy goal.
The methods used to circulate these operations are highly sophisticated, and go far beyond just posting false or misleading stories. Operators drive the dissemination of false information using networks of paid professionals and “useful idiots” who share the content, causing it to go viral. This is in addition to fake accounts (accounts created under a pseudonym) and “bots,” which are automated and can spread stories at exponential rates.
We All Have Our Limits
The report lays out various methods that are being used to combat this, implementing both human and software-based solutions to tackle the problem of fact-checking. But the strategy is far more notable for what it doesn’t do, writes Russell Brandom of The Verge.
The most striking thing is what it leaves out: a strategy for combating the creation of false and malicious material at its source, and a sense of Facebook’s responsibility when genuine users share those links. As described in the report, almost all the important elements of disinformation campaigns are outside of Facebook’s control. When the campaigns do venture onto Facebook, the associated posts tend to behave the same way any piece of news or content would. And while similar campaigns continue across Europe, today’s report suggests there’s no easy fix for the problem — or at least not from Facebook.
While Facebook can and should take measures to prevent its platform being used to spread misinformation, the content is ultimately created elsewhere, making it very hard to control what is being shared. The social media giant’s real role in all of this is messaging. That is the true power of Facebook. It is above all a vehicle for amplifying a viewpoint, whatever that may be, making it ideal for “AstroTurf” organizing.
Facebook itself acknowledges in the report that this is a fine line to walk, and that there is no magic-bullet solution to the problem. The company could ultimately end up in the murky waters of trying to prove intent, as would be the case with a story that is clearly false but is being spread by individuals who believe it is true. Facebook has tried to stay out of this area before, but its hand is being forced this time.
It is unclear what can be done about this short of Facebook becoming akin to thought police who censor ideas and opinions, a slippery slope the company has thus far been reluctant to head down. They have introduced their Journalism Project, aimed at empowering citizens and journalists to become better producers and consumers of news, which is a step in the right direction.
No Easy Answers
The dilemma the company faces around this issue is a high-wire balancing act, and Facebook fully acknowledges the challenges involved:
We recognize that, in today’s information environment, social media plays a sizable role in facilitating communications – not only in times of civic events, such as elections, but in everyday expression. In some circumstances, however, we recognize that the risk of malicious actors seeking to use Facebook to mislead people or otherwise promote inauthentic communications can be higher.
The Rubicon has been crossed on the issue of fake news and social media, and there can be no turning back. No one knows yet what this means for a wide variety of issues from privacy and free speech, to the basic tenets of truth and reality. But the center can no longer hold. For this unbelievably gray area at the intersection of politics and social media, the future is now. The question is, are we ready?
The full Facebook White Paper is below: