The Russians who posed as Americans on Facebook last year tried on quite an array of disguises. There was “Defend the 2nd,” a Facebook page for gun-rights supporters, festooned with firearms and tough rhetoric. There was a rainbow-hued page for gay rights activists, “LGBT United.” There was even a Facebook group for animal lovers with memes of adorable puppies that spread across the site with the help of paid ads, The New York Times reports:
Federal investigators and officials at Facebook now believe these groups and their pages were part of a highly coordinated disinformation campaign linked to the Internet Research Agency, a secretive company in St. Petersburg, Russia, known for spreading Kremlin-linked propaganda and fake news across the web. They were described to The New York Times by two people familiar with the social network and its ads who were not authorized to discuss them publicly.
Under intensifying pressure from Congress and growing public outcry, Facebook on Monday will turn over more than 3,000 of the Russia-linked advertisements from its site to the Senate and House intelligence committees.
“We’re obviously deeply disturbed by this,” said Joel Kaplan, Facebook vice president of United States public policy. “The ads and accounts we found appeared to amplify divisive political issues across the political spectrum,” including gun rights, gay rights issues and the Black Lives Matter movement.
The European Union “could not take the issue of hybrid threat[s] more seriously,” said EU foreign-policy chief Federica Mogherini, speaking ahead of the October 2 inauguration of the new European Center of Excellence For Countering Hybrid Threats in Helsinki, RFE/RL reports.
But Russian propaganda is winning the information war, participants in a high-level working group told POLITICO:
Participants drew a distinction between Russian cyber attacks aimed at sabotage — such as changing election results or shutting down the electric grid — and those designed to sow confusion and dissent. Western governments may be better at combating the former; instances of the latter can still confound them.
“They’re going to run down our system to create doubts and divisions, promote any extremists, whether it’s white supremacists or, you know, radical leftists,” one participant said. “They don’t care, it’s just anything that disrupts and confuses and causes disarray within NATO and the EU.” The best way to combat Russian information warfare is with more information, participants said.
Concern about the proliferation of disinformation, misinformation, and propaganda has prompted many governments to propose new legislation, notes Kelly Born, a program officer at the William and Flora Hewlett Foundation’s Madison Initiative. But the solutions on offer often fail to account for at least six ways in which today’s disinformation and propaganda differ from yesterday’s, she writes for Project Syndicate:
- First, there is the democratization of information creation and distribution. As Rand Waltzman, formerly of the Defense Advanced Research Projects Agency, recently noted, any individual or group can now communicate with – and thereby influence – large numbers of others online. This has its benefits, but it also carries serious risks – beginning with the loss of journalistic standards of excellence, like those typically enforced within established media organizations. Without traditional institutional media gatekeepers, political discourse is no longer based on a common set of facts.
- The second feature of the digital information age – a direct byproduct of democratization – is information socialization. Rather than receiving our information directly from institutional gatekeepers, who, despite often-flawed execution, were fundamentally committed to meeting editorial standards, today we acquire it via peer-to-peer sharing….
- The third element of today’s information landscape is atomization – the divorce of individual news stories from brand or source. Previously, readers could easily distinguish between non-credible sources…. And, as a recent study from the American Press Institute found, the original source of an article matters less to readers than who in their network shares the link.
- The fourth element that must inform the fight against disinformation is anonymity in information creation and distribution. Online news often lacks not only a brand, but also a byline. This obscures potential conflicts of interest, creates plausible deniability for state actors intervening in foreign information environments, and creates fertile ground for bots to thrive….
- Fifth, today’s information environment is characterized by personalization. Unlike their print, radio, or even television counterparts, Internet content creators can A/B test and adapt micro-targeted messages in real-time….
- The final element separating today’s information ecosystem from that of the past, as Stanford law professor Nate Persily has observed, is sovereignty. Unlike television, print, and radio, social-media platforms like Facebook or Twitter are self-regulating – and are not very good at it.
The Kremlin is already preparing for the 2018 US midterm elections, says Michael Carpenter, a former deputy assistant secretary of Defense for Russia, Ukraine, and Eurasia, foreign policy advisor to Vice President Biden, and NSC Director for Russia. Since the collapse of the Soviet Union, Russia’s intelligence services have beta-tested their covert influence operations at home and among their neighbors, he writes for The Hill:
Disinformation operations; covert financing of political candidates via murky business relationships; hacking and disclosure of sensitive information; threats to release sex tapes and other kompromat (material for blackmail); and even poisonings and assassinations have all been used as means of influencing the geopolitical trajectory of countries along Russia’s periphery.
Russians and other actors appreciate the considerable value of sustained disinformation campaigns, notes analyst Miah Hammond-Errey.
“The low cost and high effectiveness of these non-military measures combined with few counter-measures as well as strong drivers of change (such as automation) increasing their effectiveness, indicate their use will continue to increase,” she contends. “It means that public legitimacy and support can change dramatically and very quickly. It creates a more volatile and polarized operating environment.”