COUNTRIES AROUND the world are worrying, rightly, about Russian interference in their elections. But there’s another player in the game with a lot of experience, and it isn’t sitting this one out. China began dabbling in Internet disinformation long before “Russian troll” became part of the American vernacular, The Washington Post reports:
Its so-called 50-cent army, as many as 2 million strong according to some estimates, started marching in 2004: sneaking state-sponsored narratives into organic conversations, or just distracting citizens from controversial subjects, and pretending the 448 million comments it generates a year come from ordinary people.
These manipulation attempts [one element of what the National Endowment for Democracy calls 'sharp power'] have historically been mostly inward-facing, while efforts to sway foreigners have focused instead on overt propaganda disseminated through 3,000 public television channels, 2,500 radio stations, 12,000 newspapers and magazines, and more than 3 million websites. … The Stanford Internet Observatory has identified the upcoming Taiwanese presidential election as a near-certain target for a Chinese influence campaign.
China is hardly a stranger to waging war in cyberspace, but as the U.S. presidential campaign heats up, Beijing may now mean to open a new front. That can't go unnoticed or unchallenged, The Post argues. RTWT
Are Russia, China, Iran, and North Korea using cyber operations to coerce others? And if so, how? The findings of a new RAND report show that espionage, not coercion, remains the predominant purpose of most cyber operations, though Russia and North Korea appear more likely than China and Iran to have used cyber operations as a coercive tool. Going forward, the United States and its partners should develop a richer understanding of cyber coercion, and of how to respond to it, according to Fighting Shadows in the Dark: Understanding and Countering Coercion in Cyberspace.
RAND researchers Quentin E. Hodgson, Logan Ma, Krystyna Marcinek, and Karen Schwindt recommend two approaches to further our understanding:
- The first is to hold a series of tabletop exercises with regional and functional experts to explore scenarios in which coercion might occur. The coercing state in this game would be given a set of tools, including cyber operations, which its team can choose whether to employ. The central idea is to use the game to explore conflict dynamics, not coercion specifically, to see whether and why the coercing state chooses to use cyber operations, and whether the other teams recognize the activity as an attempt to coerce.
- The second is to use game theory to explore the conditions under which a state benefits from employing cyber operations to coerce. Game theory provides a rigorous theoretical basis for exploring these dynamics.
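To give a flavor of that second approach, the sketch below sets up a toy two-player game in Python. The strategy names and payoff values are illustrative assumptions, not the RAND team's model; the point is only to show how one can ask when coercion "pays" in equilibrium.

```python
# Illustrative sketch only: a toy 2x2 coercion game, not the RAND model.
# All payoff values are hypothetical assumptions chosen for illustration.

# Players: the "coercer" picks a row, the "target" picks a column.
# Payoffs are (coercer, target); higher is better for that player.
PAYOFFS = {
    ("cyber_coerce", "concede"): (3, -2),   # coercion succeeds cheaply
    ("cyber_coerce", "resist"):  (-1, -1),  # costly standoff for both sides
    ("restrain",     "concede"): (1, 0),    # target yields without pressure
    ("restrain",     "resist"):  (0, 1),    # status quo holds
}

ROWS = ["cyber_coerce", "restrain"]
COLS = ["concede", "resist"]

def pure_nash_equilibria(payoffs):
    """Return all (row, col) pairs where neither player can gain
    by unilaterally switching strategies."""
    equilibria = []
    for r in ROWS:
        for c in COLS:
            u_row, u_col = payoffs[(r, c)]
            row_ok = all(payoffs[(r2, c)][0] <= u_row for r2 in ROWS)
            col_ok = all(payoffs[(r, c2)][1] <= u_col for c2 in COLS)
            if row_ok and col_ok:
                equilibria.append((r, c))
    return equilibria

print(pure_nash_equilibria(PAYOFFS))  # -> [('restrain', 'resist')]
```

With these assumed payoffs, the only pure-strategy equilibrium is (restrain, resist): coercion does not pay. Varying the payoffs, say by lowering the coercer's cost of a standoff, shifts the equilibrium, which is exactly the kind of dynamic a game-theoretic treatment can make precise.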
The early signs of potential cyber coercion can resemble other cyber activity, whether more or less malicious: scanning of networks, phishing emails, and perhaps social engineering built on information scraped from websites, they add. This suggests that more work is needed to develop indicators that can provide warning of emerging cyber coercion. Read more »
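To make that concrete, here is a deliberately simple sketch of what an early-warning indicator could look like, combining the observables the authors list (network scanning, phishing, scraping-driven social engineering) into a single score. The field names, weights, and baseline comparison are hypothetical assumptions for illustration, not indicators proposed in the report.

```python
from dataclasses import dataclass

@dataclass
class Observables:
    scan_events: int       # network/port scans observed this period
    phishing_emails: int   # phishing messages detected this period
    scraping_hits: int     # suspicious crawls of public-facing sites

# Assumed weights, for illustration only.
WEIGHTS = {"scan": 0.3, "phish": 0.5, "scrape": 0.2}

def warning_score(current: Observables, baseline: Observables) -> float:
    """Weighted sum of how far each observable sits above its baseline.
    A value near 1.0 means roughly baseline-level activity."""
    def ratio(seen: int, base: int) -> float:
        return seen / base if base else float(seen)
    return (WEIGHTS["scan"]   * ratio(current.scan_events, baseline.scan_events)
          + WEIGHTS["phish"]  * ratio(current.phishing_emails, baseline.phishing_emails)
          + WEIGHTS["scrape"] * ratio(current.scraping_hits, baseline.scraping_hits))

# Example: activity several times above baseline yields a score of 3.05.
print(warning_score(Observables(40, 25, 60), Observables(10, 10, 20)))
```

A score well above 1.0 flags elevated activity relative to baseline, but, as the authors note, the same signals may just as easily precede espionage as coercion, which is why richer indicators are needed.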
To counter modern disinformation, we cannot focus solely on social media platforms or current technologies; we should also understand the psychological factors that underpin our identities and perceptions of the truth, according to Christina Nemr, a director at Park Advisors who oversees counter-disinformation and countering-violent-extremism programming, and Will Gangware, who previously worked with the NYPD Intelligence Bureau.*
There are two constants in the current complex information environment: the use of propaganda and disinformation as tools of influence and obfuscation, and the underlying psychological factors that make humans vulnerable to such narratives. What changes, however, are the technologies by which such content is created and spread, they write for War on the Rocks:
Given these factors, it is important not to overstate the impact of technology, but rather to understand and address the interwoven complexities that disinformation poses. In the near term, social media platforms are best positioned to lead counter-disinformation efforts, and these efforts should be made as transparent as possible, in collaboration with government and other partners. However, all stakeholders should approach disinformation as the multi-dimensional problem it is, contending honestly with the cognitive limits of people's information intake and the ambiguous nature of what constitutes disinformation. Only then will we craft effective policies, regardless of the technologies involved. RTWT
*Their article contains excerpts from a larger report on psychological and technological vulnerabilities to disinformation, Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age, an interdisciplinary review commissioned by the United States Department of State's Global Engagement Center. The report presents a holistic overview of the disinformation landscape by examining 1) psychological vulnerabilities to disinformation, 2) current foreign state-sponsored disinformation and propaganda efforts both abroad and in the United States, 3) social media companies' efforts to counter disinformation, and 4) the knowledge and technology gaps that remain.