Russian operatives and other foreign actors are deliberately targeting U.S. troops and veterans with online disinformation amplified on a massive scale, according to a leading veterans group, The Washington Post reports (HT:FDD).
But Russia’s disinformation campaign has nothing on China’s efforts in Taiwan, according to the Post’s Anna Fifield:
Fake news and disinformation campaigns have become major concerns in Western democracies, notably Russia’s interference in the 2016 U.S. election. But analysts say they pale in comparison to China’s efforts to sow discord in Taiwan, a self-ruled island that Beijing has vowed to bring under its control. “China’s influence has penetrated into every corner of this country,” said Wu Jieh-min, a sociologist at Academia Sinica, Taiwan’s national academy. “Politics, the economy, society, culture, and religion.”
The Watchout civic group, which began as a congressional watchdog but has morphed into a crowdsourced cohort of fact-checkers, is one of the grass-roots groups tackling disinformation ahead of Saturday’s vote. These groups say they are acting as “fake news detergent” to scrub away the lies in Taiwan’s raucous political and media landscape, Fifield adds.
While the first wave of responses to the challenge of disinformation stressed the supply of disinformation, and emphasized fact-checking, debunking, and counter-narratives, a new analysis from the National Endowment for Democracy’s International Forum for Democratic Studies focuses on demand for disinformation. While some consumers are exposed to and perhaps influenced by disinformation incidentally, others repeatedly seek out and believe sources of disinformation while rejecting other information sources. Why?
It’s the psychology, stupid
The answer is tied in part to the psychology of news consumption and opinion formation, analysts Samuel Woolley and Katie Joseff contend in “Demand for Deceit: How the Way We Think Drives Disinformation.”*
Just because the effectiveness of disinformation may be tied to innate aspects of human psychology does not mean that democratic societies are powerless to respond, they suggest. Rather, civil society, journalists, and other stakeholders invested in the freedom and openness of the global information space should develop innovative adaptations to the contemporary, disinformation-rich information landscape by bearing in mind key insights from the “demand” side of this challenge:
- Passive and Active Demand for Disinformation. Demand for disinformation can be broadly split into two categories of psychological drivers: passive, or those requiring no conscious reasoning process on the part of the individual, and active, or those guided by an individual’s efforts to reach conclusions through cognitive processes. Across geographic contexts, deeply polarized societies with low trust in the media appear more susceptible to these drivers.
- Disinformation as a Global Phenomenon. Young and vulnerable democracies deserve greater sustained attention and research on these topics. Much of the research on disinformation from the fields of psychology and communications has focused on the impact in developed democracies. Disinformation is straining societies from Australia to Zimbabwe. More work is needed that accounts for this global context.
- Accounting for Psychology in Fact-Checking Initiatives. Fact-checkers face challenges in confronting demand for disinformation: news consumers who are invested in a particular political narrative may be more likely to reject corrective information and rationalize their preexisting beliefs. Continuing research aims to understand this challenge and help fact-checkers better communicate with difficult-to-persuade audiences.
- Mistrust vs. Media Literacy. Efforts to improve media literacy are similarly challenged, as news consumers who are heavily invested in false political narratives are often quite knowledgeable about (and skeptical toward) independent media. That said, media literacy programs are not all equal: the most effective take into account the demand-side drivers of disinformation.
- The Impact of Emerging Technologies on the Disinformation Crisis. Emerging technologies, including synthetic media, virtual and augmented reality, and biometric-powered mass surveillance have the potential to worsen the disinformation crisis in a number of ways. However, it is not only the sophistication of these technologies that poses the greatest challenge, but the interaction with the demand-side drivers discussed here.
Digital Disinformation Beyond Social Media
The problem of online disinformation is only getting worse, report co-author Woolley told a forum at Stanford’s Freeman Spogli Institute for International Studies. We will soon be navigating new technologies such as human-like automated voice systems, machine learning, ‘deep-fake’ AI-edited videos and images, interactive memes, virtual reality and augmented reality, he said.
Educators can play a key role in fostering youth agency to detect deepfakes and reduce their influence, says a leading analyst. One challenge is ensuring that young people learn critical media literacy skills while they continue to explore valuable resources online and build the capacities and knowledge needed to participate in democratic structures, adds Nadia Naffi, Assistant Professor and expert in Pedagogical Practices in Digital Contexts at Université Laval.
In Canada, Journalists for Human Rights has announced a new program, funded by Heritage Canada, to train journalists and to enhance “citizen preparedness against online manipulation and misinformation,” Naffi writes for The Conversation.
Following the steps identified in her “Get Ready to Act Against Social Media Propaganda” model, which begins with having youth explain their stances on a controversial issue targeted by social media propaganda, educators can help young people discuss how they perceive and recognize deepfakes, Naffi writes.
Although democratic societies may have underestimated the complexity that demand-side drivers pose, it is important not to panic, Woolley and Joseff argue. This challenge can be met, but doing so will require more research on behavioral change as it relates to digital disinformation, and more work illuminating why people spread novel forms of disinformation.
In the TED Talk above, Danielle Citron, a professor at Boston University School of Law, discusses how deepfakes undermine truth and threaten democracy.
Tech Prom – the Annual Dinner of the Center for Democracy and Technology – will be held on April 23, 2020, at The Anthem in Washington, DC. The event will feature the most influential minds of the tech policy world, highlight pressing issues in the field and provide opportunities to connect and exchange views with a variety of attendees. Please contact development@cdt.org with any questions about this year’s or past years’ event.
*The report is part of an International Forum working paper series, examining the dynamics and impact of resurgent authoritarian influence in the era of globalization, including through networked transnational kleptocracy and manipulation of the global information space. For more on these themes, visit the Forum’s blog, Power 3.0: Understanding Modern Authoritarian Influence, and its related podcast.