Facebook CEO Mark Zuckerberg expressed contrition for allowing third-party apps to harvest Facebook users’ data without their permission and for being “too slow to spot and respond to Russian interference” during the U.S. election, according to his prepared remarks published by the House Energy and Commerce Committee, the Washington Post reports:
Zuckerberg plans to open his remarks with a familiar recitation of the social media platform’s ability to link far-flung people together but then pivot into an acknowledgement of Facebook’s increasingly visible dark side.
“It’s clear now that we didn’t do enough to prevent these tools from being used for harm as well,” Zuckerberg plans to tell lawmakers. “That goes for fake news, foreign interference in elections, and hate speech, as well as developers and data privacy.”
Facebook is under fire for a range of perceived offenses, including helping to spread disinformation and allowing Cambridge Analytica, the firm with ties to the Trump campaign, to access information from tens of millions of user profiles, the Wall Street Journal adds.
Improved public understanding of Facebook’s role in elections and democracy is the focus of a major new research initiative announced today. Research by an independent and diverse group of scholars will be supported by a funding consortium comprising the William and Flora Hewlett Foundation, the Alfred P. Sloan Foundation, the Charles Koch Foundation, the Democracy Fund, the John S. and James L. Knight Foundation, the Laura and John Arnold Foundation, and the Omidyar Network:
Facebook will grant these scholars access to proprietary data that meets the company’s new, heightened protections for user privacy. The committee will define research topics and invite the broader scholarly community to submit proposals related to social media, elections and democracy; the funding consortium, rather than Facebook, will pay for the research to ensure the researchers’ independence.
“This agreement between Facebook, academia, and charitable funders will help fill important research gaps that are inhibiting our ability to realize the benefits of social media while managing its drawbacks,” said Kelly Born, the program officer at the Hewlett Foundation’s U.S. democracy initiative. “But it is just one piece of what is needed. We also need scholarly access to data from other platforms, with essential protections for user privacy, to help answer critical questions being asked of all tech leaders by the public and policymakers: What are citizens exposed to online, how does that affect democracy, and what can be done to improve it?”
“Russia employs sophisticated strategies deliberately designed to achieve objectives while falling below the target state’s threshold for a military response,” said outgoing National Security Advisor Lieutenant General H.R. McMaster.
“Tactics include infiltrating social media, spreading propaganda, weaponizing information, and using other forms of subversion and espionage… The Kremlin’s confidence is growing as its agents conduct their sustained campaigns to undermine our confidence in ourselves and in one another,” he said, the Atlantic Council’s @DFRLab reports.
The belated enthusiasm for exposing Russian manipulation underrates the scale of the problem, which isn’t confined to the Russians, notes Anne Applebaum, a board member of the National Endowment for Democracy. Tactics first used on a large scale by the Russians are also available to others — and not just other authoritarians. Openly, and legally, they are also used in Western democracies, she writes for the Washington Post:
Here’s the real challenge faced by all the major platforms: how to re-engineer them to make them more resistant to organizations that, like the Internet Research Agency, engage in what one tech executive calls “coordinated inauthentic activity,” ranging from the use of false names and the creation of false audiences to the publication of false stories and the creation of divisive narratives. Perhaps they will have to limit the use of anonymity, change the algorithms that ensure that the most sensational material spreads the fastest, or institute transparency around video editing tools, especially as these become more sophisticated.
The anatomy of hybrid warfare
The attempted murder of Sergei Skripal and his daughter Yulia is a classic example of a hybrid warfare operation, writes CEPA’s Edward Lucas:
A possible clue here is that footage of Mr. Skripal’s arrest, in 2004, was uploaded to the internet shortly before the murder attempt. The videos could only have come from inside the Russian intelligence system. Their easy availability helped nudge the narrative away from the main point – Mr. Skripal as the victim of attempted murder – and towards something that suited the Kremlin better: Mr. Skripal as a traitor who had been humiliatingly unmasked.
The spread of disinformation can be traced to growing legitimacy problems in many democracies, say analysts W. Lance Bennett and Steven Livingston. Declining citizen confidence in institutions undermines the credibility of official information in the news and opens publics to alternative information sources, they write in “The disinformation order: Disruptive communication and the decline of democratic institutions”:
Those sources are often associated with both nationalist (primarily radical right) and foreign (commonly Russian) strategies to undermine institutional legitimacy and destabilize centre parties, governments and elections. The Brexit campaign in the United Kingdom and the election of Donald Trump in the United States are among the most prominent examples of disinformation campaigns intended to disrupt normal democratic order, but many other nations display signs of disinformation and democratic disruption.