Ways to neutralize Russia’s disinformation (at least partially)

YouTube videos of police beatings on American streets. A widely circulated internet hoax about Muslim men in Michigan collecting welfare for multiple wives. A local news story about two veterans brutally mugged on a freezing winter night. The New York Times reports:

All of these were recorded, posted or written by Americans. Yet all ended up becoming grist for a network of Facebook pages linked to a shadowy Russian company that has carried out propaganda campaigns for the Kremlin, and which is now believed to be at the center of a far-reaching Russian program to influence the 2016 presidential election. A New York Times examination of hundreds of those posts shows that one of the most powerful weapons that Russian agents used to reshape American politics was the anger, passion and misinformation that real Americans were broadcasting across social media platforms.

“This is cultural hacking,” said Jonathan Albright, research director at Columbia University’s Tow Center for Digital Journalism. “They are using systems that were already set up by these platforms to increase engagement. They’re feeding outrage — and it’s easy to do, because outrage and emotion is how people share.”

Russian disinformation campaigns of the present and the past are the subjects of an NPR interview with Anne Applebaum, who co-directs Arena, a new think tank based at the London School of Economics devoted to analyzing, reporting on and combating disinformation. Her new book, “Red Famine: Stalin’s War On Ukraine,” shows how, with the help of disinformation, Stalin orchestrated a famine in Ukraine in the early 1930s that resulted in the deaths of nearly 4 million Ukrainians.

LSE Arena has worked with the Institute for Strategic Dialogue to monitor online influence in the German elections, and its research has been published extensively in German media. Arena Directors Peter Pomerantsev and Applebaum gave an overview of the analysis in an exclusive essay for Die Welt, comparing the situation of alienated social media ‘echo chambers’ in Germany with that in the US.

A unique investigation with BuzzFeed unearthed a bot-net operating out of the Russian city of Nizhny Novgorod, whose managers confirmed they were supporting the German right-nationalist AfD party free of charge. This investigation has been turned into a multimedia ‘Ultimate Guide to Bust Fake Tweeters’, useful for any journalists exploring hidden influencers. The guide is featured in English on the fact-checking site Poynter.org and in German by broadcaster ZDF.

Arena and ISD also worked with Der Spiegel magazine to look at how the international alt-right tried to influence the election. Other pieces included an in-depth profile of a prolific ‘internet warrior’ and an examination of the political dynamics among the 3-4 million Russian-Germans. A summary of Arena’s findings is available here.

“What we’ve seen so far is the tip of the iceberg,” said Applebaum, a board member of the National Endowment for Democracy (one of the groups Russian internet censors are blocking as ‘undesirables’).


There are ways to at least partially neutralize Russia’s disinformation, argues Mason Richey, associate professor of international politics at Hankuk University of Foreign Studies in Seoul, South Korea. The most important is to use strategic communications to generate salient, proactive narratives describing actions and interests vis-à-vis Russia and third parties, he writes for International Politics and Society:

  • Responding piecemeal to Russian disinformation is a losing strategy. Instead, policymakers in Europe and the US must anticipate and pre-empt Russian information warfare lines of attack. This requires agencies to coordinate government and private-sector communications. Examples are the EU’s East StratCom Task Force and the Centre of Excellence for Countering Hybrid Threats, which must be scaled up and better resourced.
  • Targeted polities should also incentivise action from the private sector, notably the social media providers used to propagate ‘fake news’. A government heavy hand – censorship, for example – would backfire, playing into Moscow’s hands. Instead, media regulators should persuade social media providers – Twitter, Google, Facebook and the like – to create better filtering algorithms and the means to identify ‘fake news’ and possibly remove it…
  • Passive cyber-defence is insufficient; solutions must be innovative. One approach is ‘cyber-blurring’ – creating fake documents and email accounts to confuse and slow hackers. Emmanuel Macron’s campaign successfully adopted this approach. A more aggressive approach is ‘active cyber-defence’. This goes beyond reactive cyber-security such as anti-malware software, instead privileging defensive intelligence collection and policy measures, including sanctions and trade remedies, for deterring malicious cyber-actors. RTWT

Federica Mogherini now claims that the EU is using sufficient means and resources and cannot take the Russian threat more seriously, notes Kremlin Watch:

As we stated in the Open Letter of European security experts, the EEAS East STRATCOM Team must be at least tripled in size and receive a budget in the single millions of EUR, so it can start fulfilling its mandate in real terms. Only three experts in the EEAS East STRATCOM Team are tasked with countering pro-Kremlin disinformation, which means that Federica Mogherini still doesn’t take this threat seriously.

Detailed public scrutiny of RT’s activities, as well as those of other Russian media and proxies, must start happening now, the group adds.

John W. Kelly, the founder of Graphika, a commercial analytics company in New York, said the Russians appeared to have a consistent strategy across different platforms. Graphika has tracked thousands of social media accounts whose content closely tracks Russian information operations, promoting articles and videos about WikiLeaks dumps of stolen emails and “false flag” conspiracies about Syrian chemical weapons, The Times adds:

The Russian-influenced networks frequently promote obscure conservative YouTube channels such as the Next News Network and the Trump Breaking News Network, driving up their views and advertising revenue. A video posted in February by a conservative internet radio host, who claimed that 30 politicians were about to be arrested in connection with the “Pizzagate” hoax, racked up more than 300,000 views on YouTube. Another YouTube video, claiming that Michelle Obama had 214 personal assistants and had purchased four yachts with taxpayer money, had close to a million views.  Rather than construct fake grass-roots support behind their ideas — the public relations strategy known as “Astroturfing” — the Russians sought to cultivate and influence real political movements, Mr. Kelly said.

“It isn’t Astroturfing — they’re throwing seeds and fertilizer onto social media,” said Mr. Kelly. “You want to grow it, and infiltrate it so you can shape it a little bit.”
