Comments by "" (@DavidJ222) on "CNN reporter shows Trump supporter her debunked Facebook posts. See her reaction" video.
-
By pushing stories from a diverse body of outlets and posting material on different platforms, Kremlin propagandists adapted the concept of pre-propaganda in their efforts to interfere in the 2016 election, according to a recent study by researchers at the Center for Social Media and Politics at NYU.
The study investigated the online propaganda strategies of the Internet Research Agency (IRA), the Kremlin-linked “troll farm.” The U.S. Department of Justice accused the group of spreading disinformation online to interfere in the 2016 election, indicting 13 Russians it said were involved in the scheme.
The research focused on tweets by IRA trolls (accounts controlled by humans who masked their identities) about the 2016 election containing hyperlinks to political news stories, YouTube videos, and other content. More than 30 percent of the politics-related IRA tweets examined linked to external websites. Of these, about 10 percent linked to news stories, and 3 percent linked to YouTube videos. Trolls linked to conservative news sources (34 percent) more often than to liberal ones (24 percent) and skewed conservative in their sharing patterns over time.
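As a rough illustration of the kind of tally described above, here is a minimal Python sketch. The tweets, the domains, and the domain-to-ideology mapping are all invented for illustration; this is not the study's data or code, just one way such link-sharing percentages could be computed.

```python
# Minimal sketch (invented data, not the study's code): classify each tweet's
# external link by the ideology of its source domain and tally the shares.
from collections import Counter
from urllib.parse import urlparse

# Hypothetical mapping of linked domains to an ideology label.
DOMAIN_IDEOLOGY = {
    "conservativeexample.com": "conservative",
    "liberalexample.com": "liberal",
    "youtube.com": "youtube",
}

# Hypothetical politics-related tweets: (tweet_id, linked URL or None).
tweets = [
    (1, "https://conservativeexample.com/story-a"),
    (2, "https://liberalexample.com/story-b"),
    (3, None),  # no external link
    (4, "https://youtube.com/watch?v=abc123"),
]

def classify(url):
    """Return an ideology label for a linked domain, or None if unknown."""
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    return DOMAIN_IDEOLOGY.get(domain)

counts = Counter(classify(url) for _, url in tweets if url)
with_links = sum(1 for _, url in tweets if url)

print(f"Tweets with external links: {with_links}/{len(tweets)}")
for label, n in counts.items():
    print(f"{label or 'unclassified'}: {n / with_links:.0%} of linked tweets")
```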
This finding supports the theory that the IRA tried to support the Trump campaign, and it indicates that the IRA exploited social media platforms' interconnected ecosystem of links, shares, and likes to spread disinformation.
In sharing liberal and conservative stories alike, Russia tried to sow discord by playing both sides. It’s also possible that Russia was simultaneously trying to build an audience among moderates before luring them to the Republican side.
YouTube appears to have been a crucial part of the IRA’s cross-platform strategy. The trolls linked to the video-sharing platform more often than to most other external websites, sharing overwhelmingly conservative content (75 percent). And while the trolls cast a wide net when sharing news stories, they tightened their focus to a selection of mostly pro-Trump, pro-Republican YouTube videos.
Finally, the researchers tested for ideological consistency in troll behavior over time. That is, did the conservative trolls remain conservative, and the liberal trolls remain liberal, throughout the 2016 campaign? For the most part, the answer is yes. But here’s where it gets interesting: Trolls who mostly shared liberal news stories were more likely to cross ideological lines by also sharing conservative YouTube videos. Trolls who mostly shared conservative YouTube videos, on the other hand, rarely shared liberal ones.
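To make the crossover pattern concrete, here is a small hedged sketch of one way that check could be run. The account names and share records are invented, and this is only an assumed simplification of the analysis: take each account's majority lean per platform, then flag accounts whose news sharing skews liberal while their YouTube sharing skews conservative.

```python
# Minimal sketch (invented accounts and records, not the researchers' code):
# flag accounts that mostly share liberal news but conservative YouTube videos.
from collections import defaultdict

# Hypothetical share records: (account, platform, ideology of shared item).
shares = [
    ("troll_a", "news", "liberal"),
    ("troll_a", "news", "liberal"),
    ("troll_a", "youtube", "conservative"),  # crossover
    ("troll_b", "news", "conservative"),
    ("troll_b", "youtube", "conservative"),
]

# Tally shares per account, per platform, per ideology.
tally = defaultdict(lambda: defaultdict(lambda: defaultdict(int)))
for account, platform, ideology in shares:
    tally[account][platform][ideology] += 1

def leaning(counts):
    """Majority ideology for one account on one platform, or None if no shares."""
    return max(counts, key=counts.get) if counts else None

for account, platforms in tally.items():
    news_lean = leaning(platforms.get("news", {}))
    yt_lean = leaning(platforms.get("youtube", {}))
    crossover = news_lean == "liberal" and yt_lean == "conservative"
    print(account, "news:", news_lean, "| youtube:", yt_lean,
          "| liberal-news-to-conservative-video crossover:", crossover)
```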
This behavior points to the IRA's use of pre-propaganda. The IRA may have shared news stories from diverse sources to build credibility and a broad audience before dosing liberal and moderate users with conservative YouTube content. In other words, a single propaganda campaign can target liberals, moderates, and conservatives while still serving the overall goal of helping the Republican campaign. The sheer amount of conservative content in the dataset suggests this was the case.
The researchers coined a term to describe what they found: cross-platform pre-propaganda, or pre-propaganda that exploits the interconnected nature of platforms. By using Twitter to get users onto YouTube, the IRA deployed a tactic that represented a degree of historical continuity in state-driven propaganda, and took advantage of social media as a means to lower costs, increase scale, and maintain the anonymity of covert campaigns.
When it comes to state propaganda, the major platforms don’t exist in a vacuum. Together, they provide a whole ecosystem for malicious actors to exploit.
Cue the reaction from Russian trolls in T-minus 5 4 3 2 .......