Yuri Andropov, who directed the Soviet Union’s Committee for State Security (the KGB) from 1967 to 1982, said this: “Disinformation works like cocaine. If you sniff it once or twice, it may not change your life. If you use it every day, though, it will make you into an addict – a different man.”
Not even Andropov could have envisioned the impact of an unbelievably potent three-part combination punch to the collective American conscience, composed of:
- An extended, government-mandated lockdown and quarantine, resulting in
- Greater-than-ever public time and attention devoted to information and especially to disinformation on social media, combined with
- The murder of an unarmed African American by a police officer in a major U.S. city, all captured on video.
Each part synergistically added to the others, creating a perfect storm in the aftermath of George Floyd’s murder in Minneapolis on May 25. A presidential election year tosses an additional caustic element into the mix. What follows here is a brief analysis of how it happened, concluding with an overview of some ongoing activities in the areas of Disinformation (DI) campaigns and Information Operations (IO).
Past posts on this blog site, going back to February, have cited the “infodemic” nature of Covid-19 and the multiple layers of disinformation and deceit associated with the epidemic. The World Health Organization, the International Fact Checking Network, First Draft News, the Shorenstein Center on Public Policy at Harvard University, and – eventually – mainstream news media outlets all reacted to the flood of disinformation taking place surrounding the Coronavirus pandemic. Some, such as First Draft News, developed regular briefings and seminars aimed at journalists attempting to explain, and debunk, some of the disinformation surrounding Covid-19. Facebook, Twitter, and other social media platforms more aggressively took down disinformation posts, especially those related to the Coronavirus.
A portion of the problem is us. Being in quarantine, we all spent significantly more time on social media. TechCrunch reported that WhatsApp and Instagram both saw a 40 percent increase in usage, and Facebook reported a 37 percent increase in usage, as a result of Covid-19. Twitter reported an uptick of 24 percent in daily users in the early period of the Coronavirus crisis, according to Engadget.com.
Another huge jump occurred in the number of us viewing online videos, especially on YouTube. WordStream reports that YouTube accounts for more than two-thirds of video consumption among millennials. YouTube is also the second most-used search engine, and according to Cisco, online videos now account for 79% of all online traffic. That’s a key: Especially among millennials, getting the news means watching a video about the story on a cell phone, via YouTube or some other social media platform. The increasing amount of time devoted daily to social media-generated stories and videos has addicted some of us, changed us, and is changing our society in ways we have yet to understand. Additionally, the Coronavirus pandemic has had a disproportionately negative effect on African Americans in terms of health and economics. This is the “frame of reference,” or starting point, that an objective researcher or writer would have to accept before beginning to analyze how elements of society reacted to what happened on May 25, 2020.
George Floyd’s murder sets DI campaigns, social media on fire
There is an abundance of supporting evidence. In a November 2018 paper titled Acting the Part: Examining Information Operations Within #BlackLivesMatter Discourse, researchers Ahmer Arif, Leo Stewart, and Kate Starbird conducted an analysis of known social media accounts emanating from the Internet Research Agency of St. Petersburg, Russia (abbreviated RU-IRA), and those accounts’ 2016 Twitter tweets and retweets. Here is a link to the entire paper.
Below is what the researchers discovered (the bulleted passages below are taken largely verbatim from the research):
- Modeling the ‘anti-Police’ #BlackLivesMatter protestor: Each RU-IRA account that we examined in the left-leaning cluster connected their African-American identity to being a #BlackLivesMatter activist by tweeting extensively about police officers shooting unarmed African American men and women … These tweets frequently linked to stories from established media sources such as Fox News and the New York Times, but also alternative media sources, including conspiracy theory and RU-IRA affiliated sites….
- Nurturing Division: Enacting Caricatures of Political Partisan Accounts – Our findings show RU-IRA agents utilizing Twitter and other online platforms to infiltrate politically active online communities. Rather than transgressing community norms, these accounts undertook efforts to connect to the cultural narratives, stereotypes, and political positions of their imagined audiences.
- Taking a perspective based on the theory of structuration [footnote], the impact of these accounts cannot be considered in a simple cause and effect type model, but instead should be examined as a relationship of mutual shaping or resonance between the affordances of the online environment, the social structures and behaviors of the online crowd, and the improvised performances of agents that seek to leverage that crowd for political gain.
- Importantly, this activity did not limit itself to a single “side” of the online conversation. Instead, it opportunistically infiltrated both the politically left-leaning pro-#BlackLivesMatter community and the right-leaning anti-#BlackLivesMatter community. Though the tone of content shared varied across different accounts, in general these accounts took part in creating and/or amplifying divisive messages from their respective political camps. In some cases (e.g. @BleepThePolice), the account names and content shared reflected some of the most highly-charged and morally-questionable content (Emphasis added.) Together with the high-level dynamics revealed in the network graph (illustration), this observation suggests that RU-IRA operated-accounts were enacting harsh caricatures of political partisans that may have functioned both to pull like-minded accounts closer and to push accounts from the other “side” even further away. (Emphasis added.)
- Though we cannot quantify the impact of these strategies, our findings do support theories developed in the intelligence field that suggest one goal of specifically Russian (dis)information operations is to “sow division” within a target society [footnotes].
Here is a figure from the researchers’ findings. All the Twitter accounts listed here originated from the Internet Research Agency in Russia.
The three researchers concluded that Russian accounts, disguising themselves as U.S.-based, presented themselves as “authentic” voices on both sides of a polarized online discourse by advancing both pro- and anti-#BlackLivesMatter agendas. They also concluded that these inauthentic accounts converged to undermine trust in information intermediaries like “the mainstream media.”
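The clustering dynamic the researchers describe can be illustrated with a small toy sketch. This is my own illustration, not the paper’s actual method, and every account name below is hypothetical: the idea is simply that an inauthentic account that only ever retweets one “side” of a polarized conversation ends up embedded inside that side’s cluster in a retweet graph.

```python
# Toy sketch (not the researchers' method): find the clusters of a tiny
# retweet network by computing connected components. An infiltrator account
# that retweets only one "side" lands inside that side's cluster.
from collections import defaultdict

# Hypothetical retweet edges: (account, account_it_retweeted)
edges = [
    ("@pro_A", "@pro_B"), ("@pro_B", "@pro_C"),      # one political cluster
    ("@anti_X", "@anti_Y"), ("@anti_Y", "@anti_Z"),  # the opposing cluster
    ("@troll_left", "@pro_A"),    # infiltrator retweeting the first side
    ("@troll_right", "@anti_X"),  # infiltrator retweeting the other side
]

def clusters(edges):
    """Return connected components of the undirected retweet graph."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
    seen, parts = set(), []
    for node in graph:
        if node in seen:
            continue
        stack, comp = [node], set()
        while stack:                      # depth-first traversal
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend(graph[n] - comp)
        seen |= comp
        parts.append(comp)
    return parts

for part in clusters(edges):
    print(sorted(part))
```

Run on the toy data, this yields two separate clusters, with each “troll” account sitting inside the community it mimicked – a crude stand-in for the two-cluster structure visible in the paper’s network graph.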
Traveling further down the evidence trail
“Acting the Part…” was published 19 months ago. It took years of research, and the efforts of FBI investigators working on the Mueller Report, to track down inauthentic social media accounts and to ascertain the full extent of Russia’s disinformation activities. Rather than restating that ground here, interested readers can refer back to a video I produced in Fall 2019 summarizing just one dimension of this DI campaign, which fomented divisiveness all over the U.S. in 2016. Here is a link:
As of June 2020, here are additional facts which we should consider:
1. Knowing that the U.S. and other nations were wise to its DI methods, Russia’s Yevgeny Prigozhin (perhaps the world’s No. 1 instigator of DI campaigns) changed strategy and is now “subcontracting” DI efforts by paying nationals in the countries being targeted to do the social media work. A study from Stanford University’s Internet Observatory, titled “Evidence of Russia-linked influence operations in Africa,” chronicled how this was being done in African nations. A link to the document is just below. Key findings include (again, quoted verbatim):
- In addition to well-known social media platforms such as Twitter and Facebook, the actors leveraged public WhatsApp and Telegram groups.
- The operation used social media engagement tactics designed to develop a close relationship with the audience, including Facebook Live videos, Google Forms for feedback, and a contest.
- The operation shared tactical similarities to Internet Research Agency activities; the operatives created several associated news sites (in one case staffed by reporters who appear to have spent time in Russia) as well as Facebook Pages that produced social-first content (memes, live videos).
2. The Wall Street Journal reported in early June that, during the late May period after Floyd’s murder, accounts originating in Pakistan and Botswana were posting “Facebook Live” videos purported to show “real time” incidents of police violence in the United States. Some of these videos were viewed by 20 million or more people before Facebook removed the accounts from its service.
3. Researchers at the Australian Strategic Policy Institute (ASPI) have been tracking Information Operations (DI campaigns) stemming from the People’s Republic of China. A recent report from ASPI’s International Cyber Policy Centre, titled Retweeting through the great firewall: A persistent and undeterred threat actor, finds that the Chinese Communist Party is engaging in ongoing DI campaigns about the democracy movement in Hong Kong, Covid-19, and other matters. Of special interest is the ongoing research of ASPI’s Elise Thomas. She is identifying individual Twitter accounts emanating from China as part of this DI effort, and pointing out how these accounts have recently changed their narrative and are now disseminating daily blasts of information in support of Black Lives Matter and in opposition to police in the U.S. (Here is a link to the research.)
ADD IT UP: Research from multiple respected universities and institutes, along with prominent governmental leaders of both the Republican and Democratic parties, concurs that Russian DI strategy is fueling dissension and ferment both for and against Black Lives Matter. Researchers in Australia who are carefully studying social media activity emanating from China add that the Chinese Communist Party is also involved. Forces from these hostile foreign powers are dividing and weakening the United States by actively running DI campaigns targeting U.S. citizens in the aftermath of George Floyd’s murder.
“Threat actors” is the term the Department of Homeland Security’s Analyst Exchange Program uses to describe those behind DI campaigns that co-opt circumstances with evil intent. These threat actors have helped fuel and incite violence within a good and noble effort, the just cause of racial equality.
Andropov was more right than he knew. But this time it is an entire country that seems to be becoming addicted to social media and falling under the sway of DI – disinformation.
RECENT DISINFORMATION DEVELOPMENTS
Kudos to Facebook on Friday for deciding to take additional measures against hate speech in advance of the November 2020 U.S. elections. For years now, FB has uncovered and removed “Coordinated Inauthentic Behavior” from its platforms. FB says in its June 26, 2020, release that “now we identify almost 90% of the hate speech we remove before anyone even reports it to us.” Facebook is striving mightily to balance free speech rights without becoming arbitrary and capricious. See for yourself. Here’s a link to its June 26 announcement: https://about.fb.com/news/2020/06/meeting-unique-elections-challenges/
Kudos also to Jane Lytvynenko, a BuzzFeed senior reporter with a focus on disinformation and online investigations, who (among her other reporting duties) devotes significant time and energy to outing inauthentic posts on social media.
Well worth your $$$: Thomas Rid and Nina Jankowicz have both written books about Russia and disinformation. Rid’s “Active Measures” and Jankowicz’s “How to Lose the Information War” are must-reads for those studying Russia in 2020.
Finally – Advice to Twitter CEO Jack Dorsey: Last time we checked, there is more than one person running for the office of President of the U.S. in 2020. If you want to provide a beneficial service to Twitter users and also help avoid the risk of losing Section 230 protections under the Communications Decency Act, why not assign fact checkers to ALL candidates for President? Take it even a step further, and ask universities specializing in public policy to assign a person from their research staffs to do the fact checking. That’s the smarter, less divisive way to proceed.
I will be engaged in academic research for the next several weeks, and will most likely not be making additional posts to this spot until late August or early September 2020. I do respond to emails with requests for fall speaking engagements at firstname.lastname@example.org
 Perez, Sarah https://techcrunch.com/2020/03/26/report-whatsapp-has-seen-a-40-increase-in-usage-due-to-covid-19-pandemic/
 Cooper, Daniel https://www.engadget.com/twitter-q1-2020-112603005.html
 Bond, Conor https://www.wordstream.com/blog/ws/2020/02/27/youtube-statistics
 Published in Proceedings of the ACM on Human-Computer Interaction, Vol. 2, No. CSCW, Article 20, November 2018.
 Grossman, Shelby; Bush, Daniel; and DiResta, Renée, “Evidence of Russia-linked influence operations in Africa,” Stanford Internet Observatory, November 2019.
 Horwitz, Jeff https://www.wsj.com/articles/live-facebook-protest-videos-drew-millions-of-views-but-some-footage-was-years-old-11591118628, Wall Street Journal, June 2, 2020
 Wallis, Jake; Uren, Tom; Thomas, Elise; Zhang, Albert; Hoffman, Samantha; Li, Lin; Pascoe, Alex; and Cave, Danielle, ASPI Policy Brief Report No. 33/2020.