In its October 2019 report, the Analytic Exchange Project (AEP) calls us – Jane Average and Joe Average Americans – possible unwitting actors in Disinformation (DI) campaigns. Specifically, AEP is describing people who see something on social media and then pass it along on their Facebook, Twitter, Instagram and other accounts. They don’t examine the source of the information carefully, usually because it matches something they already believe.
Here’s a recent example: There is a satirical “news” site called The Babylon Bee. Ponder that name for a moment, since the city of Babylon no longer exists; its ruins lie about 60 miles southwest of Baghdad. A basic Google or Safari search reveals that The Babylon Bee is a website devoted solely to satire. Think “Weekend Update” from Saturday Night Live when you see anything from The Babylon Bee, The Onion, or other satirical news sites.
Just before Christmas (Dec. 25), The Babylon Bee made up a story which “reported” that Donald Trump had done more for Christianity than Jesus. Here are a few sentences from the story: “Look what I’ve done,” he (Trump) said. “You can say ‘Merry Christmas’ now. In fact, if you say ‘Happy Holidays’ and don’t immediately make it clear you’re referring to Christmas, you go to prison.”
Anyone with reading comprehension beyond the fourth-grade level should be able to figure out that the narrative is satirical, don’t you think? Here’s a link to the original post.
Yet this past week, I’ve noticed numerous present and former elected officials in Northeast Ohio sharing this story from The Babylon Bee on their social media sites, frequently along with lengthy commentary about how terrible a President and person Donald Trump is.
Of course the commentaries were all the same, virtually word for word, as if they were a “cut and paste” from a master source.
STRONG SUGGESTION: If you see a satirical story appearing on a friend’s social media account, accompanied by a long opinion post about the subject of the satire, point it out! Post on their account that the story is made up, and that they shouldn’t be using it to comment upon or criticize others. Most of us don’t want to be unwitting actors in DI campaigns. Don’t spread false stories, even satirical ones, irrespective of your political bias.
Lots of fake “screenshots” of supposed news stories still circulate on Facebook, Twitter and elsewhere on social media. Susan Benkelman of the American Press Institute wrote about the problem, and her piece appears in the weekly column of the International Fact-Checking Network in this post.
KEY TAKEAWAY: People need to be reminded that just because something looks like a legitimate screenshot doesn’t mean it is one. That may seem obvious in today’s environment, but as long as people keep falling for fakes, it’s worth repeating. (See the bottom of this post for more proof.)
Evelyn Douek is an S.J.D. candidate at Harvard University who’s studying Digital Constitutionalism and Techlash. Writing for the Knight First Amendment Institute at Columbia University, Douek chronicles the creation, growth, and operation of global “content cartels” (her term) and the vital need for transparency and accountability in industry-wide content removal decisions. It’s an excellent and insightful paper.
KEY TAKEAWAY: One of my favorite questions anytime a student suggests that “the government” should step in to regulate social media communication is “Who decides who decides?” In a similar vein, Douek posits this: “Those concerned with monopoly power over public discourse should similarly be concerned about the rise of content cartels. But is it possible to keep the baby of helpful collaboration and throw out the bathwater of harmful cartels? In some areas and for some problems, platforms working together can be beneficial. But in which areas and how platforms collaborate is as important as that they do.”
Kudos to Facebook and its cybersecurity team, headed by Nathaniel Gleicher, for their ongoing efforts to remove DI campaigns and their harmful effects from the Internet. Last week FB security announced it had removed three unconnected networks of malicious actors from its platforms.
From the release: “The first operation originated in Russia and primarily targeted Ukraine and its neighboring countries. The second originated in Iran and focused mainly on the US. The third network originated in Myanmar and Vietnam and targeted audiences in Myanmar. Each of them created networks of accounts to mislead others about who they were and what they were doing.” Here is the link:
Finally, kudos also to First Draft for its ongoing full-day Live Simulations, training and immersion events, which are helping journalists and journalism students build core competencies in identifying disinformation and other online threats. Their next sessions are in El Paso, TX (2/21), Austin, TX (2/22) and Athens, OH (3/2).
Claire Wardle, Ph.D., founder and co-director of First Draft, has been addressing the problem of DI and DI campaigns for several years now. Here is a link to the First Draft website, where you can learn more and sign up for the Live Simulations.
FINALLY: Heed Mr. Lincoln’s advice.