PART ONE: A brief analysis of the People’s Republic of China’s Three Warfares, and how it conducts disinformation (DI) campaigns and media warfare

Lost amid endless speculation in social media about the outbreak of the Coronavirus (Covid-19) is a simple reality: The Chinese Communist Party (CCP), which runs the government of the People’s Republic of China (PRC), is in Year 17 of its San Zhan, or “Three Warfares,” strategy to gain long-term dominance in the world. China’s leadership has itself identified public opinion and media warfare (its terms) as key activities in the second of these three warfare areas. Those three areas are:

  1. Legal
  2. Public Opinion
  3. Psychological
[Photo caption: A medical worker takes a swab sample from a resident to be tested for the CCP virus in Wuhan, China, on May 15, 2020. (STR/AFP via Getty Images)] This appeared in The Epoch Times, Dec. 2, 2020.

A critical component of the second area, public opinion, is disinformation (DI). The PRC employs DI campaigns extensively in an effort, sometimes successful, to shape and mold public opinion across the globe in its favor. The truth is irrelevant in China. Social media is used as a tool to keep the populace misinformed, to suppress opposition, and to shape internal opinion in whatever way is deemed most beneficial to the CCP. Externally, the PRC employs a vast army of millions of social media campaigners and propagandists, some disguised as journalists working abroad, to tell its story.

Media in the U.S. and elsewhere has caught on to some aspects of this. Researchers at Stanford University and the University of San Diego concluded that the CCP puts out nearly 450 million fake social media posts a year.[1] Much to its credit, the New York Times has at times aggressively covered Chinese DI campaigns. The media outlet reported on this in August and September 2019, when Twitter, Facebook, and YouTube removed social media accounts linked to China; Twitter alone eliminated about 1,000 accounts and suspended 200,000 more. Ironically, China prohibits Twitter within its own borders but makes extensive use of it in DI campaigns aimed at other countries, and especially at Chinese living abroad. In Beijing alone, China has two million people working on propaganda and DI campaigns. “The end goal is to control the conversation,” Matt Schrader, a China analyst with the Alliance for Securing Democracy at the German Marshall Fund in Washington, told the Times reporters.[2]

This is the first in a multi-part series about China and DI campaigns. Today’s portion provides important background and covers some of China’s practices prior to the Covid-19 outbreak. The next part will deal exclusively with the Coronavirus and how the CCP has responded to this crisis, lied about aspects of it, and attempted to propagandize it into pro-Chinese and anti-Western themes.


Stefan Halper is a professor emeritus at the University of Cambridge, where he was once Director of American Studies in the Department of Politics and International Studies. There he lectured on late-20th-century U.S. foreign policy, China, and contemporary international security issues. Halper holds doctorates from Oxford and Cambridge. He is a Life Fellow of Magdalene College, Cambridge. Halper has served four American presidents in the White House and Department of State.[3]

More recently, Halper became entangled in the FBI’s “Crossfire Hurricane” scandal, allegedly helping conduct government surveillance of the Trump campaign in 2016. While noteworthy, Halper’s involvement in these activities took place three years AFTER he had prepared and submitted a 559-page report, titled China: The Three Warfares, for the Office of Net Assessment.

What is the Office of Net Assessment? Created in the early 1970s, it is an independent organization within the Department of Defense charged with identifying emerging or future threats to, and opportunities for, the United States.[4] Prior to 2013, Halper had produced two similar analyses for the Office of Net Assessment: The Iraq War in 2005 and The Afghan End Game in 2010.

A comprehensive study of China: The Three Warfares would turn anyone into an expert on China’s government, military, and diplomacy. The work lists six project advisors, three of them retired U.S. Navy admirals, and 11 contributors, all field experts, whose papers and interviews are part of the overall project. The main report carries about 850 footnotes, citing dozens of primary sources as well as hundreds of open-source materials and secondary sources (news reports, briefs).

Here’s just one example: Uday Bhaskar is a retired Commodore in India’s navy. One of the contributors to China: The Three Warfares, Bhaskar wrote a paper within the report titled “China’s Three Warfares Concept Related to India and the Indian Ocean Region.” In it, Bhaskar writes, “There is one suggestion that the Chinese have dug deep into their own historical records of military strategy, going back to Sun Tzu (c. 540 BC) who laid great emphasis on the imperative of ‘winning without engaging in War.’”[5] Today Bhaskar is the director of the Society for Policy Studies, a New Delhi-based independent think tank.

[Graphic depicting the Century of Humiliation]

Halper and his collaborators provide important historical context in China: The Three Warfares. The report covers the Century of Humiliation, the period in China’s history when Western powers, Russia, and Japan all extracted concessions from an ever-weaker Chinese government. This began in the 19th Century and continued into the 20th, when China suffered greatly at the hands of Imperial Japan. Vestiges of this period exert a strong influence on the thinking of China’s leaders, who remain concerned about cultural identity despite their nation’s burgeoning manufacturing base and economy. “Sweeping Western influence is not a new problem,” reads a 2011 opinion article on Xinhuanet, the website of Xinhua, the PRC’s official news agency. “As an importer of cultural products, ideas and technologies since the 19th Century, China has every reason to worry about its cultural identity.”[6]

In more recent times, China has come to believe that the United States and (to a lesser extent) NATO used propaganda and public opinion strategies to obtain widespread support for the first Persian Gulf War in 1990-91, for removing Slobodan Milosevic from power in Serbia in the late 1990s, and then for the second Persian Gulf War (Operation Iraqi Freedom) in 2003. “Indeed, the ability of coalition forces to undermine popular support for the (removal of the) Milosevic and Saddam Hussein regimes, influence global views, and preserve domestic support are seen by the PRC as key factors in the outcome of each conflict,” writes Dean Cheng, Research Fellow in Chinese Political and Security Affairs in the Asian Studies Center at the Heritage Foundation.[7]


In his project, Halper traces the origins of San Zhan, the Three Warfares (TW), to 2003, when the CCP published them as “political work regulations” for the People’s Liberation Army (PLA). Although the individual(s) in China who developed the strategy are not identified, writings about it clearly describe TW as asymmetrical and warlike in practice. Its purpose is to shift the mindset of China’s political and military leaders away from conventional warfare, expanding the arena of conflict into the political realm of opposing nations, the manipulation of public opinion, and legal systems as well.

One key aspect of this is a strong desire on the part of the PRC to improve its projection of soft power. Joseph Nye, who later served in the Clinton administration, coined this term, defining it as the attractive pull of a nation’s cultural, ideological, and institutional policies.[8] Nye believed that soft power could help a nation such as the U.S. shape the world. “If a state can make its power seem legitimate in the eyes of others, it will encounter less resistance to its wishes,” he argued. “If its culture and ideology are attractive, others will more willingly follow.”

Cheng posits that China has been striving assiduously to counter an American advantage in global access and coverage. For example, China’s propaganda guidelines call for it to seek news dominance (xinwen quan) and information dominance (xinxi quan) on a path leading to psychological dominance (xinli quan).[9] To this end, in September 2011 the Chinese Foreign Ministry began offering daily press briefings, replacing the previous twice-a-week schedule. China also created a 24-hour English-language global news service (CNC World English Channel) and expanded its state-owned China Central Television (CCTV) to operate on a more global level.[10]

China has also poured tremendous resources into social media and the internet (more about this in Part Two). Another strategic effort aimed at soft power is the development and expansion of Confucius Institutes, which promote Chinese language training and provide information about China’s education, culture, economy, and society. As of 2019, there were more than 530 Confucius Institutes in dozens of countries on six continents.

Within the PLA, there are four stated goals in the area of media warfare:

  1. Preserve friendly morale
  2. Generate public support at home and abroad
  3. Weaken an enemy’s will to fight
  4. Alter an enemy’s situational assessment.[11]

To accomplish these, Chinese strategists and tacticians follow “Four Pillars of Media Warfare.” Planners in the PLA pursue these four points:

  1. Top-down guidance: Media warfare is consistent with the larger national strategy, as outlined by the senior leaders of the PRC and the CCP. Tacticians follow high-level guidance on both the content and the timing of any news.
  2. Pre-emption: The first to broadcast or release news on social media gains the advantage by dominating the message and better framing the debate, which then defines the parameters of subsequent coverage.
  3. Flexibility and responsiveness to changing conditions: Operations remain flexible and adjustable given different political and military circumstances.
  4. “All available” resources: China combines peacetime and military operations to pursue civilian-military integration and local unity to leverage both its commercial and civilian assets (news organizations, broadcasting facilities, internet users) in a comprehensive and systematic campaign.[12]

Peter Mattis, who worked as an international affairs analyst for the U.S. and is now a Fellow in the China Program at the Jamestown Foundation, details the steps which China’s government and its PLA take in a crisis with respect to the U.S. They are:[13]

  1. Establish China’s version of the incident: Beijing will issue statements at or near the beginning of each crisis in order to establish a Chinese position on what happened.
  2. Statement of principles for resolution of an incident: Chinese officials will point to these at the start of any negotiations as setting the parameters for the discussions to follow. These are considered minimally acceptable points that meet Beijing’s commitments to the Chinese public. They incorporate the TW concept and are presented to both foreign and domestic audiences.
  3. Shut down any unofficial but normal information channels: U.S. officials often complain that their Chinese counterparts will refuse any communication, including personal channels, once a crisis begins. That is because PRC leadership is establishing information control and dominance of the media and (if possible) social media so as to continuously frame and shape the ensuing debate.
  4. Emphasize Beijing’s commitment to the U.S.-China relationship: This is a version of the “blame game,” in which the PRC makes the crisis a test of U.S. goodwill and future intentions toward China. Usually at the onset, Beijing will firmly express its own commitment to bilateral relations, implying that Washington does not take the relationship as seriously as China does.[14]


The Director of National Intelligence, the Department of Homeland Security, representatives of the FBI and the Northern California Regional Intelligence Center, the MITRE Corp., Booz Allen Hamilton, and others have written an Analytic Exchange Program white paper titled Combatting Targeted Disinformation Campaigns, published in October 2019. [Email me at, and I’d be glad to share a copy with you.] A sentence of the executive summary reads, “…disinformation campaigns should be viewed as a whole-of-society problem requiring action by government stakeholders, commercial entities, media organizations, and other segments of a civil society.”[15] Another section says the following:

A targeted disinformation campaign … is more insidious than simply telling lies on the internet. One untrue meme or contrived story may be a single thread in a broader operation seeking to influence a target population through methods that violate democratic values, societal norms, and in some jurisdictions, the law.

A disinformation campaign occurs when a person, group of people, or entity (a “threat actor”) coordinate to distribute false or misleading information while concealing the true objectives of the campaign. The objectives of disinformation campaigns can be broad (e.g., sowing discord in a population) or targeted (e.g. propagating a counternarrative to domestic protests) and may employ all information types (disinformation, misinformation, malinformation, propaganda, and true information). The target of a disinformation campaign is the person or group the threat actor aims to influence in order to achieve the campaign’s objectives.

In the white paper, AEP chronicled the PRC disinformation campaign aimed at discrediting the protestors and the larger pro-democracy movement in Hong Kong in 2019 as one of its two examples of disinformation campaigns. Citing The Guardian (UK) and the Washington Post, AEP pointed out how social media platforms removed or suspended more than 200,000 fraudulent accounts circulating false information.[16]

The New York Times reported that China’s strategy was to create an alternative version of events in which the protests would lead only to bloodshed and violence. China asserted that the protests were not supported by Hong Kong residents and were provoked by foreign agents. China’s goal was to undermine sympathy for the seven million residents of Hong Kong and for the protesters’ demands for greater freedoms. The protests stemmed from a Chinese government-written bill presented in spring 2019 that would have allowed residents accused of crimes to be sent for trial to places with which Hong Kong has no extradition treaty, mainly mainland China.[17]

As the protests continued throughout the summer of 2019, China ramped up a two-pronged DI campaign aimed at internal and external audiences. The AEP white paper pointed out that the “threat actor” originating this disinformation campaign was the Chinese government, and that the campaign established fake Facebook and Twitter profiles posing as Americans living in Nevada, Ohio, and Texas. Additionally, the Chinese used their state-run media (China Daily, Xinhua News, CGTN) to place paid advertisements on Twitter and Facebook. The main purpose of the campaign was to discredit the pro-democracy movement. The campaign pushed narratives praising the police and depicting the Hong Kong protestors as terrorists and cockroaches.[18]

As the protests escalated, police tactics turned more violent and China intensified an already-aggressive DI campaign aimed at both internal and external audiences. The internal messaging has been especially vile and heinous. Here’s an example: Weibo, a Chinese-controlled social media service similar to Twitter, carried posts calling for violent action against the protestors.

“Beating them to a pulp is not enough,” one person said about protesters on Tuesday, echoing an increasingly common sentiment on Weibo. “They must be beaten to death. Just send a few tanks over to clean them up.”[19]

AEP’s white paper cites the extensive use of bots, which DI campaigners employ to greatly amplify their messages. A “bot” is a computer algorithm designed to execute online tasks autonomously and repetitively, simulating the behavior of human beings in social networks and interacting with social media users by sharing information and messages. According to one researcher, in 2017 there were 23 million bots on Twitter, 140 million bots on Facebook, and 27 million bots on Instagram. About 5 to 8 percent of all social media accounts are NOT authentic but instead are bots.[20]

According to AEP, China’s government made extensive use of bots to repost and spread false narratives against the Hong Kong protestors. It classifies “threat actors” as those who originate the campaigns. If you “Share” or “Retweet” a fake story, AEP classifies you as an “Unwitting Agent” of the disinformation effort.[21]

Like their counterparts in Russia, Iran, and elsewhere, the Chinese propagandists and DI campaigners are adept at selecting words, pictures, and phrases designed to evoke emotions and to get you – and others – to “Share” their posts. Thus you might unknowingly help their efforts.

Governments, businesses small and large, and media conglomerates have all become intertwined economically with the PRC over the past 20 years. At the same time, China has greatly expanded its propaganda and DI campaign activity. Only after comprehending Three Warfares, San Zhan, and its ramifications can one reach logical conclusions with respect to how PRC has approached the Covid-19 outbreak. Details are coming in Part Two.


[1] Drew, Kevin “Social Media With Chinese Characteristics,” US News & World Report, June 2016, retrieved from

[2] Zhong, Raymond; Myers, Steven; and Wu, Jin “How China Unleashed Twitter Trolls to Discredit Hong Kong’s Protesters,” September 2019, retrieved from

[3]Halper, Stefan “China: The Three Warfares,” May 2013, Office of Net Assessment, retrieved from

[4] Gady, Franz-Stefan “The Future of Net Assessment at the Pentagon,” June 2015, The Diplomat, retrieved from

[5] Halper, op. cit., page 476

[6] Xinhuanet, “China’s Cultural Security Lies in Openness and Exchanges,” October 2011, retrieved from

[7] Cheng, Dean, “Winning Without Fighting: China’s Public Opinion Warfare and the Need for a Robust American Response,” Heritage Foundation Backgrounder, November 2012, p. 3, retrieved from

[8]  Li, Eric “The Rise and Fall of Soft Power,” Foreign Policy, August 2018, Retrieved from

[9] Cheng, op. cit., p. 7

[10] Ibid

[11] Ibid, p. 4

[12] Ibid

[13] Mattis, Peter, “Out with the New, In with the Old: Interpreting China’s ‘New Type of International Relations,’” Jamestown Foundation, China Brief, Volume 13, Issue 9, April 25, 2013

[14] Ibid

[15] Analytic Exchange Program, Combatting Targeted Disinformation Campaigns, October 2019, p. 2

[16] Ibid, p. 19

[17] Myers, Steven Lee and Mozur, Paul, “China is waging a disinformation war against Hong Kong protesters,” New York Times, August 18, 2019, retrieved from

[18] AEP, op. cit., pages 19-20

[19] Ibid

[20] Center for Information Technology and Society, “How is Fake News Spread? Bots, People like You, Trolls and Microtargeting,” U.C. Santa Barbara, retrieved from

[21] AEP, op. cit., page 21

Coronavirus also an infodemic, and are you a puppet of Vladimir?

Posts in recent weeks, combined with the work of researchers investigating disinformation (DI) campaigns, make it apparent that many people have become unwitting actors in DI. If you’re paying any attention to the news, you’ve seen newspaper reports in the past week citing government sources stating that Russian disinformation efforts in 2020 are trying to help the candidacies of Senator Bernie Sanders and incumbent President Donald Trump.

That is the least surprising thing you will read in this post. Put yourself in the shoes of any enemy of the United States. If you wanted to sow discord and dissent in the U.S., and you wanted to do it through the election process, which candidates represent the most extreme factions of the two major parties?

What’s more important is HOW this is happening. Shelby Grossman, a researcher at the Stanford Internet Observatory, recently explained details about Russia’s recent disinformation activities in Africa on the website of the Center for African Studies (CAS). The CAS is an academic institution, founded by the State Department in 1999 and funded by Congress; one of its purposes is to study security issues related to Africa. (An aside: Russia has vast interests in Africa, both politically and economically, which Grossman documents excellently in her work.)

Here is what Grossman is seeing in her research on social media and political messages stemming from Russian influence in Africa:

A marked upward spike in views and readership. She tracked new pages and posts that appeared in social networks in war-torn Libya, following Facebook’s removal of Russian-based accounts there in 2018. The “new” pages which sprang up were almost entirely based in Libya. And they were popular.

“Almost across the board, the pages had high levels of engagement. The 73 inauthentic pages we analyzed posted 48,000 times, received more than 9.7 million interactions, and were liked by over 1.7 million accounts,” Grossman says. “This suggests that the content the campaigns created resonated with people who, in turn, responded to it.”
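Grossman’s raw numbers make the point on their own; here is a quick back-of-the-envelope set of averages, using only the figures quoted above:

```python
# Averages implied by the engagement figures Grossman reports
# for the 73 inauthentic Libyan pages.
pages = 73
posts = 48_000
interactions = 9_700_000
page_likes = 1_700_000

print(round(interactions / posts))   # interactions per post
print(round(posts / pages))          # posts per page
print(round(page_likes / pages))     # account likes per page
```

Roughly 200 interactions per post and over 23,000 likes per page is strong engagement for any page, let alone an inauthentic one, which is what makes her finding that the content “resonated” so striking.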

Grossman posits that Russia developed a new DI strategy and tried it out in Africa in 2018-2019. The same strategy is now in effect in the U.S. It has three parts:

  1. Reliance upon “in country” posts and pages rather than posts created by people based in Russia or some other foreign country.
  2. The use of “local actors” to create and disseminate content.
  3. Posting from within the U.S., so that these social media “warriors” can circumvent the security systems which Facebook, Instagram, Twitter, and other social media platforms established in reaction to what happened in 2016. The “local actors” might be Russian agents living legally in the U.S., unwitting agents who are simply being paid to spread stories on social media, or a combination of both.


What this means is that U.S.-based social media pages and posters will circulate images and stories to continue their efforts to influence public opinion about candidates and issues. Below is a post that appeared all over the country just a few days ago. It criticizes Sen. Bernie Sanders by saying “the math” of his tax policies doesn’t add up.

The mathematics in the post is correct. Its explanation of the Senator’s tax proposal is not accurate. But that doesn’t matter to the DI campaigners, whose goal is to further divide Americans against one another.

I do not have proof (I’m not an information technology expert), but I strongly suspect that the origins of this and other memes came from Russian actors trying to stir up dissent and divisiveness in the U.S. during this election campaign.

More tips you can use

Want to avoid being a puppet of disinformation efforts? Take some time to educate yourself and use some of the good free tools and plug-ins at your disposal.

Facebook added a Page Transparency feature, so anyone can learn more about who is behind any page. Some of the details displayed there include when a page was created, the primary country where the page is managed, and the number of people who manage the page in each country. Facebook has added a Confirmed Page Owner process as well. It’s a step whereby a manager claims ownership of a page or has filed the necessary paperwork to run advertisements about social issues, politics, or elections. Here’s a link to the page with details:

CrowdTangle, an analytics tool owned by Facebook, offers a Chrome browser plug-in that makes it easy to see how often any link has been shared, who shared it, and what they said. The extension will show you specific Facebook posts, Instagram posts, Tweets, and Subreddits that mention the link. It also works for Facebook videos, YouTube videos, articles, and more. Go to to put it to work for you.


No, 100,000-plus people have not died in the outbreak of the Coronavirus. The actual death toll is about 2,700, out of about 80,250 cases (stats are as of Feb. 25). And no, the Chinese government is not burning bodies in Wuhan. Members of the Poynter Institute’s International Fact-Checking Network have identified at least three different groups actively spreading misinformation about the Coronavirus, which carries the formal World Health Organization (WHO) designation COVID-19.
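As a sanity check on the scale of the exaggeration, the figures above can be run through a quick back-of-the-envelope calculation (a naive case-fatality rate, which ignores unresolved cases and any undercounting):

```python
# Back-of-the-envelope check on the Feb. 25 figures cited above.
deaths = 2_700
cases = 80_250

naive_cfr = deaths / cases              # crude case-fatality rate
print(f"naive CFR: {naive_cfr:.1%}")    # roughly 3.4 percent

# The viral "100,000+ dead" claim would inflate the reported
# death toll roughly 37-fold:
print(round(100_000 / deaths))
```

In other words, the false claim doesn’t merely round the numbers up; it multiplies the reported death toll by nearly forty.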

To counter the lies circulating around the Coronavirus, WHO is taking an aggressive stand in communicating with the public and key stakeholders. It is presenting hard facts, providing daily updates, and engaging in direct dialogue with executives at Fortune 500 companies, asking those corporations to share facts with their employees via corporate communication channels. This is an excellent strategy, as research shows employees are highly likely to believe information coming from their employers in such situations.

[Image caption: This image was falsified, and spread tens of thousands of times virally on various social media platforms.]

This misinformation about the Coronavirus has become so vast some are even calling it an infodemic. “We need a vaccine against misinformation,” said Dr. Mike Ryan, head of WHO’s health emergencies program, at a WHO briefing on the virus earlier this month.

WHO has collaborated with social media platforms to ensure that its website is at the top of internet searches. The organization is updating its website every day with the latest facts, and it is also using Twitter, Facebook, and other platforms to release factual details. The WHO’s and the CDC’s websites about the Coronavirus are linked below:


I’ve signed up for First Draft’s Live Simulation, and will be at Ohio University for the training on March 2. If you’re a professional journalist working in a newsroom in Cleveland, Columbus, Pittsburgh, Charleston, or other cities near Athens, Ohio, you should consider making the trip and spending a day preparing yourself and your newsroom for disinformation campaigns which will inevitably target you.

Here’s the link with details and where you can sign up:


Unwitting actors, fake screenshots and more

In its October 2019 report, the Analytic Exchange Program (AEP) calls us (Jane and Joe Average Americans) possible unwitting actors in disinformation (DI) campaigns. Specifically, AEP describes people who see something on social media and then pass it along on their Facebook, Twitter, Instagram, and other accounts. They don’t examine the source of the information carefully, usually because it matches something they already believe.

Here’s a recent example: There is a satirical “news” site called The Babylon Bee. Ponder the name for a few seconds, since the city of Babylon no longer exists; its ruins are about 60 miles southwest of Baghdad. A basic Google search shows that The Babylon Bee is a website comprised solely of satire. Think “Weekend Update” from Saturday Night Live when you see anything from The Babylon Bee, The Onion, or other satirical news sites.

Just before Christmas, The Babylon Bee made up a story which “reported” that Donald Trump had done more for Christianity than Jesus. Here are a few sentences from the story: “Look what I’ve done,” he (Trump) said. “You can say ‘Merry Christmas’ now. In fact, if you say ‘Happy Holidays’ and don’t immediately make it clear you’re referring to Christmas, you go to prison.”

Anyone with better than a fourth-grade reading comprehension level should be able to figure out that the narrative is satirical, don’t you think? Here’s a link to the original post.

Yet this past week, I’ve noticed numerous current and former elected officials in Northeast Ohio sharing this story from The Babylon Bee on their social media accounts, frequently along with lengthy commentary about how terrible a President and person Donald Trump is.

Of course, the commentaries were all the same, virtually word for word, as if they had been cut and pasted from a master source.

[Image: an illustration accompanying another Babylon Bee story]

STRONG SUGGESTION: If you see a satirical story appearing on a friend’s social media account, accompanied by a long opinion post about the subject of the satire, point it out! Post on their account that the story is made up, and that they shouldn’t be using it to comment upon or criticize others. Most of us don’t want to be unwitting actors in DI campaigns. Don’t spread false stories, even satirical ones, irrespective of your political bias.


Lots of fake “screenshots” of supposed news stories still circulate on Facebook, Twitter, and elsewhere on social media. Susan Benkelman of the American Press Institute wrote about the phenomenon in the weekly column of the International Fact-Checking Network, in this post.

KEY TAKEAWAY: People need to be reminded that just because something looks like a legitimate screenshot doesn’t mean it is one. That may seem obvious in today’s environment, but as long as people keep falling for fakes, it’s worth repeating. (See the bottom of this post for more proof.)

Evelyn Douek is an S.J.D. candidate at Harvard University studying digital constitutionalism and the techlash. Writing for the Knight First Amendment Institute at Columbia University, Douek chronicles the creation, growth, and operation of global “content cartels” (her description) and the vital need for transparency and accountability in industry-wide content removal decisions. It’s an excellent and insightful paper.

KEY TAKEAWAY: One of my favorite questions, anytime a student suggests that “the government” should step in to regulate social media communication, is: Who decides who decides? In a similar vein, Douek posits this: “Those concerned with monopoly power over public discourse should similarly be concerned about the rise of content cartels. But is it possible to keep the baby of helpful collaboration and throw out the bathwater of harmful cartels? In some areas and for some problems, platforms working together can be beneficial. But in which areas and how platforms collaborate is as important as that they do.”

Kudos to Facebook and its cybersecurity team, headed by Nathaniel Gleicher, for their ongoing efforts to remove DI campaigns and their harmful effects from the internet. Last week, Facebook security announced it had removed three unconnected networks of malicious actors and their accounts.

From the release: “The first operation originated in Russia and primarily targeted Ukraine and its neighboring countries. The second originated in Iran and focused mainly on the US. The third network originated in Myanmar and Vietnam and targeted audiences in Myanmar. Each of them created networks of accounts to mislead others about who they were and what they were doing.” Here is the link:

Finally, kudos also to First Draft for its ongoing full-day Live Simulations, training and immersion events that are helping journalists and journalism students build core competencies in identifying disinformation and other online threats. The next sessions are in El Paso, TX (2/21), Austin, TX (2/22), and Athens, OH (3/2).

Claire Wardle, Ph.D., founder and co-director of First Draft, has been addressing the problem of DI and DI campaigns for several years. Here is a link to the organization’s website, where you can learn more and sign up for the Live Simulations.

FINALLY: Heed Mr. Lincoln’s advice.

BRIEFS (Jan. 27, 2020)


A poll last week provided some insight into public opinion about disinformation. The results of one question are highlighted in my “Fact of the Week” visual.

Also, according to the survey, 35 percent of the public identifies “Misleading information” as the biggest threat to keeping our elections safe and accurate.

Telling excerpt: “Although there is no evidence that any votes were changed by a foreign power in 2016 or 2018, almost 4 in 10 Americans surveyed said they believe it is likely another country will tamper with the votes cast in 2020 in order to change the result.”

Insightful: the poll designers didn’t ask, or probe for, what the public thinks should be done about the malicious actors from Russia, China, Iran, and other nations that develop and propagate disinformation campaigns against the United States.

    *     *     *     *     *

If you are reading and/or sharing stories online or in social media from the American Herald Tribune, you’re an unwitting actor in an Iranian disinformation campaign. Below is a link to a CNN story exposing this online publication as being sponsored by Iran. An excerpt:

“As Russia did around the 2016 election, Iran appears to have co-opted and in this case paid a small number of unwitting Americans to lend legitimacy to its operations.”


Jim Lehrer, a first-rate journalist and one of the creators of the News Hour on PBS, died this past week. He was highly respected in his profession, and was an excellent author as well.

Lehrer had eight rules to live by as a journalist. The Wall Street Journal link below contains them. How many of these eight do journalists follow in 2020?

Mid-Jan. 2020 DI roundup


Lately there has been intense focus on Facebook and the decisions this social media giant has made with respect to tools of disinformation. Facebook executives have chosen to give their users a lot of latitude regarding what some might call fake news, “deep fake” videos, and the like. In this professor’s opinion, that is the correct approach. Here’s a simple suggestion: rather than react to what others post about Facebook, why not look at the company firsthand through its own statements? Here’s a link to the corporation’s news releases:

More specifically, here’s a link to the January 9, 2020, news release Facebook issued about its political ad policy. In this release, Facebook announced ways its users can apply additional controls to their Facebook and Instagram accounts.

Here’s an excerpt: “We have based ours (policies) on the principle that people should be able to hear from those who wish to lead them, warts and all, and that what they say should be scrutinized and debated in public. This does not mean that politicians can say whatever they like in advertisements on Facebook. All users must abide by our Community Standards, which apply to ads and include policies that, for example, ban hate speech, harmful content and content designed to intimidate voters or stop them from exercising their right to vote. We regularly disallow ads from politicians that break our rules.”

If this is of great interest, you might also pay attention to the policies Facebook implemented in November 2019 for the December United Kingdom elections. That story is just beneath the above link in the Facebook online newsroom.

We all have a large variety of tools at our disposal for checking the veracity of the social media posts we see in our news feeds. In past posts, I’ve suggested websites where you can copy and paste an image to learn the origin and usage of any picture or meme you might see on social media. Here are some other tools:

Martin Luther King, Jr. (left) at Oberlin College, 1965. See bottom of this post for why Rev. Dr. King is included in this story.

This is part of the Annenberg Public Policy Center.

This site, part of the International Fact Checking Network, monitors and updates about 3,000 media outlets.


Bob Frantz, whose Bob Frantz Authority show runs each morning on WHK 1420 “The Answer,” interviewed me on January 13 about disinformation. Here’s a link to the interview. Fast-forward to about halfway through the first hour of the podcast to hear the segment.

My professorial colleague and friend Ron Rychlak gave an interview to Joshua Philipp of The Epoch Times on disinformation. Rychlak and I are fellow alumni of Wabash College. Today he is the Jamie L. Whitten Chair of Law and Government at the University of Mississippi. He’s also the co-author of “Disinformation,” a work that excellently chronicles the history of DI campaigns by the former Soviet Union and, today, Russia. Philipp also provides some education about propaganda, as well as about how China conducts DI campaigns, in this interview. Here’s a link:


The Harvard Kennedy School is launching an academic journal titled the Harvard Kennedy School (HKS) Misinformation Review, an open-access, interdisciplinary, scholarly journal focused on all aspects of misinformation. Above is a link to some details.

The Reporters’ Lab focuses on journalism research. It’s based in the Sanford School of Public Policy at Duke University. Among other things, Reporters’ Lab maintains a database of global fact checking. It is now also seeking to establish a Media Review, a system to investigate both photos and videos to see if they have been altered or manipulated in any way. Below is a link to a story about this:


For about a year and a half, the Associated Press has provided a weekly “Fake News” feature for its members. My local paper, the Akron Beacon Journal, has run this feature in its Sunday edition for some time now. Here’s a link you might want to use and refer to on a regular basis, as it serves as another useful fact-checking mechanism (albeit only once a week).

Erik Wemple is the media critic for the Washington Post. Over the past several weeks, he has written a 10-part series about the Steele dossier. One might give his series an unofficial title: “How the media got the story wrong.” This is not a trivial matter, as news coverage of the dossier, now discredited as unverified, served as the catalyst for a floodgate of media stories stating, in some way, shape or form, that Russia colluded with the Trump campaign in 2016. Ponder the Fact of the Week at the top of this blog post as you peruse Wemple’s series, linked below.

If you want to learn more about how the media regards its responsibility in the areas of fake news and disinformation, here is a link to the series of reports from Stanford University’s Institute for the Future on this subject:


The Director of National Intelligence, the Department of Homeland Security, representatives of the FBI and the Northern California Regional Intelligence Center, the MITRE Corp, Booz Allen Hamilton, and others have written an Analytic Exchange Program White Paper titled Combatting Targeted Disinformation Campaigns. “We view this issue (DI campaigns) as a whole-of-society problem requiring a whole-of-society response,” reads a sentence in the paper’s executive summary. The white paper has five sets of strong recommendations. Here is a link to the white paper.

Analytic Exchange Program’s “Combatting Targeted Disinformation Campaigns”


Vanessa Otero speaks at West, February 6

Cuyahoga Community College’s Carol Franklin Social Science Speaker Series will have Vanessa Otero, J.D., appearing at its Western Campus on Thursday, February 6. Otero is the creator of the Media Bias Chart. In a previous story, I mentioned the importance of recognizing media bias and encouraged the use of a chart or frame of reference as we read, view, and listen to media. Above is a graphic containing details about Otero’s visit.

This will be the last post on this site for a few weeks, as I re-focus my efforts on additional research about how DI campaigns are affecting democracies and public discourse.  Look for the next post in February 2020.

You can reach me at or at 216-987-5040. Many thanks to my professorial colleagues (including one in New Zealand) who have already invited me to come to their campuses and speak on the vital subject of disinformation and DI campaigns.


In June 1965, Rev. Dr. Martin Luther King Jr. was invited to receive an honorary doctorate and speak at Oberlin College. Here’s a line from his speech: “We must learn to live together as brothers or perish together as fools.” (Link to the speech: “Remaining Awake Through a Great Revolution.”)

At a time when the enemies of democracy and freedom are striving mightily to drive us apart as a nation through DI campaigns, Rev. Dr. King’s words ring just as true in 2020 as they did 55 years ago.

Iran DI & propaganda campaigns

Recent events in the Middle East serve to illuminate the mendacity of Iran’s government when it comes to communications. Iran has become adept at both propaganda and disinformation (DI) campaign activities. The propaganda aspect has been on full display over the past two weeks. Some American media outlets have swallowed Iran’s propaganda statements “hook, line and sinker,” as the old expression goes.

But many in the media and public are unaware of the extent of Iran’s DI program, and how the two are intertwined. Iran has been more active in using social media for disinformation than Russia’s so-called “Internet Research Agency” was during the run-up to the 2016 elections in the U.S. Here are some details:


According to Freedom House, Iran is one of the least free nations on earth. With “1” being most free on its scale, Iran rates a 6 out of 7 for political rights, for civil liberties, and in its overall freedom rating. Iran has killed at least 143 journalists, and 24 foreign journalists reporting there, since 1992, according to the Committee to Protect Journalists. Reporters Without Borders describes Iran as “one of the world’s most repressive countries for journalists for the past 40 years.”

Two months ago, Iran effectively shut down internet access in the country in response to rising public protests over fuel price increases and other government actions. According to NetBlocks, an NGO (non-governmental organization) that monitors internet and cybersecurity issues worldwide, Iran’s internet traffic was reduced by 95 percent in November 2019.

“Journalism in Iran near extinction” is the headline of a story the Washington Post’s Jason Rezaian wrote in August 2019. Rezaian chronicles both the extreme censorship Iran imposes on Iranian journalists still trying to report the news fairly, and how Iran is worsening its already-horrible treatment of foreign journalists. Perhaps no one alive is more knowledgeable about this than Rezaian. The Post’s Tehran correspondent from 2012 to 2016, he was unjustly imprisoned in Iran for 544 days.

How much of this has been reported in the news media you have relied upon for stories about rising tensions between the U.S. and Iran? Most likely, all you saw were stories that used Iranian propaganda as sources. The more Iran cracks down on journalists, the more its government officials issue outright lies about what happens in the country.

For example, MSNBC’s Tehran bureau chief reported on the air on the night of January 7 that Iran’s missile attacks had killed 30 American soldiers and leveled the Al-Asad base in Iraq. MSNBC was broadcasting statements, completely unverified, from Iran’s government. It turned out there were no American deaths stemming from the missile attack, and the base suffered minor damage.

Of course, those living in Iran were told otherwise. Iranian television news said the missile strikes killed 80 “American terrorists” and gave the attack the name Operation Martyr Soleimani.

Then, just several hours later, NPR and many other media outlets reported on the crash of Ukraine International Airlines Flight 752, a civilian airliner that took off from Tehran’s airport early on January 8. The news stories almost all accepted Iran’s initial account that a “technical malfunction” caused the crash. The NPR account cited four different Iranian sources in its story, broadcast nationally and worldwide on the morning of January 8.

Iran made other claims in the ensuing days, including a statement that it would not turn over the plane’s “black boxes,” or flight data recorders, to the plane’s manufacturer, Boeing, or to any U.S. investigators. On Friday, January 10, Iran’s government spokesman Ali Rabiei said the suggestion that Iran shot down the plane was a “big lie.”

Yet as this unfolded, many nations came forth with satellite, radar, and other evidence proving that surface-to-air missiles (SAMs) had shot down the passenger jet. Iran finally admitted on January 11 that its military forces had fired the SAMs that brought down Flight 752, killing 176 people. (It is believed Iran used a Tor-M1 missile defense battery, which it bought from Russia in 2005, to shoot down the jet.)

In this same time frame, Iran has been employing its DI resources to try to convince the world that all of this was the fault of U.S. President Donald Trump. That’s no surprise, as Iran has been utilizing DI campaign strategies for more than a decade now, and it has greatly increased its disinformation efforts in the past two weeks.


Patrick Tucker, technology editor of Defense One, points out how Iran has recently ramped up its DI campaign efforts. Tucker quotes Alireza Nader, a senior fellow at the Foundation for Defense of Democracies, as saying Iran’s capability is “not equal to Russia, perhaps, but nevertheless dangerous. The regime is known for its hacking capabilities and spends a considerable amount of resources trying to shape discourse on social media…. I’m seeing a huge propaganda push by the regime after Soleimani was killed (on January 2).”

Tucker, Nader, and others all report that Iran flooded Twitter with pro-Iran tweets and memes following the January 2 drone strike that killed Soleimani. At least two “pro-democracy” websites propagating such social media messages are inauthentic and based out of the Iranian propaganda bureau (see more below).

Clint Watts, a senior fellow at the Foreign Policy Research Institute, said in Defense One that when playing to a U.S. audience, the Iranians will focus on issues related to race, police brutality, and discrimination against Muslims. The Iranian DI campaigns’ online themes mirror the propaganda the regime is pushing through conventional TV and other media outlets.

“When you watch their content about the U.S., it’s a lot about the ‘Squad,’” Watts says, referring to U.S. Reps. Alexandria Ocasio-Cortez, D-New York; Ilhan Omar, D-Minnesota; Ayanna Pressley, D-Massachusetts; and Rashida Tlaib, D-Michigan.

Watts added that Iran went so far as to create a completely fake version of the Foreign Policy Research Institute website, complete with fake policy briefs. Iran then targeted U.S. lawmakers in an outreach effort to spread the disinformation. “It caused quite a kerfuffle,” Watts says. “They then referenced it on YouTube. It was successful disinformation in the sense that it got policymakers riled up at FPRI. We had nothing to do with it. We weren’t even hacked.”


At least twice in the last 18 months, Facebook security has taken action and eliminated more than 750 accounts, groups, and pages that were practicing coordinated inauthentic behavior originating from Iran and targeting people across multiple internet services in the US, UK, Middle East, and Latin America.

Most recently, on October 21, 2019, Facebook removed 93 Facebook accounts, 17 pages, and four Instagram accounts for violating Facebook’s policy against coordinated inauthentic behavior. The Iranians were masquerading as locals, trying to get others to join their groups and driving people to websites connected to “Liberty Front Press.”

The websites “Liberty Front Press” and “Quest4Truth” are neither what their names suggest; they are Iranian government-backed DI efforts. These accounts typically post about local political news and geopolitics, covering topics such as public figures in the US, politics in the US and in Israel, support of Palestine, and support for the rebels in Yemen. If you have friends who are citing information from such sources, you might want to let them know that these sites are part of Iranian DI campaigns.

Facebook continues to beef up its security efforts. Nathaniel Gleicher, Facebook’s head of cybersecurity policy, said in a New York Times interview that Facebook is now also applying labels to pages it considers state-sponsored media, such as the outlet Russia Today, so those reading and viewing their accounts are informed whether the outlets are wholly or partially under the editorial control of their country’s government.


Many thanks to the 35+ faculty colleagues who attended the presentation that my friend David Mastny, Cuyahoga Community College’s director of IT security, and I gave at the Faculty Colloquium on DI campaigns this past Tuesday (1/7). We received some rave reviews on the presentation. We would be happy to give it for civic and community groups, and I am also glad to share it on other college campuses. E-mail me at or call my office at 216-987-5040 if you are interested.

Here’s a brief preview, from a video shot in late 2019:

This site will contain news/details about DI campaigns from time to time. However, posts here are not my main focus with respect to DI campaigns in 2020.


You MUST do better, NBC News


TO:        Noah Oppenheim, President, NBC News

RE:         Disinformation

DATE:   December 29, 2019

Dear Noah,

You really need to invite Claire Wardle of (formerly) Harvard University’s First Draft project to visit your newsroom and educate your directors and producers about disinformation. You also ought to book a full “back to class” day of Chuck Todd’s time with Claire. Although he’s making good efforts, in both his recent Rolling Stone interview and his “Alternative Facts” Meet the Press program on December 29, Todd indicated he doesn’t really understand disinformation (DI) and what DI campaigns are.


Disinformation is manufactured information that is deliberately created or disseminated with the intent to cause harm.

Disinformation feeds off of inauthentic information. Inauthentic information is not transparent in its origins and affiliation. The source of the information tries to mask its origin and identity.

These definitions aren’t from me, or from Russia, or from any unusual source. They stem from Wardle’s work, and all of them are incorporated in a seminal report from the Analytic Exchange Program (AEP) titled “Combatting Targeted Disinformation Campaigns: A Whole of Society Issue,” issued in late October 2019. The Director of National Intelligence, the Department of Homeland Security, the FBI, the MITRE Corporation, Booz Allen Hamilton, and prominent government, business, and educational organizations all contributed to this report.

It should be required reading at NBC News and in newsrooms across the U.S.

Todd doesn’t fully grasp it. Additionally, he is failing to differentiate among disinformation, misinformation, and malinformation. Until and unless you, he, and all of NBC News “get it” with respect to disinformation and DI campaigns, you run the risk of continuing to misinform the public. Do you want to do that?

Here is a chart which briefly explains the three types of “information disorders” (as Wardle calls them) in a diagram:

Claire Wardle “Information Disorder: The Essential Glossary” Shorenstein Center on Media, Politics and Public Policy, Harvard Kennedy School, July 2018


Pew Research tells us that the gap between Republicans and Democrats on the role of journalists and journalism has never been greater in our nation’s history. According to Pew’s “Trusting the News Media in the Trump Era” (2018-2019), among highly politically aware citizens there is a 75-percentage-point differential between the shares who believe that journalists will act in the best interests of the public (91% of Democrats vs. 16% of Republicans). Restoring the public’s trust in journalism and the credibility of journalists should be the first and foremost priority for NBC News, The New York Times, the Washington Post, and all media outlets.

With great specificity, the AEP report calls on news media organizations to provide transparency regarding the sources, authors, and producers of news content, including their expertise, funding, conflicts of interest, and agendas. News media organizations should strive to meet journalistic standards of trustworthiness, such as citing sources, correcting mistakes, and avoiding conflicts of interest and political bias.


In Justice Department Inspector General (IG) Michael Horowitz’s December 9 report on the conduct of the FBI, the IG debunked the work of Christopher Steele and the Steele dossier as both false and unverified. Additionally, Horowitz reported that the FBI failed to inform the FISA court about this fact, and instead continued to apply for (and receive approval of) FISA warrants to keep several U.S. citizens under constant surveillance. Subsequent media and government reports have affirmed this to be the case. The FISA court itself has assailed the FBI for its failure to follow the law.

So, when Todd interviewed Washington Post Executive Editor Marty Baron and New York Times Executive Editor Dean Baquet, he should not have been throwing them underhanded, softball-type questions. These media outlets reported dozens of times on the Steele dossier in early 2017 and continued in 2018 and 2019. Their stories went a long way toward creating and sharing a narrative about the Donald Trump candidacy and the 2016 election campaign which the IG’s report and other government sources now describe as completely false.

That, Noah, is far worse than “alternative facts,” because at least many media outlets are challenging false statements, whether they come from Republicans or Democrats. DI campaigns count on you, the media, to help them spread half-truths and falsehoods. That’s what you must be on guard against, because when you spread such material you play right into the hands of malevolent DI actors.

Here’s what Todd should have asked:

  • What were the sources of your reporters’ stories on the Steele dossier? If sources provided information to your outlet “off the record” or “on background,” did you accept it as fact, without verification? If it has subsequently been proven that the information was false, what steps have you taken to correct your stories?
  • Did your media outlet make a strong effort to ascertain who paid for the Steele dossier? If not, why not? “Follow the money” is one of the oldest adages student journalists learn in their public affairs reporting courses. Are media outlets today still doing this? When did NBC News, the Washington Post, and The New York Times do stories on who funded the Steele dossier?

The late Tim Russert was a strong and fair interviewer who relentlessly pursued his subjects with questions such as these. He’d be rolling over in his grave if he saw the December 29 edition of “Meet the Press” and understood how far the program has fallen from the high journalistic standards it established when it began in 1947, and especially from the standards maintained under Russert’s watch from 1991 until 2008.

For the sake of an informed populace, Noah, and to prevent further spreading of misinformation, do much more to get your act together at NBC News.

How objective is your media? Do you check your sources?

As 2020 approaches, it’s apparent that the upcoming election will be even more divisive than 2016 was. Brace yourself for a bumpy ride, especially if you’re a Democrat. It’s worth repeating the obvious: the sources where conservatives and liberals “get” their news are quite different.

A study from Pew Research in 2017 found that 40 percent of Trump supporters said that their main source for news is Fox News. Next was CNN (8 percent) and Facebook (7 percent). For Clinton supporters, CNN was their main news source (18 percent) followed by MSNBC (9 percent) and Facebook (8 percent).

Of course, Facebook doesn’t have a news department, and neither do Twitter, Instagram, and other social media sites. If a person is getting “news” from a social media site, it means they are seeing news stories in their personal feeds based upon the algorithms the social media outlet has developed. A 23-year-old female who supports and follows Senator Bernie Sanders will see vastly different social media stories than a right-leaning 62-year-old male (me) will see.


How objective are your news outlets? Are they conducting original reporting, that is, assigning reporters to do research, ask questions, and investigate facts? Are they engaging in news analysis? If they present opinion stories, are they labeled as such, and are they balanced? These are some of the questions that we, who consume news and rely on it to form our opinions, should answer.

Sharyl Attkisson’s Media Map

Fortunately, there are guides to help us in the process. Two people, Sharyl Attkisson (a long-time reporter, including many years with CBS News) and Vanessa Otero, a patent attorney, have developed charts that “place” media outlets according to bias. Both charts are below, and you can also visit their websites yourself. (NOTE: There’s a charge associated with downloading and using Otero’s chart.)

See for the full original version.

If you are concerned about bias in the media you consume, I’d advise you to bookmark this website or save these charts on your laptop, tablet, or mobile device, and refer to one or both of them often. Doing so will help you ascertain the biases of the news you are consuming. This is not a “pitch” for you to look “left” or “right” as you watch or read, but rather to recognize the biases that are inherent in your news.

Here’s the warning: If you find yourself looking at news sources regularly which are NOT on either of these charts, you would be right to suspect that you could be looking at a “fake” news story. See the blog post from November 27 (below) for more details on how to be on guard against that.

There is also a good website which you can rely upon for a quick check on the accuracy of news stories.


As we enter 2020, this website will be devoted to some educational activities. For example, in the U.S. we have five important freedoms given to us in the First Amendment to the Constitution: freedom of the press, freedom of religious expression, freedom of speech, freedom of assembly, and the freedom to petition (protest) our government when it makes poor laws or regulations. These five freedoms do not exist in many other democracies the way they do in the U.S.

Sometimes there are good reasons for this. After Hitler, Germany banned “hate speech” and enacted laws regulating free speech and the freedom of the press. Germans have taken steps to prevent a repeat of media and expression being perverted to support a Third Reich-type movement. Many nations in Europe have followed this example.

Our globe’s democracies are coming to grips, in various ways, with a major evil: disinformation and disinformation campaigns. We’ll begin to explore that in January.

Finally – I’m glad to visit your community group or college or university campus to talk about Disinformation, Fake News, and how they’re adversely impacting our society in 2020 and beyond. E-mail me at or call my office at 216-987-5040 if you are interested. My research, writing, and speaking about these topics are part of my work as a Mandel Faculty Fellow at Cuyahoga Community College. Here’s information about the Mandel Program at Tri-C.

Any Northeast Ohio high school junior or senior who’s interested in developing leadership skills and creating positive change for themselves and their communities would be well advised to look into the Mandel Scholars Academy Program.

Many thanks also to Nicholas Phillips at 1420 AM “The Answer,” who interviewed me for his program The Advocate. Tune in Sunday (Dec. 22) at 8 p.m. to hear the interview.

The station’s website is:

DISINFORMATION: Going way beyond 2020

Foreign actors are insidiously interfering with public discourse in the U.S. It’s happening in a manner that’s far more profound than in 2016, when many members of our society first became familiar with the term “Fake News.” These malevolent agents aren’t just trying to impact the upcoming 2020 elections. The objective is far greater: to destroy democracy in the U.S. and in other nations.

This is a developing trend the American population sensed even before journalism and political science professionals began sounding an alarm about disinformation. According to a February-March 2019 Pew Research survey:

  • 70% of Americans think false information online negatively affects their confidence in the government
  • About half said misinformation is among the biggest problems facing the U.S., ranking it above terrorism and illegal immigration, and
  • 56% think that misinformation will only get worse over the next five years.

Lately, popular media has been paying greater attention to this topic. On November 21, a New York Times story explained how Google was attacking the wrong problem when it decided to ban micro-targeted advertising from its platforms. This move “…retains a system that hackers and trolls have proved adept at exploiting and that social media sites struggle to adequately police,” wrote authors Matthew Rosenberg and Nick Corasaniti.

Here’s a link to the story:

Earlier this week, Rolling Stone published an article by Clemson University professors Darren Linvill and Patrick Warren, warning about disinformation. Here’s the story:

A key quote from the story is this: “Russia’s goals are to further widen existing divisions in the American public and decrease our faith and trust in institutions that help maintain a strong democracy. If we focus only on the past or future, we will not be prepared for the present.” 

This blog space will contain much more in-depth details about disinformation in the months ahead. Since Spring 2018, I have been examining the sources and impacts of Disinformation (DI) Campaigns emanating from all portions of the globe, especially from Russia and China. Here’s a brief preview:


What is disinformation? How is it different from propaganda? How can we tell?

Perhaps the best way to explain disinformation is to share the latest definition of the word presented by its foremost practitioner, Russia. In its 2011 Draft Convention on International Information Security, Russia listed “disinformation” as one of the main threats to international peace and security. It defined the term as “manipulation of the flow of information in the information space of other governments, disinformation or the concealment of information with the goal of adversely affecting the psychological or spiritual state of society, or eroding traditional cultural, oral, ethical and aesthetic values.”

Do not confuse a DI campaign, which is a strategic and intentional effort, with misinformation, which can be defined simply as unintentionally incorrect information. The latter is unfortunate; the former is devastating in its impact. See my November 13 blog post (just below) and video for an example.

According to Professor Ronald Rychlak and Lt. Gen. Ion Mihai Pacepa, a DI campaign employs the following pattern:

  1. Find and use a “kernel of truth” that will lend authenticity and/or credibility to the DI campaign.
  2. Alter that kernel so that it becomes negative and derogatory toward its target, and ensure that it receives a lot of publicity.
  3. Continue spreading the false story until it is reprinted by, or even appears to come from, respected and reputable sources in the U.S. or elsewhere in the West (Europe, for example).

An illustrative example of how this works comes courtesy of New York Times reporters Adam Ellick and Adam Westbrook, who published an excellent three-part opinion video series last November titled Operation Infektion. If you are interested in learning more, here’s a link to their work:

Ellick and Westbrook cite the “AIDS came from the U.S. military” example in the opening of their video series. This false tale appeared in Soviet-supported media for years, until it popped up as a story on the “CBS Evening News” in the late 1980s. From there it morphed into thousands of iterations in movies, books, and other stories. It’s all untrue.

INTERNET = STEROIDS for DI campaigns

What's changed immeasurably since the former Soviet Union first began its aggressive DI campaigns in the 1940s and '50s is the Internet and social media. See Step No. 3, above? It's no longer necessary to have a favorable media outlet publish a DI story. Instead, trolls on the Internet do the work. Dunwill and Warren point out how one Russian troll garnered 290,000 likes for a single tweet.

In my upcoming monograph, titled "Disinformation: Destroying Democracies," I will chronicle how DI campaigns have impacted elections and democratic governance in Germany, India, Japan, Canada, and elsewhere. Some of these nations have taken much more aggressive steps against DI campaigns than has the U.S. This isn't a criticism of U.S. policy; rather, it's all the more reason why it's critical to sound the alarm, learn about this, and warn others.

And you can take steps to help stop the spread of disinformation. Double-check EVERY STORY you see in social media before you hit that "Share" button on Facebook, Twitter, WhatsApp, TikTok, etc. That's the most important step.

Friends at Sage Publications developed this simple chart to help us be on guard against the spread of false information. Copy this, keep it in your mobile device, and "think before you click" or share any social media story. In my talk, I provide a lot more "how to" advice to help you learn how you can help prevent the spread of disinformation.

FINAL POINT: The overall objective of many DI campaigns aimed at us is NOT simply to make you support or oppose "Candidate D" or "Candidate R." It is much more insidious than that. The real aim is to drive a wedge further into, and widen, the differences that already exist among varying elements of our society. The more divided we and other democracies become, the more caviar and vodka Vladimir Putin enjoys in the Kremlin each night.

Kerezy is associate professor of Media and Journalism Studies at Cuyahoga Community College. He’ll be glad to speak to college and community groups about “Disinformation: Destroying Democracies” in the Spring 2020 semester. You can contact him at or via phone at 216-987-5040.

About Disinformation Campaigns

For decades, long before "fake news" became a common expression a few years ago, the United States and other democracies have been attacked by other countries using Disinformation (DI) campaigns. Through research and investigation, I have learned about dozens of efforts to influence and undermine the democratic process not just in the United States, but in democracies across the globe.

Art: Frida Etchell

I am assembling details of these activities for a monograph which will be offered to professional associations in the fields of journalism and political science in early calendar year 2020, titled “Disinformation: Destroying Democracies.” It’s my hope and prayer that, by becoming more knowledgeable about these efforts, civic-minded leaders around the globe will take more active steps to defend their democratic institutions and most cherished rights such as freedom of the press, free and fair elections, etc., against DI campaign activities.

If you are affiliated with a college or university, I am especially interested in traveling to your institution in early 2020 to help "sound the alarm" about DI campaigns. My contact information is below and, if you believe that this is an important subject, I hope you will invite me to visit and speak with faculty, students, and the community on the topic in 2020.

Here’s a preview of how such a talk will go:

More posts will follow in the weeks and months ahead on this site, so please bookmark it and come back for more information from time to time. I'd like to express gratitude to my colleague Peter Jennings at Cuyahoga Community College, who's been terrific at obtaining professional articles on this subject. Thanks also to Frida Etchell at Tri-C, whose artwork you see on this blog and in my presentations.

Also, many thanks to the Mandel Scholars Academy at Cuyahoga Community College for selecting me as one of its inaugural Mandel Faculty Fellow Scholars. I hope that the research and the communication stemming from my work is worthy of the honor.

John Kerezy, Associate Professor
Media and Journalism Studies
Cuyahoga Community College

Follow me on Facebook and Twitter as well