Today and every day, Facebook gets more than 8 billion video views, and users watch 100 million hours of video on Facebook daily.
YouTube videos draw more than 1 billion views per day from mobile devices alone, a figure that dates to a month before the onset of Covid-19. As of February 2020, the average YouTube user watched more than 40 minutes of video per day – again, a pre-Covid-19 number.
We could look at Instagram (more than 100 million new pictures and videos a day), Snapchat (more than 400 million new stories created daily), and other social media platforms as well. The results would state the obvious: We’re awash in videos and pictures, and we get a significant portion of our daily information from these various platforms.
How much of it is true?
How many of all these videos and pictures are real? What percentage might be false, intentionally created and promoted to spread phony narratives to the public?
Each and every time we share a video, picture, or cute meme on the internet we are doing many things, including:
- Signaling that we believe the video or picture to be true.
- Asking our FRANs (friends, relatives, acquaintances and neighbors) to look at it, evaluate it, and re-share it.
- Revealing a bit about ourselves (our personal beliefs and values) to others.
So if you spend an hour or more a day on social media platforms, how much time do you devote to checking what you see or read PRIOR TO deciding to share it with others? As much as five minutes? Less?
You should be checking. According to an October 2019 white paper from the Analytic Exchange Project – a study conducted by the Department of Homeland Security, Defense Department contractors, and others under the guidance of the Director of National Intelligence – disinformation is a major problem facing the U.S. and other democracies.
“These (social media technology) changes have made it easier for threat actors to spread disinformation and exploit the modern information environment, posing a significant threat to democratic societies,” the study says.
So let’s look over a few disinformation (DI) campaign activities, cover what the National Counterintelligence and Security Center (NCSC) is telling us, and review what we can do to check stories before we share them.
DID YOU SEE THE NEWS?
Canadian soldiers introduced Covid-19 into Latvia. The new 5G cell phone service spreads Covid-19. North Atlantic Treaty Organization (NATO) labs created Covid-19.
Or how about this story: Biden, Pelosi, Kerry and Romney all have sons getting tens of millions of dollars in no-show jobs from Ukraine, and an FBI raid in Cleveland earlier this month is exposing all of this.
All of these, of course, are fake news and/or forgeries, and they are just a small portion of the wide variety of lies that malicious actors from Russia are spreading in Europe, the U.S., and elsewhere as part of broad-based disinformation (DI) campaigns, some aimed at NATO-member nations. Readers of my blog site www.dicampaigns.com know that, for many months, I have chronicled DI activities stemming from Russia, China, Iran, and other places. Recent releases of information from both the NCSC and NATO identify the nations originating this disinformation, serving as a warning for anyone interested in the truth and the free and fair exchange of ideas. Make no mistake: DI campaigns poison dialogue in the public square and are a severe threat to our democratic way of life.
Major media outlets such as NPR and the New York Times ran stories earlier this month (August 7-8) about NCSC Director William Evanina’s assessment of DI activities coming from Russia, China, and Iran. The NCSC assesses that China wants to see Donald Trump lose the November 3, 2020, election, and that Russia wants to see Joe Biden lose it. You can find stories about the NCSC assessment in many places, but perhaps your best move is to visit the NCSC website and read Mr. Evanina’s statement yourself. Here’s a link:
FAKE POLICE BRUTALITY VIDEOS VIEWED 20 MILLION TIMES
Long-time followers of my blog may have discerned that the malicious threat actors from these countries have differing objectives. One of Russia’s main objectives in DI campaigns is always to deepen divisiveness in the United States. In the immediate aftermath of George Floyd’s murder on May 25, that’s just what happened. In June, the Wall Street Journal reported that people operating social media accounts based in Pakistan and Botswana had posted supposedly “live” videos of police brutality against Blacks in the U.S., and that these videos were viewed at least 20 million times before they were taken off social media platforms. In the violent aftermath, tens of thousands of people were arrested, and at least 29 people lost their lives.
We cannot estimate how much the 20-million-plus views of fake videos contributed to the violence that accompanied the protests, but we know that more than 50 U.S. cities saw violence serious enough to lead to arrests and/or injuries to participants and police. This pattern of violent responses to fake videos continued in August. This statement was buried in paragraph eight of an Associated Press story on the violent protests and looting in Chicago on August 8-9:
“Further ratcheting up the tensions in the city was a video circulating on Facebook that falsely claimed that Chicago police had shot and killed a 15-year-old boy. Posted at 6:30 p.m. Sunday, the video shows upset residents confronting officers near the scene where police shot and wounded an adult suspect they said had fired at them that day. By Monday morning, the footage had been watched nearly 100,000 times.”
Is there a cause-and-effect relationship between fake videos and how people act? Absolutely. The videos are created to capture viewers emotionally, because the malicious actors who conduct DI campaigns are depending on you to react. Their videos and pictures are created with the intent of drawing a reaction strong enough to make you eager to share the content. That makes you what the Analytic Exchange Project calls an “unwitting actor” or an accomplice, helping spread false stories.
STEPS YOU CAN TAKE
A recent survey of 25 countries found that 86% of citizens reported being exposed to fake news; among them, nearly nine in ten reported having initially believed that the news was real. NATO offers these “top tips” to help citizens better spot and counter disinformation:
Top tips to spot disinformation and stop its spread
- Check the source: Look at the source of the information – who has published it and shared it? A site that does not clearly state editorial responsibility is not trustworthy. On social media, check an account’s handle or username – if it has many random letters and numbers in succession, it could be a bot (an automated account). If you see an unverified account posting content hundreds of times a day, alarm bells should ring. Try using a free bot detector, and employ online tools that flag and rate misinformation sites. (One of my favorite sites is www.mediabiasfactcheck.com, which posts daily updates on true and false news stories circulating on the Internet.)
- Check the tone: Disinformation is often designed to trigger an emotional response. Be cautious of content that uses emotional language to elicit a strong reaction. Fear and anger are big drivers that allow disinformation to thrive.
- Check the story: Real news is usually covered by more than one source. If mainstream media are not picking up the story, there’s a good chance it can’t be confirmed. By running a search, you might find that independent fact-checkers have already debunked the story. Fact-checking sites, such as the above-mentioned mediabiasfactcheck.com and BBC Reality Check, allow you to check the accuracy of stories.
- Check the images: Does an image show what it claims? Platforms like Google, TinEye and Bing allow you to run a reverse image search to see where an image appears on the Internet and discover similar images. Tools and applications, such as SurfSafe and Serelay, can also help you determine whether an image has been doctored.
- Check your own biases: Research indicates that people are much less likely to identify disinformation if it aligns with their own beliefs or preferences. Be smart and think about whether you are sharing content because you know it’s true or just because you agree with it.
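For readers who like to tinker, the “random letters and numbers” handle check in the first tip above can be sketched as a toy filter. This is a rough illustration with made-up thresholds, not a real bot detector – genuine tools weigh many more signals:

```python
import re

def looks_bot_like(handle: str) -> bool:
    """Toy heuristic: flag handles with a long trailing run of digits
    (e.g. 'patriot84736251') or a letter/digit jumble. Thresholds here
    are illustrative assumptions, not validated detection rules."""
    has_long_digit_tail = re.search(r'\d{6,}$', handle) is not None
    has_letter_digit_jumble = re.search(r'(?:[A-Za-z]\d){3,}', handle) is not None
    return has_long_digit_tail or has_letter_digit_jumble

print(looks_bot_like("patriot84736251"))  # True
print(looks_bot_like("jane_doe"))         # False
```

A check like this only raises a flag worth investigating further; plenty of real people use numbers in their handles, which is why posting volume and account verification matter too.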
A FINAL PLEA
DI campaigns don’t stem from just one country. Disinformation doesn’t come out of just one political party or candidate. Foreign nations striving to interfere in our democratic processes (such as the November elections) would like nothing better than for you to turn a blind eye to what is happening on social media.
If you are concerned about democracy, if you believe it’s important for the public to have a legitimate exchange of free ideas, and if you spend any amount of time on social media, you owe it to your FRANs and others to practice the above-mentioned tips BEFORE you share any memes, videos, or stories. Don’t be an unwitting agent for some foreign-sponsored campaign to propagate disinformation to the public. Do the “five checks” before you share media content on the Internet.
Will you tear our nation apart even more, or will you help bring us back together? That choice is in your fingertips every time you communicate on social media.
Many thanks to WJKA-FM (Mark Zimmerman and Gabrielle Collins) and to Nicholas Phillips of WHK “The Advocate” for recent media interviews. The next post on this blog site will be in September 2020. Watch www.jkerezy.wordpress.com for a post there around Labor Day 2020.
Link to the Analytic Exchange Project Report:
SOME SOURCES USED FOR THIS STORY
Horwitz, Jeff, “Live Facebook Protest Videos Drew Millions of Views, but Some Footage Was Years Old,” Wall Street Journal, June 2, 2020: https://www.wsj.com/articles/live-facebook-protest-videos-drew-millions-of-views-but-some-footage-was-years-old-11591118628