Six Fake News Techniques and Simple Tools to Vet Them (Illustrations)

Exposing fake or manipulated images is quite possible with the proper tools and techniques.

In this GIJN tutorial, six fraud scenarios are explored, along with step-by-step instructions on how to vet each one:

1. Photo Manipulation

Photo manipulation is the easiest way to create fake news, but also the simplest to expose.

There are two common techniques of photo manipulation.

The first is editing photos in special programs, such as Adobe Photoshop. The second is presenting real photos as having been taken at another time or place. In both instances, tools exist to expose the fake.

You should be able to find out when and where the photo was taken and whether it was processed in an editing program.

1.1 Photo editing

Here is a simple example of a fake photo created by editing the original in Adobe Photoshop.

This screenshot comes from the page of one of the pro-Russian groups on the Russian Facebook-like social network, Vkontakte. Spread widely in 2015, it shows a newborn child who allegedly has a swastika cut into his or her arm. The photo has a caption: “Shock! Personnel of one of the maternity hospitals in Dnipropetrovsk learned that a birthing mother was a refugee from Donbas and the wife of a dead militia man. They decided to make a cut in the form of a swastika on the baby’s arm. Three months have passed, but the scar can still be seen.”

But this photo is fake. The original can be easily found on the internet, and the baby has no wounds.

The simplest way to check a photo is to use Google Images reverse search. This service has several helpful functions, such as searching for similar images and other sizes of the same image. Use your mouse to grab the image, then drag and drop it into the search bar of the Google Images page, or just copy and paste the image address. From the Tools menu you can choose the options “Visually similar” or “More sizes.”

Using the “More sizes” option leads us to a 2008 article. It may not display the original photo, but it proves that the photo couldn’t have been taken in 2015 and that the original doesn’t contain a swastika.

Let’s look at a more complex photo fake. A bogus photo depicts a Ukrainian soldier kissing an American flag. The photo was circulated before Ukraine’s National Flag Day in 2015, and first appeared on a separatist website with an article entitled The Day of the Slave.

You can refute it in several stages.

Firstly, cut any additional information from the photo (captions, titles, frames, etc.), because it can influence search results. In this case, you can cut the word “Demotivators” in the lower right corner of the picture using the free tool Jetscreenshot (Mac version).

Secondly, try flipping the picture horizontally using any mirror-effect tool, such as LunaPic, and save the result.

Then, check it using Google Images or another reverse image search tool. In this way you can find out whether the picture is original or edited, and you can also find the real date, place and context of its publication.
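If you prefer to prepare the image locally instead of using Jetscreenshot and LunaPic, the cropping and mirroring steps can also be scripted. Here is a minimal sketch using the Pillow library; the file name and the size of the cropped strip are placeholders you would adjust for your own image.

```python
# A local alternative to Jetscreenshot and LunaPic for the first two steps:
# crop away captions/watermarks and flip the image before reverse searching.
# Requires Pillow (pip install Pillow); "suspicious_photo.jpg" is hypothetical.
from PIL import Image, ImageOps

img = Image.open("suspicious_photo.jpg")

# Step 1: crop out the caption strip at the bottom (the 60-pixel strip is an
# example; adjust the left, upper, right, lower box to your image).
width, height = img.size
cropped = img.crop((0, 0, width, height - 60))

# Step 2: mirror the picture horizontally, since fakes are often flipped.
mirrored = ImageOps.mirror(cropped)

# Save both versions and upload each to Google Images or TinEye.
cropped.save("cropped.jpg")
mirrored.save("cropped_mirrored.jpg")
```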

So the photo was in fact taken in 2010 in Tajikistan, and the soldier kissing the flag is a Tajik customs official. The Ukrainian flag on his sleeve was added later in a photo editing program, and the photo was flipped horizontally with a mirror effect.

Sometimes a Google search is not adequate for finding the source of an image. Try TinEye, another reverse search tool.

The main difference between TinEye and Google is that TinEye recognizes identical or edited copies of images. This way, you can find cropped or montaged versions of the same photo. In addition, the sites where the photos are posted can provide extra information about the content of the photo.

This picture has been retweeted and liked tens of thousands of times on Twitter. It depicts a very serious Putin surrounded by other world leaders, all staring at him as if focused on what he is saying. It is fake.

You can find the original using TinEye. Enter the image address into the search bar or drag and drop the image from your hard drive. You can use the “Biggest Image” option to find the likely original picture, because every edit reduces the size and degrades the quality of the photo.

We can see that the picture was taken from a Turkish website.

You can also use other toolbar options like Best Match, Newest, Oldest and even Most Changed to track what changes have been made to the image.

You can also filter the results by domain – for instance, Twitter or other sites where the picture appears.

One more example of complex fakes concerns an image created to mock US President Donald Trump by purportedly showing an unflattering view of his profile.

It circulated on social media in October of 2017,  along with the claim that the president was unhappy with the photo and didn’t want it shared on the internet. In fact, the image was originally posted on Facebook on July 14, 2017, by Vic Berger, a viral video creator:

You can find the source using a mirror effect tool and then TinEye:

Berger created it using a photograph that Getty Images photographer Matthew Cavanaugh took in 2011. Berger flipped the original photo, enlarged Trump’s throat and colored his skin a darker shade of peach.

To explore the photo more deeply, you can use programs that analyze whether it was edited. Among the best ones is FotoForensics.

On January 12, 2015, Rossiya 1’s evening news program Vesti reported that the Ukrainian political party Svoboda had proposed a new 1,000-hryvnia banknote depicting Hitler.

This is false. Apparently, the creator of this fake photo used Photoshop and posted the result on pikabu.ru, where it is tagged as “humor” and a “joke.”

The original design for the banknote depicts the Ukrainian writer Panteleimon Kulish.

Detailed analysis with FotoForensics shows that the photo with Hitler was edited. Part of the original image was erased, then filled in with Hitler’s image and the denomination amount.

FotoForensics is a website that uses “error level analysis” (ELA) to find parts of a picture that were added after editing. After processing a photo, the program produces an image in which the edited parts stand out.
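FotoForensics is a hosted service, but the underlying idea is simple enough to sketch yourself. Below is a minimal illustration of error level analysis with Pillow; it is not FotoForensics’ exact algorithm, and the file names are hypothetical.

```python
# A minimal sketch of error level analysis (ELA): re-save the JPEG at a known
# quality and look at how much each region changes under one more round of
# compression. Pasted-in regions tend to stand out. Requires Pillow.
from PIL import Image, ImageChops, ImageEnhance

ORIGINAL = "suspect.jpg"        # hypothetical file names
RESAVED = "suspect_resaved.jpg"

# Re-save the image as JPEG at a fixed quality level.
img = Image.open(ORIGINAL).convert("RGB")
img.save(RESAVED, "JPEG", quality=90)

# Pixel-wise difference between the original and the re-saved copy.
diff = ImageChops.difference(img, Image.open(RESAVED))

# Amplify the (usually faint) differences so they are visible to the eye.
extrema = diff.getextrema()
max_diff = max(channel_max for _, channel_max in extrema) or 1
ela = ImageEnhance.Brightness(diff).enhance(255.0 / max_diff)
ela.save("suspect_ela.png")
```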

In addition, the program provides the EXIF data (also called metadata) for each photo. Each image file contains a range of additional information encoded within the graphic file.

Among other information, in metadata you can find:

  • Date and time of the original
  • Geo-location data
  • The model of the camera and its settings (exposure time, aperture value, etc.)
  • Copyright information
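If you have the original file rather than a screenshot, you can read this metadata yourself. Here is a minimal sketch with Pillow; the file name is a placeholder, and expect empty output for images that have passed through social media, which strips most EXIF fields.

```python
# A minimal sketch of reading EXIF metadata locally with Pillow (pip install
# Pillow, recent versions). "witness_photo.jpg" is a hypothetical file name.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

img = Image.open("witness_photo.jpg")
exif = img.getexif()

# Basic tags such as DateTime, Model and Software live in the main directory.
for tag_id, value in exif.items():
    print(f"{TAGS.get(tag_id, tag_id)}: {value}")

# GPS coordinates, if present, live in a nested directory (tag 0x8825).
gps_ifd = exif.get_ifd(0x8825)
for tag_id, value in gps_ifd.items():
    print(f"{GPSTAGS.get(tag_id, tag_id)}: {value}")
```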

For example, investigators at Bellingcat used it to verify users’ photos related to the crash of Malaysia Airlines flight MH17. At 4:20 pm local time on July 17, 2014, the plane was shot down over eastern Ukraine, killing all 298 passengers and crew members. Almost exactly three hours later the following photograph was posted on Twitter:

The tweet from @WowihaY describes how a witness sent him the photograph, which shows the trail of a missile launched from southeast of the city of Torez in the occupied territory of eastern Ukraine. The metadata of the photograph showed that it was taken at 4:25 pm (five minutes after the plane was shot down), according to the camera’s internal clock.

However, this method does not provide a 100 percent guarantee of accuracy. Most of the metadata required for authentication disappears when an image is uploaded to the internet.

These results don’t contain the metadata of the original photo because the photo is a screenshot. The same situation applies when a photo is placed on social media.

Despite these caveats, sometimes metadata adds useful information for your research.

1.2 Presenting real photos as taken at another time or place

Manipulation can be done to present events in a distorted way.

A photo taken in Israel in 2014 was later presented as a photo taken in eastern Ukraine in 2015.

The fake was first discovered by Israeli journalist and Ukraine expert Shimon Briman.

You can use any reverse search to verify the authenticity of a photo, cutting out all added elements such as the title. TinEye’s option “Oldest” is very useful here; there are at least two Israel-related results, dated the year before.

You cannot always find the source of the photo in this way. But such a result is a clue to look further in this direction.

The fifth result in the search is a photo from an Israeli newspaper on July 27, 2014, which describes in detail how and when the photo was taken. A girl, Shira de Porto, took it with her mobile phone during a rocket attack in Beer Sheva. The father and another man covered the baby with their bodies.

If a suspicious picture appears on social media, you can use embedded TinEye search tools.

For example, during former US Vice President Joe Biden’s visit to Kiev, a photo of people kneeling outside the Ukrainian Cabinet Ministers’ building was posted on social networks and pro-Russian websites. The caption claimed that these were Kiev residents “appealing to Biden to save them from Yatseniuk,” referring to Ukraine’s prime minister at the time. The photo appeared for the first time on December 6, 2015.

Using TinEye, StopFake found that the original photo was posted on Twitter with the #Euromaidan hashtag on January 18, 2015. To figure out the context, you can use the Twitter search tool. Choose “Search Filters” and then “Advanced Search.”

Then you can enter any information – in this case the hashtag and the date, January 18, 2015.
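The same filters can also be written directly into a search URL using Twitter’s documented since:/until: date operators. Here is a minimal sketch, using the hashtag and date from this example; note that X/Twitter may now require you to be logged in to view results.

```python
# Build a Twitter/X advanced search URL with the since:/until: operators.
# The hashtag and dates come from the example above.
from urllib.parse import quote

query = "#Euromaidan since:2015-01-18 until:2015-01-19"
url = "https://twitter.com/search?q=" + quote(query) + "&f=live"  # f=live asks for the "Latest" tab
print(url)
```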

The first search result shows the original tweet with the initial photo. It was taken on Hrushevsky Street in Kiev on January 18, 2015, when thousands of people gathered to pay tribute to the first victims of the clashes during the “Euromaidan” protests that began in 2013.

There are a variety of tools besides TinEye and Google Images, including Baidu (which works better for Chinese content), Yandex and metadata searching tools like FotoForensics.

If you are going to use these tools for verifying photos often, try ImgOps, which contains the tools mentioned above, and you can add your own. Another is Imageraider.com, analogous to TinEye but with some different features, such as the capacity to analyze several pictures at one time and to exclude some websites from search results.

A Brief Summary
  • Pay attention to the images with the largest resolution and size. The resolution of photos decreases with each new edit, so the photo of the largest size or with the best resolution is the least edited. This is an indirect sign that such a photo may be the original.
  • Pay attention to the date of publication. The image with the earliest date is closest to the original.
  • Reread the photo captions. Two identical images can have different descriptions.
  • Fake photos are not just cropped or edited, but can also be mirrored.
  • You can search the particular website, social network or domain.

2. Manipulating Videos

The manipulation of videos occurs as often as the manipulation of photos. However, exposing fake videos is much more difficult and time-consuming. First, watch the video and look for discrepancies: clumsy splices, distorted proportions or other strange moments.

Look at details: shadows, reflections and the sharpness of different elements. The country and city where footage was taken can be identified by license plates, store signs and street names. If there are unusual buildings in the frame, look at them in the Street View mode of Google Maps. You can also check the weather for the specific time and place using the archives of weather forecasting websites. If it was raining all day there but the sun is shining in the video, it’s not trustworthy.

One site I like is Weather Underground.

The following techniques are most often used to create fake news with video.

2.1 Using old videos to illustrate new events

A lot of pictures and videos were shared on social media allegedly showing US, French and British air attacks on three Syrian targets on April 14, 2018. For example, this video purportedly shows an early-morning air strike on the Jamraya Research Centre in Damascus.

If a video is embedded into a piece of news, you can go to the original tweet, YouTube video or Facebook post and read comments on it. The online audience, especially on Twitter and YouTube, is very active and responsive. Sometimes there are links to the source there and enough information to refute a fake.

By doing this, we can see the link to the original YouTube video. It shows the correct location, but it was shot in January 2013 during a similar attack which was attributed to Israel.

You also can quickly explore the account that posted a video. What information does it share about the user? What other social media accounts is it linked with? What kind of information does it share?

To find an original video, you can use Amnesty International’s YouTube DataViewer. It allows you to determine the exact upload date and time and to check whether a YouTube video has been posted on the platform before.

Let’s try to check the upload time of the video mentioned in the case above. The DataViewer confirmed that it was uploaded in January 2013.

The next step in verifying videos is the same as for verifying photos – doing a reverse image search.

You can manually take screenshots of the key moments of the video and put them into a search engine like Google Images or TinEye. You also can use special tools designed to simplify this process. YouTube DataViewer generates the thumbnails a video uses on YouTube, and you can run a reverse image search on them in one click.
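YouTube’s standard thumbnails live at predictable img.youtube.com addresses, which is essentially what the DataViewer retrieves. Here is a minimal sketch that downloads them for reverse searching; the video ID is a placeholder.

```python
# Download a video's standard YouTube thumbnails so each can be put through
# Google Images or TinEye. The video ID below is a placeholder.
import urllib.request

video_id = "VIDEO_ID_HERE"  # the part after "v=" in the YouTube URL
names = ["0.jpg", "1.jpg", "2.jpg", "3.jpg", "hqdefault.jpg"]

for name in names:
    url = f"https://img.youtube.com/vi/{video_id}/{name}"
    try:
        with urllib.request.urlopen(url) as resp:
            with open(f"{video_id}_{name}", "wb") as out:
                out.write(resp.read())
        print("saved", url)
    except Exception as err:
        print("not available:", url, err)
```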

In February of 2018, France 24’s Observers debunked a video that claimed to show squads of Turkish fighter jets on a bombing mission over Afrin, Syria. This video filmed from the cockpit of an F16 was posted on several different YouTube accounts on January 21, 2018.

They checked it with the YouTube DataViewer.

The original post does not have a Turkish voice speaking off-camera, which was added later. In reality, this video was made during an aviation display held in Amsterdam.  

2.2 Placing a video — or part of it — in another context

Sometimes you need to find out additional facts about a video to prove that it is a false representation.

For example, this one, posted on YouTube on August 22, 2015, was distributed widely in eight countries. It purports to show Muslim migrants on the border between Greece and Macedonia who allegedly refused food aid from the Red Cross because it wasn’t halal or the packaging was marked with a cross.

To learn more about it, you can use a powerful reverse search tool – InVid. It can help you verify videos on social media, such as Twitter, Facebook, YouTube, Instagram, Vimeo, Dailymotion, LiveLeak and Dropbox. Download the InVid plugin. Copy the video link. Paste it into the “Keyframes” window in InVid and click “Submit.”


Click through the thumbnails one by one to do reverse image search and explore the results.
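If you have the video file itself rather than a link InVid can process, you can generate your own keyframes to reverse-search. Here is a minimal sketch with OpenCV; the file name and sampling interval are placeholders.

```python
# Grab one frame every few seconds from a local video file with OpenCV
# (pip install opencv-python), then reverse-search each saved image.
import cv2

VIDEO = "downloaded_video.mp4"  # placeholder file name
EVERY_SECONDS = 5               # placeholder sampling interval

cap = cv2.VideoCapture(VIDEO)
fps = cap.get(cv2.CAP_PROP_FPS) or 25   # fall back if the file reports no FPS
step = int(fps * EVERY_SECONDS)

frame_no = saved = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if frame_no % step == 0:
        cv2.imwrite(f"frame_{frame_no:06d}.jpg", frame)
        saved += 1
    frame_no += 1

cap.release()
print(f"saved {saved} frames for reverse image search")
```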

In reality, the migrants were refusing to take food to protest against the closure of the border and the poor conditions in which they were forced to wait. Italian journalists wrote about it for Il Post after interviewing humanitarian workers on site. The journalist who filmed the video confirmed this; it was initially posted on the outlet’s website with the caption: “The refugees refuse food after spending the night in the rain without being able to cross the border.”

One more example of this kind of fake is a post about German Chancellor Angela Merkel from Gloria.tv. It is a seven-second video clip in which the chancellor says only one sentence. The video title is: “Angela Merkel: Germans have to accept foreigners’ violence.”

But in reality the sentence was taken out of context and the headline reverses the meaning of her statement, a BuzzFeed News analysis found. Here is her full statement:

The thing here is to ensure security on the ground and to eradicate the causes of violence in the society at the same time. This applies to all parts of the society, but we have to accept that the number of crimes is particularly high among young immigrants. Therefore, the theme of integration is connected with the issue of violence prevention in all parts of our society.

The video turned out to be part of a 2011 news segment, the German fact-checking website Mimikama wrote.

The best way to find the source of such videos is to use search engines such as Google.

2.3 Building a completely fake video

Creating totally fake videos requires a lot of time and money. The technique is often used in Russian propaganda.

One example is the purported “proof” that Islamic State militants were serving in the Ukrainian special operations regiment “Azov.” This was presented as the finding of a pro-Russian hackers’ group called “CyberBercut.”

The CyberBercut hackers claimed that they got access to an “Azov” fighter’s smartphone and found the materials there. They mentioned neither the location of the footage nor the technical details of the hack. The BBC found the location using the Wikimapia geographic service.

You can also use other mapping services, like Google Maps, to compare the apparent location of the video with the real one, or use Google Street View where available. In reality, the location was the Isolyatsia Art Centre in the occupied territory of eastern Ukraine.

Sometimes these fakes are clumsy, so they are easy to expose; just being attentive is enough. For example, Russian media spread the news that “Right Sector” fighters were conducting “lessons of Russophobia” in schools in the city of Kramatorsk in the Donetsk region of eastern Ukraine.

The video was distributed on social networks and YouTube and then spread by the Russian mainstream media. One of the schoolboys supposedly filmed these lessons with a phone camera. A man in a British military uniform, with a gun in his hand, forces children to read aloud the article “What is Russophobia?” Such lessons, he says, will be held in all educational institutions in the territory liberated from the Russians.

Users of social networks noticed that the schoolchildren look older than school age. The man in the video is dressed in a Condor-style military jacket with a patch reading “Thor mit uns.” Such a patch, as well as the clothing seen in the videos, can be bought in any online store.

In reality, this video was a provocation, made by a Kramatorsk activist to check if the Russian media would use it without examination. The author of the video, Anton Kistol, provided StopFake with draft versions, as well as photos from the filming of the video.


3. Manipulating the News

3.1 Publishing a true piece of news under a false title

A lot of people repost articles on social media after reading the headline, but without reading the whole text. Putting a misleading title on real news is one of the most common fake news techniques.

Taking quotations out of context is another common trick.

For example, in December 2016 Russian media declared that Ukraine’s Foreign Ministry had accused the European Union of betrayal. Russia’s official news agency RIA Novosti, along with Vesti and Ukraina.ru, featured stories claiming that Ukraine suspected the EU of machinations and even treachery.

They cited an interview with Olena Zerkal, Ukraine’s deputy minister of foreign affairs for European integration, in the Financial Times:

This is testing the credibility of the European Union… I am not being very diplomatic now. It feels like some kind of betrayal… especially taking into account the price we paid for our European aspirations. None of the European Union member countries paid such a price.

While visa-free travel for Ukrainians had in principle been agreed upon with the EU, it had yet to officially begin. Zerkal was expressing frustration with the lengthy process despite the fact that Ukraine had met all the conditions. She was not exactly accusing the EU of betrayal.

Another example is from the blog Free Speech Time. It posted an article on May 6, 2018, titled: “Watch: London Muslim Mayor Encourages Muslims to Riot During Trump’s Visit to the UK.” It began:

London Muslim mayor incited Islamic-based hatred against president Trump. He took every opportunity to lash out at the US president for daring to criticize Islam and to ban terrorists from entering America. Now he warns Trump not to come to the UK because “peace-loving” Muslims who represent the “religion of peace” will have to riot, demonstrate and protest during his visit to the UK. 

Sadiq Khan himself incited hatred against the US president among British Muslims. Shame on a Muslim mayor of London.  

The post includes this video as proof. However, the article provided no evidence for the claim made in the headline. An embedded interview video excerpt simply captured Mayor Khan stating:  “I think there will be protests, I speak to Londoners every day of the week, and I think they will use the rights they have to express their freedom of speech.”

When Khan was asked directly by the interviewer whether he “endorsed” such protests, he responded by saying: “The key thing is this — they must be peaceful, they must be lawful.” He doesn’t mention the words “Muslim,” “Muslims” or “Islam” once during the clip, as Maarten Schenk writes in Lead Stories.

If you need to check a quote, you can find the source using Google Advanced Search. You can define the time parameters and the site you are looking for. Sometimes the initial piece of news gets removed from the primary source but continues to spread via other outlets. You can locate the deleted material using a Google cache search or by looking at an archive of the source by date.
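The advice above can be condensed into a handful of documented Google search operators: quotation marks for exact phrases, site: to restrict the domain, and before:/after: for dates. Here is a minimal sketch that builds such queries, using the Zerkal quote from this section as the example phrase; the site and date window are illustrative.

```python
# Build Google queries for tracing a quote back to its source.
from urllib.parse import quote_plus

phrase = '"It feels like some kind of betrayal"'
queries = [
    f"{phrase} site:ft.com",                        # restrict to the original outlet
    f"{phrase} after:2016-11-01 before:2017-01-01", # restrict to a date window
]
for q in queries:
    print("https://www.google.com/search?q=" + quote_plus(q))
```

For pages that have since been deleted, the same exact-phrase query can be run against an archive of the source by date, as noted above.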

3.2 Presenting opinion as a fact

While reading an article, ask yourself: Is it a fact or someone’s opinion?

Some Russian media said in November 2015 that Turkey would be thrown out of NATO. Ukraina.ru reported: “Turkey should not be a member of NATO; it should be thrown out of the Alliance. This was announced by retired US Army Major General and senior military analyst for Fox News Paul Vallely.”

In fact, explained Stopfake.org, a retired military officer cannot speak for NATO or its members. Vallely was a critic of US policy and of then-US President Barack Obama. Obama had spoken in support of Turkey.

3.3 Distorting a fact

A story by the news channel Russia Today described Jewish people fleeing Kiev due to anti-Semitism under the new Ukrainian government, citing Rabbi Mihail Kapustin.

But a basic search showed that he wasn’t a rabbi of a Kiev synagogue, but instead the rabbi of one in Crimea. Having called for the defense of both Ukraine and Crimea from Russia, he was fleeing Crimea due to the new Russian government there, Stopfake.org found.

3.4 Presenting completely made-up information as fact

Basic searches can expose the falsity of some claims.

One prominent example in Ukraine concerned a supposedly “crucified boy.” But there was no evidence at all for the 2014 claim, made by a woman on the Kremlin’s official TV Channel One, according to Stopfake.org. She turned out to be the wife of a pro-Russian militant.

A lot of reports about so-called ISIS training camps in Ukraine appeared in Spanish-language media in 2017, but Advanced Google searches revealed no evidence for this, Stopfake.org reported.

Fake news creators also try to manipulate quotations, or even manufacture them.

Former Facebook Vice President Jeff Rothschild allegedly called for “a third world war to exterminate 90 percent of the world’s population.” But the supposed quote, first found in the Anarchadia blog, had no basis in fact at all, according to the fact-checking site, Snopes.com.

3.5 Neglecting important details that completely change the context of the news

In March 2017, Buzzfeed published a story claiming that Ukrainian Prime Minister Volodymyr Groysman agreed that Ukraine would help Turkey with refugees from Syria.

Citing a report in the Ukrinform state news agency, Buzzfeed contributor Blake Adams wrote that Ukraine would establish three refugee centers, attributing this to Middle East Research Institute director Ihor Semyvolos. But Semyvolos did not say anything about refugees or refugee centers, as he  pointed out in a Facebook post.

Buzzfeed also linked to a Facebook post by Yuriy Koval, who in turn linked to a blog called Vse Novosti and someone called Mykola Dobryniuk, both claiming that Ukraine had agreed to take in Syrian refugees from Turkey. Suspiciously, neither Koval nor Dobryniuk has any other posts on the internet.

After StopFake debunked this fake story, Buzzfeed pulled it from its site. 


4. Manipulation with Expert Assessments

The next method of falsifying reality uses fake experts or misrepresents real experts.

4.1. Pseudo experts and think tanks

Real experts are often well-known locally and in the professional community. They guard their reputations carefully. Pseudo-experts, on the other hand, often appear once and then disappear. To verify the authenticity of an expert, it is worth looking up his or her biography, social networking pages, website, articles, comments to other media and colleagues’ feedback on his or her activities.

On September 30, 2014, the newspaper Vechernyaya Moskva published an interview with Latvian political scientist Einars Graudins, who was presented as an “OSCE expert.” However, this person had no link whatsoever to the OSCE, the Organization for Security and Co-operation in Europe. This was confirmed by the OSCE mission in Ukraine through its official Twitter account.

First, look for these experts on the website of the relevant organization. If they are not there, contact the organization. The easiest way to do this is via Twitter or Facebook. Reputable organizations are interested in stopping the spread of fake news about them and their experts.

Some pseudo-experts appear in the media frequently. NTV in Russia reported on the “stormy reaction in the West” caused by Vladimir Putin’s statement from March 3, 2018, that the US was no longer the leading military power. This view was expressed by Daniel Patrick Welch, presented as an American political analyst.

But a Google search by The Insider found that Welch described himself as a “writer, singer, translator, activist-singing poet.” He occasionally published articles on politics in little-known online publications in which he criticized US policy as militaristic and expansionist. Welch sympathized with the militants in eastern Ukraine and called Ukraine’s official authorities a “junta controlled by Washington.” In Russia, however, the largest news agencies and TV companies cite him and take comments from him.

What appear to be reputable think tanks may be questionable.

A senior fellow at the Atlantic Council, Brian Mefford, debunked one such organization, the Center for Global Strategic Monitoring, which wrongly listed him on its website as one of its experts. He searched the website in vain for contact information to request that his name be removed.

The Center’s website appears to be an impressive and thoughtful news and opinion site at first glance, Mefford wrote. But it’s easy to discover that the organization is phony. First, the website re-publishes analysis and opinion pieces from real, respectable research institutions, apparently without permission. Then these legitimate pieces are mixed with “news” from Russian-controlled sources without attribution. The website even posts some fake articles under the names of distinguished think tank scholars.

4.2. Inventing experts from scratch

Sometimes completely fake persons are presented in the media as experts to promote particular political views or push the audience to certain decisions.

For example, “Senior Pentagon Russia Analyst LTC David Jewberg” maintained a popular Facebook page and was frequently quoted in Ukrainian and Russian media as a Pentagon insider on topics concerning Ukraine and Russia. He represented himself as an actual person with the legal name “David Jewberg.” A number of well-known Russian opposition figures frequently cited Jewberg as a respected analyst and real-life contact.

In its investigation, Bellingcat found that Jewberg was an invented persona connected to a group of individuals in the US revolving around American financier Dan K. Rapoport. A number of Rapoport’s personal friends and professional contacts helped prop up the fake persona: photographs of a college friend were used to represent Jewberg, and several of Rapoport’s friends wrote about Jewberg as if he were a real person.

Another example is Drew Cloud, who was occasionally quoted as one of the leading “experts” on US student loans. He turned out to be fake. This person pitched student loan stories to news organizations and offered to do interviews by email. Cloud often appeared on financial-advice sites as a guest writer or as the subject of an interview. He didn’t say where he attended college, but he did say that he, too, had taken out student loans. When people reached out to Cloud for his expertise on student debt, he often suggested that they refinance their loans.

It took the Chronicle of Higher Education to reveal that Cloud was a fictitious character created by The Student Loan Report, a website run by a student loan refinancing company.

4.3. Twisting experts’ statements or faking them

Often, manipulators distort the meaning of experts’ words, particularly by pulling phrases out of context.

In May 2018, tweets and blog posts about a television appearance by sexuality educator Deanne Carson went viral on social media. Users railed against her purported advice. She allegedly said parents should ask a baby’s permission before changing their diaper.

But this was an exaggeration. She said parents could ask children if it is okay to change their diapers to teach them that “their response matters,” noting that it is not actually possible for babies to consent to a diaper change, according to a review of her statement by Snopes.

There are cases when a real expert’s opinion is completely faked. To verify it, visit the site of the think tank or organization with which the expert is affiliated. Analyze the focus of his or her research, statements and articles. Do they agree with those in the news?

A prime example is a story, “American Victims of Terror Demand Justice,” on a website called CGS Monitor (which has since been taken down). The article was an attack on the US-Saudi Arabia alliance and was allegedly written by the noted Middle East analyst Bruce Riedel of the Brookings Institution. But when asked about it, Riedel confirmed that he did not write the article.

There were many clues that the article was not written by a native English speaker, the Atlantic Council wrote. The improper placement of nouns and frequent lack of articles like “a” and “the” strongly suggested that it was translated into English by a native Russian speaker.

Strategically, CGS Monitor had re-posted a few articles that were actually written by Bruce Riedel. Thus, there was enough real content from Riedel that an occasional fake opinion piece could go undetected and be assumed to be real. Such deception techniques would not fool a regional expert, but they might fool someone doing research or editing Wikipedia, as in this example.

4.4. Translating the words of an expert in a manipulative way

This method is often used when translating from English into other languages. Counteract it by finding the original piece and retranslating it.

Western countries, including Germany, imposed economic sanctions on Russia after it annexed Crimea in March 2014.

But the Kremlin’s transcript of an October 26, 2017 speech by German President Frank-Walter Steinmeier on Crimea edited his text to replace the word “annexation.” In the Russian translation, “annexation” became “re-unification.”

A similar “interpreter’s mistake” took place on June 2, 2015, when the Russian news agency RIA Novosti published a piece referring to a Financial Times blog. The RIA Novosti piece omitted negative references to Russia, contained distorted translations and favorably recharacterized the annexation, Stopfake found.


5. Manipulation with Media Messages

Our tendency to trust reputable media and treat them uncritically is used by propagandists and manipulators.

5.1. Using messages of marginal media or blogs

Suspicious messages are often spread by marginal media with solid-sounding names, claiming the stories come from reputable outlets.

Several Russian media outlets, including the business newspaper Vzglyad, cited “Western media” when reporting on a dispute over the repatriation of the bodies of 13 Americans killed while fighting in Ukraine.

But the “Western media” Vzglyad cited was an untrustworthy online newspaper, The European Union Times, StopFake learned. The newspaper’s links went to the site WhatDoesItMean.com. The author of this piece of news, Sorcha Faal, was an invented character who spreads rumors.

To counter this type of manipulation, go to the referenced sources and evaluate their credibility.

In another case, the Russian media quoted what turned out to be an anonymous blog post, Stopfake found. On August 16, 2015, Russia’s RIA Novosti posted an article about the Malaysia Airlines crash. The source was a German portal, Propagandaschau. The portal had published an opinion piece by someone nicknamed “Dok” and an article by a former political counselor at the Canadian Embassy in Russia, Patrick Armstrong, which had been posted on the pro-Russian site Russia Insider. RIA Novosti and RT presented Dok’s commentary as expert analysis. Russian media made no mention of Armstrong’s article, which contains previously refuted claims.

5.2. Doctoring real messages of reputable media

News reported by reputable media can get twisted by fake-news outlets.

For instance, a quote supposedly from California Congresswoman Maxine Waters about impeaching President Trump was digitally added into an image pulled from a CNN broadcast, Snopes and Politifact wrote.

In fact, the quote wasn’t hers at all and her image was taken from an interview she did on another subject.

5.3. References to non-existent reports by reputable media

Russian and Moldovan sites circulated a fake story which originally appeared in December 2017 claiming that a gold mine was discovered in Crimea. The Moldovan news site GagauzYeri.md reported February 10, 2016 that Russian geologists discovered the world’s largest gold mine.

The alleged source for this story was Bloomberg, but the hyperlink didn’t go to its website, the first sign that the news might be fake. A search on the Bloomberg website, as well as in Google, turned up no such story, Stopfake discovered.

In another case, a WhatsApp message touting a phony election survey in India was made to sound more credible by the inclusion of a link to BBC’s homepage, even though the BBC had not reported on the survey, according to an analysis by BOOM.


6. Manipulations with Data

Sociological survey data and economic indicators can be manipulated.

6.1 Methodological manipulation

Surveys may have weak methodologies.

For example, at the end of March 2018, Russian media reported that anti-Semitism had grown in Ukraine, but that Ukrainian authorities “are carefully concealing it.”

The Russian website Ukraine.ru cited a 72-page report produced by the Ministry of Diaspora Affairs of Israel showing that Ukrainian Jews had experienced more attacks (both verbal and physical) than Jews in all the republics of the former USSR.

But the report was not based on a systematic study, nor did its authors analyze the available data collected by organizations that monitor xenophobia in Ukraine. Judging by the sources cited, the authors made a mechanical tally of incidents, regardless of their severity or the reliability of the information. For example, both real cases of vandalism and verbal insults shouted during rallies were counted.

The most audacious statement in the report is that the number of anti-Semitic incidents in Ukraine doubled in comparison with the previous year. According to monitoring organizations, the number of acts of anti-Semitic vandalism grew, but only slightly, from 19 to 24. This is indicated by figures collected by the National Minority Rights Monitoring Group, which has monitored hate crimes in Ukraine for more than ten years. In 2017, no cases of anti-Semitic violence were recorded, and in 2016 there was only one, the head of the group said on Radio Liberty.

A careful look at the report showed it was not an in-depth assessment of the situation. However, the anti-Semitism narrative was one of the important components of the anti-Ukrainian propaganda campaign run by the Kremlin to justify aggression against Ukraine. Therefore, the abstracts of the report concerning Ukraine were readily taken up by Russian propaganda media.

Contrary evidence appeared in a survey by the US-based Pew Research Center of 18 countries in Central and Eastern Europe, which showed that Ukraine had the lowest percentage of anti-Semitic attitudes in Europe. In Russia, the document says, this level is almost three times higher.

The website Ukraine.ru criticized Pew’s methodology, arguing that asking respondents “Would you like to see Jews as your fellow citizens?” was not indicative of love or dislike for Jews.

The Pew survey, Religious Belief and National Belonging in Central and Eastern Europe, contains a methodology section explaining the research. It’s important to analyze and understand the methodology.

6.2 Misinterpretation of results

One of the attributes of propaganda is the attempt to appear truthful and authentic, so propagandists often resort to distorted survey results.

Russian pro-Kremlin site Ukraina.ru published a story about Fitch Ratings’ latest outlook for Ukraine, focusing only on the negative elements and ignoring the overall stable forecast. Using only the first sentence of the Fitch report, Ukraina.ru claimed Ukraine had the third largest shadow economy in the world after Azerbaijan and Nigeria.

The first sentence of the Fitch report reads: “Ukraine’s ratings reflect weak external liquidity, a high public debt burden and structural weaknesses, in terms of a weak banking sector, institutional constraints and geopolitical and political risks.”

This is the only bit of information that Ukraina.ru took from the Fitch outlook, completely ignoring the sentence which follows: “These factors are balanced against improved policy credibility and coherence, the sovereign’s near-term manageable debt repayment profile and a track record of bilateral and multilateral support.”

The best method to refute such misrepresentation is to find and explore the full report.

Another manipulative claim by Ukraina.ru was that most Ukrainians are not interested in visa-free travel to the EU at all.

The source for this fake claim is a poll by the Democratic Initiatives Foundation conducted at the beginning of June 2018.

One of the questions was: “How important is the introduction of the visa-free regime with the EU-countries for you?” The results showed that 10 percent answered “very important,” 29 percent chose “important,” 24 percent said “slightly important,” 34 percent opted for “not important” and nearly 4 percent responded that it was difficult to say.

Only 34 percent said that visa-free travel to EU countries was not important.

But Russian media decided to combine “slightly important” and “not important” to produce the figure of 58 percent, and claimed that the majority of Ukrainians were not at all interested in this opportunity.

However, when you add the numbers who answered “very important,” “important” and “slightly important,” you get 63 percent, indicating that 63 percent of Ukrainians feel that visa-free travel has some level of importance.
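The arithmetic behind the two competing readings of the poll is easy to check; here is a short sketch using the percentages quoted above.

```python
# The two competing readings of the Democratic Initiatives poll.
# Percentages are as quoted in the text; "difficult to say" is rounded,
# so the total is close to but not exactly 100.
results = {
    "very important": 10,
    "important": 29,
    "slightly important": 24,
    "not important": 34,
    "difficult to say": 4,
}

# The manipulative framing lumps "slightly important" with "not important".
print(results["slightly important"] + results["not important"])  # 58

# Counting every answer that assigns some importance gives the opposite picture.
print(results["very important"] + results["important"] + results["slightly important"])  # 63
```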

6.3 Invalid comparisons

In February 2018, Ukraina.ru published an article claiming that food prices in Ukraine had grown equal to those in Europe. The claim was based on a Facebook post by former Ukrainian Prime Minister Mykola Azarov, whose post in turn relied on RIA Novosti data presented in an attractive but questionable infographic.

According to the Numbeo cost of living index, Ukraine is the least expensive country in Europe, along with Moldova, Macedonia and Albania. The site also compares prices of foodstuffs in various cities of the world, showing that Ukrainian prices on average have a long way to go before they reach European levels. So comparing absolute numbers without taking other indicators into account is incorrect, Stopfake determined.

Try to differentiate the real numbers and facts from the fake information. Very often, these kinds of fakes include some real numbers, plus numbers from suspicious and false sources.

A Canadian example that has circulated on social media since 2015 concludes that Canada spends more money on refugees than on pensioners.

It wasn’t true, according to Canadian government information. Some government-assisted refugees get a small monthly amount in their first year in Canada — about $800 for a single person — and a one-time set-up allowance of about $900. They may also get a loan of a few hundred dollars for rental or other deposits. There are sometimes small one-time allowances  for pregnant women, newborns and young children in school. But government-assisted refugees are required to pay back the cost of their trip to Canada and their initial medical exam — with interest.

Asylum-seekers in Canada get no social assistance until they are permanent residents, at which point they’re eligible for provincial social assistance just like anyone else. Privately sponsored refugees aren’t eligible for any social assistance — they are the financial responsibility of their sponsors for the duration of the sponsorship, which is usually about a year.

By contrast, single older Canadians in the lowest income bracket get at least $1,300 a month through Guaranteed Income Supplements and Old Age Security pensions, the government website says.

And a 2004 study indicates the vast majority of refugees’ income is earned from employment, not social assistance, within seven years of their arrival in Canada. They sometimes perform better than business-class or family-class migrants.

But the myth persists and versions have sprung up in the US, Snopes found, as well as in Australia, according to ABC News. The highest-traffic page on the Canadian Council for Refugees group’s website debunks the myth.

A Brief Summary

Here’s what you need to pay attention to when reading public opinion polls and other research:

  • Is the methodology described?
  • How are the questions formulated? Sometimes they are devised to manipulate and suggest certain answers.
  • What is the sample of respondents: by age, place of residence and other characteristics? Is the sample statistically sound?
  • What is the reputation of the researcher?  Is he or she known in the professional community?
  • Who paid for the research? Serious research centers never hide their clients if they publicize the data.
  • Compare and contrast the result of research to other data and findings. If they are strikingly different, the results must be questioned.

A First-Aid Kit

These tips classify and describe the most common methods of deception with statistics and offer quick ways everyone can use to verify information. This is a “first-aid fact-checking kit” and also is a starting point to dig deeper in any of the directions that have been suggested.

 

Olga Yurkova is the co-founder of the Ukrainian fact-checking project StopFake and of Forbidden Facts, an international project that aims to debunk fake news and teach people about the mechanisms behind it. StopFake covers media sources in 13 languages, conducts academic research on fake news and offers institutional training. Since launching in 2014, the organization has verified tens of thousands of articles, photos and videos, and revealed more than 3,000 misleading cases.