This post originally appeared on RUSI.org on 11 March 2020
The most seductive conspiracy theories mix elements of the truth with hyperbole and inaccuracy. Disinformation campaigns which aim to exploit public health concerns pre-date the Internet. ‘Operation Infektion’ was a Soviet disinformation campaign in the 1980s which claimed that the US military was behind the AIDS pandemic. The techniques of public deception and manipulation used by both the US and Soviet Union during the Cold War are clearly visible in coronavirus disinformation.
Successful disinformation campaigns mix ‘lies and half-truths with undeniable facts’. For example, Vladimir Zhirinovsky, a Russian member of parliament, has claimed that coronavirus is a US-made bioweapon designed to win the ongoing trade war with China. Zhirinovsky also made links between the US pharmaceutical industry and the spread of the virus – a common trope of conspiracy theories.
Conspiracy theories also draw false links between unrelated events. Topcor, a Russian news site, drew on US Congressman Chris Smith’s request for the release of documents relating to Department of Defense experiments with ticks and insects to imply that they were being used for biological warfare. By trying to draw a causal link, based on a complete lack of evidence, between these experiments and coronavirus, Topcor attempts to create an alternative set of facts.
Another way to start a conspiracy theory is by asking fabricated questions which rest on false premises and invite the target audience to make irrelevant deductions or draw particular conclusions. Katyusha, another Russian news site, argues that virus outbreaks are often caused by secret American laboratories experimenting on people. Katyusha claims that the US does not acknowledge that it conducts such experiments – meaning such conspiracy theories are unfalsifiable. But this does not matter, for the purpose here is merely to plant an idea and sow doubt in an audience.
Conspiracy theories also rely on repetition to spread. This task is much easier to accomplish today than it was in the 1980s, as networks of bots spread the same (or very similar) messages repeatedly over social media. Because these accounts are largely automated, they are very cheap to maintain, making them an effective way of spreading messages at scale. ‘Thousands of Russian-linked social media accounts’ have already been identified as spreading coronavirus disinformation and coordinating in ways that disrupt public health efforts.
These bots use networks established during past disinformation campaigns. They are able to segment social media users based on how susceptible they are to other conspiracy theories. For example, users who have previously interacted with Russian bots spreading vaccine disinformation are likely to be targeted again.
For the moment, coronavirus conspiracies serve two fundamental purposes. The first is to discredit the US as a whole – to paint it as a global spreader of disease and pain, just as Operation Infektion did. The second is to damage the reputation of agencies tasked with fighting epidemics, especially domestic US agencies like the Centers for Disease Control and Prevention.
This is not the first time that disinformation has been used to create a public health risk. In recent years, anti-vaccination disinformation campaigns have proved alarmingly effective in discouraging citizens from getting vaccinated against preventable but devastating diseases. Public health campaigns are starting to shift in response, from simply providing citizens with facts about the efficacy of vaccines to offering emotional narratives that help drive behavioural change – tactics which may prove effective in fighting coronavirus disinformation too.
Social media companies are already being forced to respond to the conspiracy theories. Facebook is banning adverts that ‘promise to cure coronavirus or incite panic around the outbreak’. The World Health Organization (WHO) is working directly with Google to help stop the spread of disinformation. Google can manually adjust search rankings to suppress results that are damaging to public health, and will be prioritising information from the WHO even where that information is not optimised for search and would not normally rank as highly. The NHS has also been working directly with Twitter to help identify misinformation. These are important public health steps.
The UK government has set up a special ‘cross-Whitehall’ unit in the Department for Digital, Culture, Media and Sport. Although the unit will initially be looking into disinformation coming from Russia and China, it is likely that the scope will widen to include other countries that have been known to use similar tactics in the past, such as Iran.
These measures can make a significant impact on how coronavirus conspiracy theories are propagated. But the most effective inoculator against such conspiracy theories remains the public at large, who should stay alert to malicious sources of information.