We are entering the era of the Infocalypse. Disinformation is taking on a life of its own, learning from artificial intelligence and inserting itself not only into text but into video too. This era will increasingly divide families and polarise politics. It will benefit malicious actors, such as those who this week produced fake videos appearing to show missiles striking the port of Beirut just before its catastrophic explosion on Tuesday, in an attempt to inflame tensions between Lebanon and Israel. It is, in the words of American comedian and writer Jordan Peele, a “f****d up dystopia.”
Nina Schick explores this and more in her new book, Deep Fakes and the Infocalypse. Schick begins with the devil she knows best: disinformation from the Russian regime. Over the last decade, Putin has become a master of the Infocalypse, deploying a combination of mainstream media outlets, social media bots, and hackers to muddle reality. The 2014 invasion of eastern Ukraine simply didn’t happen, according to Russia Today, and Moscow’s bots will be damned if you dare link Russian equipment to the downing of Malaysia Airlines Flight 17.
These were incredibly effective campaigns – for years, respectable Western analysts debated whether Russia was right. The same tools have also been deployed to wreak havoc in our own countries. Schick notes that Emmanuel Macron’s presidential campaign, which she worked on, was aggressively targeted two days before the election by the same Russian hackers believed to be behind the hack of Clinton’s campaign a year earlier. Russia’s ‘Internet Research Agency’ incited culture wars during the last American presidential campaign by creating Facebook pages both for and against the Black Lives Matter movement.
The information threat is growing from further east, too. China is beginning to show an ‘aggressive’ new interest in infiltrating Western platforms such as Facebook and YouTube. Its immediate response to the coronavirus pandemic was a campaign to sow confusion about the origins of the virus, which included Zhao Lijian, the Foreign Ministry spokesperson, sharing an article that claimed the virus had originated in America. These belligerent lies arguably backfired on Beijing, but the general confusion they fuelled has set back public health efforts across the West.
Yet the most striking chapter in Schick’s book isn’t about foreign actors, but about ourselves. We are tearing our own societies apart, with each side of an increasingly polarised public utilising the Infocalypse to troll the other. Trump’s rise to the presidency may have been aided by foreign interference, but much of the disinformation came from his own Twitter account. Today, he continues to post lies and conspiracy theories on an almost daily basis. America, Schick says, may be approaching a “tipping point”, beyond which democratic systems and society can no longer cope with disinformation.
That tipping point may come sooner rather than later. The pandemic means this year’s US presidential election will be conducted largely by mail-in ballot, and in some states those ballots take weeks to count. Trump will therefore have ample time to sow doubt about the legitimacy of the result. You can imagine the tweet a week out from the voting deadline: “Wow! Video shows ballot box being stolen from the back of a van! Another election???” The video would probably be a “cheapfake”, a miscontextualised or badly edited clip, but it would nonetheless be coming from the account of the most powerful man in America, whose agenda-setting powers are second to none. The subsequent denunciations alone would cast doubt on the electoral system.
There is more bad news still. The problems coming down the road are an order of magnitude more dangerous. Artificial intelligence is not an insecure 70-year-old tweeting from his presidential toilet seat – it’s a supercomputer. It can quickly learn new ways to convince people of a lie, and it already has the power to generate wholly synthetic media, or “deepfakes”, as demonstrated by this clip of Barack Obama appearing to say “President Trump is a total and complete dipshit.” Imagine the carnage if a scandalous deepfake were to go viral in the final days of a political campaign.
Yes, this book is thoroughly depressing. But not all is lost. Schick provides a list of actions that we as individuals can take to improve the information environment. The first line of defence is to learn what is happening: to understand what malicious state actors are doing with their disinformation resources, and just how convincing disinformation can be. The second is to counteract it with accurate information, to adopt supportive technical tools, and to legislate adequate regulation of artificial intelligence. And finally, we have to fight. Media companies and social media platforms need to collaborate and call out disinformation campaigns. A good example is the combined work of CNN, Graphika, Facebook, and Twitter in exposing Operation Double Deceit, the Russia-linked disinformation operation run out of Ghana.
Schick leaves us on a hopeful note. The forces needed to fight the Infocalypse are already coming together and growing in strength, she says. This is an important book, one that forensically exposes the nature of the threat.