AI deepfakes could decide the election in November, experts warn, after just 44,000 votes handed Joe Biden the presidency in 2020

Political consultants Julius van de Laar (left) and Craig Oliver (right) fear AI could sway close elections.
Joe Maher—Fortune Brainstorm AI

The 2020 election swung in Joe Biden’s favor by just 44,000 votes in three swing states, a margin thin enough to stoke fears that deepfakes spread by generative artificial intelligence could alter the outcome of the 2024 U.S. presidential election.

Although Biden won the popular vote by more than 7 million in 2020, had Donald Trump picked up Arizona, Georgia, and Wisconsin—the three states he lost to Biden by a combined 44,000 votes—the Electoral College would have been tied, sending the decision to the House of Representatives.

With such a razor-thin margin and only around 200 days until the Nov. 5 vote, campaigns will pour millions into courting marginal voters in those states and in others that could flip from blue to red or vice versa.

“This is going to be an incredibly tight campaign, and it probably will come down to just a few votes in just a few counties in just a few swing states, so AI might just make the difference,” said Julius van de Laar, a political strategist and former Obama campaign aide.

Speaking at the Fortune Brainstorm AI conference in London on Tuesday, van de Laar and other experts warned that generative AI could easily sway enough Americans to tip the result, so long as disinformation was targeted at a handful of key battlegrounds.

Van de Laar said fake robocalls could be created with ElevenLabs products for free and with almost no effort.

Craig Oliver, the former communications director for 10 Downing Street during David Cameron’s premiership, said 95% of the conversation about AI’s impact on politics centers on deepfakes.

“The problem is this space is almost completely unregulated. You’re in a situation where there are no rules. And the truth about politics is if there isn’t a rule someone will say, ‘Well, my opponent will use it if I don’t,’” said Oliver, now the global co-head of strategy and reputation at communications consultancy FGS Global. “It becomes a race to the bottom.”

The former Cameron advisor argued that society’s fragmentation into deeply tribal factions has created fertile ground for bad-faith actors.

Such actors could use AI to reframe the narrative and present what have since come to be known as “alternative facts” to a constituency all too willing to believe them.

Deliberately spreading disinformation to influence public opinion is nothing new. It has been employed successfully for decades, for example by the tobacco industry, which sought to sow doubt over whether smoking actually caused cancer.

Steve Bannon, formerly chief strategist in the Trump White House, had his own catchphrase for it: “Flood the zone with shit.”

Unfortunately, there is no way to guarantee AI won’t influence the outcome of the election, according to Tara McGuinness, a former senior advisor to President Obama.

“In the short run you can’t,” said the founder and executive director of New Practice Lab. “If you have no quality control, which I would argue we don’t, then you have the potential to amplify misinformation.”