It’s been the biggest year for elections in human history: 2024 was a ”super-cycle” year in which 3.7 billion eligible voters in 72 countries had the chance to go to the polls. These were also the first AI elections, where many feared that deepfakes and artificial intelligence-generated misinformation would overwhelm the democratic processes. As 2024 draws to a close, it’s instructive to take stock of how democracy did.
In a Pew survey of Americans from earlier this fall, nearly eight times as many respondents expected AI to be used mostly for bad purposes in the 2024 election as those who thought it would be used mostly for good. There are real concerns and risks in using AI in electoral politics, but it definitely has not been all bad.
The dreaded misinformation apocalypse has not materialized – at least, not due to AI. And candidates are eagerly adopting AI in many places where it can be constructive, if used responsibly. But because this all happens inside campaigns, and largely in secret, the public often doesn’t see all the details.
Connecting with voters
One of the most impressive and beneficial uses of AI is language translation, and campaigns have started to take advantage of it. Local governments and prominent politicians, including Indian Prime Minister Narendra Modi and New York City Mayor Eric Adams, used AI to translate meetings and speeches for their diverse constituents.
Even when politicians themselves aren’t speaking through AI, their constituents might be using it to listen to them. Google rolled out free translation services for an additional 110 languages this summer, available to billions of people in real time through their smartphones.
Other candidates used AI’s conversational capabilities to connect with voters. U.S. politicians Asa Hutchinson, Dean Phillips and Francis Suarez deployed chatbots of themselves in their presidential primary campaigns. The fringe candidate Jason Palmer beat Joe Biden in the American Samoa primary, at least partly thanks to using AI-generated emails, texts, audio and video. Pakistan’s former prime minister, Imran Khan, used an AI clone of his voice to deliver speeches from prison.
Perhaps the most effective use of this technology was in Japan, where an obscure and independent Tokyo gubernatorial candidate, Takahiro Anno, used an AI avatar to answer questions from voters and managed to come in fifth among a highly competitive field of 56 candidates.
Nuts and bolts
AIs have been used in political fundraising as well. Companies now market AI tools to help draft fundraising emails, while other AI systems help candidates identify and target prospective donors. It’s notoriously difficult to measure the impact of these kinds of tools, and political consultants are cagey about what really works, but there’s clearly interest in continuing to use these technologies in campaign fundraising.
Polling has been highly mathematical for decades, and pollsters are constantly incorporating new technologies into their processes. Techniques range from using AI to distill voter sentiment from social networking platforms – something known as ”social listening” – to creating synthetic voters that can answer tens of thousands of questions. Whether these AI applications will result in more accurate polls and strategic insights for campaigns remains to be seen, but pollsters are motivated by the ever-increasing challenge of reaching real humans with surveys.
On the political organizing side, AI assistants are being used for a wide range of purposes, from drafting campaign messaging to helping coordinate canvassing and get-out-the-vote efforts. In Argentina in 2023, both major presidential candidates used AI to develop campaign posters, videos and other materials.
In 2024, similar capabilities were almost certainly used in a variety of elections around the world. In the U.S., for example, a Georgia politician used AI to produce blog posts, campaign images and podcasts. Even standard productivity software suites like those from Adobe, Microsoft and Google now integrate AI features that are unavoidable – and perhaps very useful to campaigns. Other AI systems help people who are looking to run for higher office.
Fakes and counterfakes
And there was AI-created misinformation and propaganda, even though it was not as catastrophic as feared. Days before a Slovakian election in 2023, AI-generated audio recordings impersonating a candidate discussing election manipulation went viral. This kind of thing happened many times in 2024, but it’s unclear whether any of it had any real effect.
In the U.S. presidential election, there was a lot of press after a robocall impersonating President Joe Biden told New Hampshire voters not to vote in the Democratic primary, but that didn’t appear to make much of a difference in that vote. Similarly, AI-generated images from hurricane disaster areas didn’t seem to have much effect, and neither did a stream of AI-generated images and videos misrepresenting candidates’ actions and seemingly designed to prey on their political weaknesses.
AI also played a role in protecting the information ecosystem. OpenAI used its own AI models to detect and disrupt foreign influence operations aimed at sowing division before the U.S. presidential election. While anyone can use AI tools today to generate convincing fake audio, images and text, and that capability is here to stay, tech platforms also use AI to automatically detect and moderate harmful content like hate speech and extremism. This is a positive use case, making content moderation more efficient and sparing humans from having to review the worst offenses, but there’s room for it to become more effective, more transparent and more equitable.
There is potential for AI models to be much more scalable and adaptable to more languages and countries than organizations of human moderators. But the implementations to date on platforms like Meta demonstrate that a lot needs to be done to make these systems fair and effective.
One thing that didn’t matter much in 2024 was corporate AI developers’ prohibitions on using their tools for politics. Despite market leader OpenAI’s emphasis on banning political uses and its use of AI to automatically reject requests to generate images of political candidates, the company’s enforcement has been spotty, and actual use of its tools in politics is widespread.
The genie is loose
All of these trends – both good and bad – are likely to continue. As AI gets more powerful and capable, it is likely to infiltrate every aspect of politics. This will happen whether the AI’s performance is superhuman or suboptimal, whether it makes mistakes or not, and whether the balance of its use is positive or negative. All it takes is for one party, one campaign, one outside group, or even an individual to see an advantage in automation.