- cross-posted to:
- technology@lemmy.world
- worldnews@lemmygrad.ml
AI models can meaningfully sway voters on candidates and issues, including by using misinformation, and they are also evading detection in public opinion surveys, according to three new studies.
Scientists are raising alarms about the potential influence of artificial intelligence on elections, according to a spate of new studies that warn AI can rig polls and manipulate public opinion.
In a study published in Nature on Thursday, scientists report that AI chatbots can meaningfully sway people toward a particular candidate—providing better results than video or television ads. Moreover, chatbots optimized for political persuasion “may increasingly deploy misleading or false information,” according to a separate study published on Thursday in Science.
“The general public has lots of concern around AI and election interference, but among political scientists there’s a sense that it’s really hard to change people’s opinions,” said David Rand, a professor of information science, marketing, and psychology at Cornell University and an author of both studies. “We wanted to see how much of a risk it really is.”
The last federal election cycle was a constant deluge of very bad renderings of Trump saving babies from fires, and all the old ladies and Joe Rogan listeners had no idea what was real, because the prior scourge, the attention-span-eroding algorithm, has trained people not to look at anything for more than half a second.
These things aren’t even necessarily designed or used to make people think that Trump is actually a beefcake riding a tank; they’re supposed to make everyone doubt what’s real. That’s all they have to do: get people to tune out and dismiss everything they read and hear as “AI.” That’s the real power of this shit; it makes us not want to be involved because it forces us to triple-check everything we see and hear.
And this is exactly what exit polling in 2024 implied.