Authored by Naveen Anthrapully via The Epoch Times,
German President Frank-Walter Steinmeier has warned that artificial intelligence could become a threat to democracy, calling for better digital literacy so the public can identify AI-manipulated content.
“Neither the AI nor the companies operating it are democratically elected,” Steinmeier said while delivering his remarks during a recent event in Berlin, according to Bloomberg.
“Citizens should be better equipped to scrutinize answers provided by AI, and recognize when artificial intelligence technologies have manipulated images or text. This is important since disinformation can now be generated and spread quickly, triggering confusion among the public,” the German president noted.
While liberal democracies may recognize the potential dangers of AI, authoritarian powers view these developments in a different light, he stated. “That’s all the more reason for us to be clear in our stance.”
Steinmeier called for societies to develop legal and ethical frameworks to monitor AI.
“We’ve been warned that potentially uncontrollable risks are coming our way … And that deserves our attention.”
The German president’s comments come after Sam Altman, the CEO of OpenAI, which is behind the AI chatbot ChatGPT, admitted back in May that artificial intelligence could pose a threat to democracy.
During a hearing before the U.S. Senate Committee on the Judiciary, Altman raised concerns that AI could be used to spread misinformation during elections. “My worst fears are that we—the field, the technology, the industry—cause significant harm to the world. I think that can happen in a lot of different ways,” he said.
Digital Authoritarianism via AI
During the World Movement for Democracy’s 11th Assembly last year, Eileen Donahoe, a board member of the National Endowment for Democracy (NED), pointed out that artificial intelligence has “turbo-charged” pre-existing forms of repression, providing authoritarian regimes with new “social engineering tools” to shape the behavior of citizens.
“It’s important to recognize digital authoritarianism as a phenomenon that’s about much more than the repressive application of tech,” she said.
“We have to see it as an alternative model of governance that’s spreading around the world, competing with democracy.”
Some experts worry that AI-fueled misinformation could become prevalent in the 2024 U.S. presidential race, with videos, images, and text generated via artificial intelligence used to sway and influence the public.
Such misuse of AI is already happening. A Twitter account called DeSantis War Room recently shared a video featuring several images of former President Donald Trump and Dr. Anthony Fauci, three of which depicted Trump hugging and kissing Fauci on the face, with the caption “REAL LIFE TRUMP,” suggesting a close relationship between the two.
However, those three images were later identified as fake and AI-generated.
In a May 3 statement, the American Association of Political Consultants (AAPC) condemned the use of “deceptive generative AI content” in political campaigns, calling it a “troubling challenge” to free and fair debate on political ideas.
AI-Targeted Manipulation
In a Jan. 4 Epoch Times commentary, Anders Corr, a principal at Corr Analytics Inc., warned that artificial intelligence is pushing the United States and the world to the “cusp of the greatest threat” to democracy and human agency.
“AI-enabled tech will be capable of surveilling, micro-targeting, and influencing democratic populations in ways that were previously impossible through traditional state monitoring and privately-developed social media algorithms,” he said.
“AI will be able to find minimum winning coalitions in n-dimensional political space to determine short- and medium-term political goals, for example, and then influence those populations through AI content production.”
Artificial intelligence favors existing power centers, he noted: because those already in power stand to benefit most from AI technologies, the result will be further centralization of power, Corr predicted.