By Jack Kelly
Wisconsin Watch
Forward is a look ahead at the week in Wisconsin government and politics from the Wisconsin Watch statehouse team.
Lawmakers in the Wisconsin Assembly are poised to vote Thursday on legislation that would require a disclosure when campaign communications feature content generated by artificial intelligence.
Modeled after a proposal enacted in Washington state, the bipartisan bill would require political communications that feature “synthetic media,” meaning AI-generated audio or video, to carry a disclaimer. Video ads would have to display text noting the content is AI-generated, while audio ads would have to play the phrase “contains content generated by AI” at both the beginning and end of the communication.
A Senate committee held a public hearing on the legislation last week. Under Senate rules, once the Assembly votes on the bill Thursday, it can be taken up on the Senate floor without a committee vote, expediting the process.
The explosion of generative AI — a type of AI that produces text, audio, video and other content in response to a user’s prompt — has spurred lawmakers to act on the bill before the 2024 election cycle heats up in Wisconsin, said Rep. Clinton Anderson, D-Beloit, one of the bill’s co-authors.
“No matter what side of the aisle you’re on, you’re probably going to want to know what’s real and what’s not,” Anderson told Wisconsin Watch in an interview. “AI has not become a partisan issue yet. So it’s really easy to work across the aisle when no lines have been drawn yet.”
That sentiment was echoed by Rep. Scott Krug, R-Nekoosa, who chairs the Assembly Committee on Campaigns and Elections. “It was probably one of the smoothest bills we did all session,” he told Wisconsin Watch.
Krug added that the legislation would help head off problems that could arise if AI-generated content were mistaken for the real thing. He noted that over the past three years, many concerns about the 2020 presidential election in Wisconsin have been rooted in misconceptions. The bill would help keep people from putting too much stock in AI-generated content, he said.
Both lawmakers said the legislation was a good first step in responding to the spread of AI, especially as the technology’s capabilities grow. However, both noted their work is not finished. Krug said the committee will continue to meet throughout the summer — even as most lawmakers return to their districts to campaign — to stay on top of new developments. Anderson acknowledged the legislation could need updating in the future.
But even as policymakers begin to take action, AI researchers told Wisconsin Watch the technology could still cause headaches in 2024 — especially in local legislative races and for local election officials.
There have already been some high-profile examples of AI being used to try to influence elections, both in the United States and overseas. Last month, a robocall to New Hampshire voters featured an AI-generated imitation of President Joe Biden’s voice telling them not to vote in the state’s presidential primary and instead to “save your vote for the November election.” In September 2023, an AI-generated recording that mimicked a leading Slovak politician claiming to have rigged the election circulated two days before the country’s parliamentary elections.
“Everything that we think about at the national level is, I think, more alarming at the local level,” said Valerie Wirtschafter, a fellow at the Brookings Institution who studies artificial intelligence and democratic systems.
For example, people generally know what the president’s voice sounds like, but far fewer have ever heard their local representatives speak. That makes local races more vulnerable to AI-generated deepfakes — as well as influence campaigns and voter suppression schemes, Wirtschafter said.
Zeve Sanderson, executive director of New York University’s Center for Social Media and Politics, a research lab that studies how new technologies shape politics and policy, also said AI’s influence could be greater at the local level.
“When it comes to something like a Joe Biden deepfake, it was identified over the course of minutes, maybe, (at the) longest, hours,” Sanderson told Wisconsin Watch. “And that’s not going to be the case for more local elections.”
Wirtschafter and Sanderson also expressed concerns about how AI could be used to target election systems.
The technology can craft sophisticated phishing attacks that target election officials, convincingly impersonate election vendors or officials, and flood election officials with information and records requests, Sanderson said. One or a combination of such uses could compromise or gum up election systems.
It’s critical for clerks to stay up to date on cybersecurity best practices so they are prepared to counter any potential bad actors, Wirtschafter said, adding that there are resources available for election officials to reference as they prepare for 2024.
“Artificial intelligence undoubtedly raises concerns about its ability to perpetuate inaccurate election information in 2024 and beyond,” said Wisconsin Elections Commission spokesperson Riley Vetterkind.
The elections commission is working “with our state and federal partners, including the Cybersecurity and Infrastructure Security Agency, to ensure the WEC adopts best practices for how to address the threat that AI may pose,” he added in a statement to Wisconsin Watch.
The decentralized nature of Wisconsin’s elections, in which 1,850 municipal clerks and 72 county clerks administer voting, could make them more vulnerable to such attacks.
“That’s the stuff that keeps me up at night,” Sanderson said. “Not the Joe Biden deepfake.”
This story was originally published by Wisconsin Watch at wisconsinwatch.org.