
Wyoming lawmakers divided on AI governance laws

By Hannah Shields, Wyoming Tribune Eagle, via the Wyoming News Exchange

CHEYENNE — Lawmakers seemed divided this week over how to legislate AI governance in Wyoming, with some committee members arguing that overly broad legislation would infringe on First Amendment free speech rights.

Members of the Legislature’s Select Committee on Blockchain, Financial Technology and Digital Innovation Technology continued discussion from their May meeting on Senate File 51, “Unlawful dissemination of misleading synthetic media.”

The bill was drafted last interim and made it halfway through the 2024 budget session before dying in the House of Representatives.

SF 51, as written, prohibits the distribution of synthetic media, or deepfakes, with the intent to deliberately mislead people and spread misinformation.

Deepfakes are content digitally altered with AI technology to manipulate a person’s image or likeness.

The bill requires a posted disclaimer with any digitally altered content using AI technology.

However, it is difficult to legislate defamation in the context of free speech protections.

The question committee members asked themselves on Monday was what to do when one person’s free speech is impeded by another’s.

Cheyenne accountant David Pope, who serves as one of the governor’s liaisons to the committee, said AI technology can create videos or voice memos of people saying things they didn’t actually say in real life.

This kind of digitally altered content could ruin a person’s reputation or career, he said.

“I believe that if I am saying something, I have a right to say it, even if someone else doesn’t agree with that,” Pope said. “But that someone else cannot, and should not, be able to manipulate my voice … to further their own ends.”

Committee co-Chair Sen. Chris Rothfuss, D-Laramie, is adamantly supportive of the bill. He said on Monday, and in previous meetings, that AI governance laws in other states contain too many loopholes and lack the “teeth” to be effective.

His concern in drafting legislation for Wyoming, Rothfuss said, is passing another bill that will be just as ineffective. However, Sen. Affie Ellis, R-Cheyenne, who has a background in law and political science, said the bill is too broad and mixes in too many legal areas.

“We’re mixing theories of law that don’t always mix,” Ellis said. “And for politicians’ defamation, we’re held to a lower standard. I mean, it’s really hard for a politician to say ‘This is defaming me.’”

She argued the bill should be narrowed to a specific component, such as regulating campaigns. Sen. Tara Nethercott, R-Cheyenne, who is a practicing attorney and supporter of SF 51, said the focus should be on protecting the public, rather than political candidates.

“I certainly care about the candidate deeply. But I think when we’re talking about campaigns, what we’re really concerned about is harm to the public,” Nethercott said.

Republican Rep. Ocean Andrew remained opposed to SF 51 and the concept of regulating deepfakes as a whole. The Laramie representative said any laws regarding AI governance would unconstitutionally hinder a person’s freedom of speech.

“Free speech is potentially dangerous, but it’s something that we tolerate, regardless, and there are risks that we have to accept there,” Andrew said. “People are going to have to slowly learn to be more skeptical of the information they’re finding online. I’m not sure it’s the state’s job to get involved in that process.”

Co-Chair Rep. Cyrus Western, R-Big Horn, Nethercott and Pope all expressed support of the bill and the concept behind it. Reps. Mike Yin, D-Jackson; Daniel Singh, R-Cheyenne; and Ellis said SF 51 is too broad, but added they would support a bill narrowly tailored to campaigns for elected office.

Liability of platforms, distributors

Another aspect of Monday’s conversation was whether platforms, such as social media, should be held liable for the dissemination of deepfake media.

TechNet representative Ruthie Barko told lawmakers social media platforms should be let off the hook for the spread of deepfakes. TechNet is a national, bipartisan group that works with policymakers at the federal, state and local levels to advocate for America’s tech industry, including AI tools and technology.

Barko compared the situation to a fraudulent charge on a bank account.

“(Say) your bank doesn’t find in time that they flagged fraudulent activity on your account,” Barko said. “They take care of that when you flag it, they take the transaction down. But does the bank actually need to be liable for the fact that that transaction happened?”

Project Citizen spokesperson Ilana Beller told lawmakers distributors should be held accountable for intentionally misleading deepfakes, since it’s not always easy to track down the creator.

“Also, it’s not clear that in all cases, the creator would want what they created to be disseminated,” Beller said.

Both Beller and Barko referenced an incident in New Hampshire earlier this year, in which residents received a robocall impersonating President Joe Biden two days before the state’s primary election. According to an NPR article, the voice behind the robocall told voters not to participate in the upcoming primary, but to wait to vote in November for the presidential election.

New Hampshire’s attorney general was able to track the robocall to a Texas telemarketing company, the article reported, and investigated it for illegal voter suppression. A cease and desist letter was issued by the Federal Communications Commission’s Enforcement Bureau to the Texas company on Feb. 6. The letter included definitions for “deepfake” media and outlined violations of law committed by the company.

“I think when a state at least starts codifying the legal terms and the definitions and enforceability around these cognizable harms, that is an excellent step,” Barko said.

Beller said existing laws in the deepfake space make it easy to prosecute cases of illegal voter suppression, such as the one in New Hampshire. However, those laws don’t cover the majority of misinformation spread by deepfake media, she added.

Committee’s next steps

Rothfuss proposed holding onto SF 51 for discussion at the committee’s next and final scheduled meeting, with a couple of changes: one, change the penalty in the bill from criminal to civil, and two, add an exemption for satire and parody in the bill.

He also suggested creating a bill parallel to SF 51 that is narrowly tailored to campaigns for elected office, per discussion with other committee members. These proposals met with no objection from the rest of the committee, although Andrew remained firm in his position against any AI governance legislation.

The committee will meet Sept. 16-17 at the University of Wyoming in Laramie. Information on the meeting and how to view it via livestream is available at wyoleg.gov.

This story was published on July 6, 2024.
