Nothing seemed off about Rep. Zack Stephenson's testimony to a House committee on why the Minnesota Legislature should crack down on the use of so-called deep fake technology.
Using artificial intelligence, the technology can manipulate audio and video to create lifelike recreations of people saying or doing things that never actually happened, he said. Deep fakes have been used to create sexually explicit videos of people or fake political content intended to influence elections.
Then, the Coon Rapids Democrat paused for a reveal. His comments up to that point were written by artificial intelligence software ChatGPT with a one-sentence prompt. He wanted to demonstrate "the advanced nature of artificial technology."
It worked.
"Thank you for that unsettling testimony," responded Rep. Mike Freiberg, DFL-Golden Valley, chair of the Minnesota House elections committee.
The proposal represents a first attempt by Minnesota lawmakers to clamp down on the spread of disinformation through the technology, particularly when it is used to influence elections or to distribute fake sexual images of someone without that person's consent.
It's already a crime in Minnesota to publish, sell or disseminate private explicit images and videos without the person's permission. But that revenge porn law was written before much was known about deep fake technology, which has already been used in Minnesota to disseminate realistic — but not real — sexual images of people.
Stephenson's bill would make it a gross misdemeanor to knowingly disseminate sexually explicit deep fake content that clearly identifies a person, without that person's permission.