Opinion editor’s note: Editorials represent the opinions of the Star Tribune Editorial Board, which operates independently from the newsroom.
‘Deepfakes’ require a real federal response
Klobuchar’s bipartisan bills can augment states’ efforts.
•••
The stakes of November’s election are real.
Campaign communications should be, too.
Yet in this era of ever-accelerating artificial intelligence, the manipulation of audio and video, and most profoundly of voters themselves, poses an existential threat to our democracy.
And it’s not enough to depend on self-regulation from media entities. That much was clear when, on July 26, X (Twitter) owner Elon Musk reposted a manipulated version of a campaign ad from Vice President Kamala Harris in which she appears to make damaging statements about outgoing President Joe Biden (including a reference to “his senility”) as well as about her own merits as his potential replacement (including that she is a “diversity hire”).
While the original altered video contained a disclaimer that it was a parody, the version Musk reposted to his 192 million followers did not. Instead, in what experts say is a clear violation of X’s (and thus Musk’s) policies, he wrote “This is amazing” and added laughing emojis on his repost. It wasn’t until days later that Musk — who has endorsed Harris’ opponent, former President Donald Trump — amended his repost to indicate it was a parody.
Amazing, indeed. But not in the way Musk means. And it’s no laughing matter.
“This video illustrates the kind of problem that’s going to become commonplace unless we put protective rules in place,” Robert Weissman, co-president at Public Citizen, a nonpartisan consumer advocacy organization, told an editorial writer. Noting the technical quality of the altered video, Weissman said that “we can expect convincing deepfakes that will mislead, deceive and defraud voters going forward and potentially create electoral chaos.”
Twenty states, including Minnesota, have codified rules about deepfakes, said Weissman. But, he said, Washington needs to get more aggressive. “The states are acting. But we really need federal action.”
Minnesota Sen. Amy Klobuchar, a Democrat, agrees. “Democracy functions on information so voters can make a decision,” but the basis for that decision “gets upended and turned on its head” through deepfakes and other manipulation, she told an editorial writer.
While the Federal Election Commission, the Federal Communications Commission and the uneven protocols of social-media self-regulation offer some help, clear and codified laws are needed, Klobuchar said.
Accordingly, she has worked across the aisle to enlist Republican support for two bills. The “AI Transparency in Elections Act,” co-sponsored by Sen. Lisa Murkowski, R-Alaska, would “require disclaimers on ads with images, audio or video that are substantially generated by artificial intelligence.” The bill would also require the FEC to address violations quickly.
The second bill, co-sponsored by her Democratic colleagues Chris Coons of Delaware and Michael Bennet of Colorado, plus Republican Sens. Josh Hawley of Missouri and Susan Collins of Maine, is the “Protect Elections from Deceptive AI Act,” with the intent to “ban the use of artificial intelligence to generate materially deceptive content falsely depicting federal candidates in political ads to influence federal elections.”
Last week, Klobuchar took to the Senate floor to try to pass these bills by unanimous consent. Despite the dangers to both sides of the aisle, to say nothing of democracy and the DNA of our country, the bills did not pass. She’ll press on when Congress reconvenes after the August recess.
That’s good news for every American. As technological transformations and the malevolence of homegrown and offshore actors evolve, laws to protect democracy need to evolve as well.