Minnesota wants to crack down on tech and social media giants. Tech companies are pushing back.

Proposals gaining bipartisan traction at the State Capitol would clamp down on data collection and regulate privacy settings and user preferences.

The Minnesota Star Tribune
March 22, 2024 at 1:00PM
Sen. Erin Maye Quade, DFL-Apple Valley, is carrying a bill that would limit the amount of data collected on children and implement rules on how it can be used. (Glen Stubbe/The Minnesota Star Tribune)

Minnesota lawmakers want to clamp down on data collection on children and limit how it’s being used by the world’s most powerful tech companies, which are facing growing scrutiny over social media platforms’ potential harm to kids.

They’re also pushing platforms to set privacy protections by default and to stop prioritizing engagement-driven content that users didn’t explicitly ask to see. The proposals are attracting fierce opposition from trade groups representing tech giants such as TikTok, Meta, Snapchat and X, which argue the bills would force them to censor content and run afoul of the First Amendment.

“This isn’t trying to regulate what’s on the internet, we’re trying to protect kids while they’re in the internet,” said Sen. Erin Maye Quade, DFL-Apple Valley, who is carrying several bills related to social media practices this year. “It’s not what’s on the internet that’s harmful, it’s what the tech companies are doing with the data that’s harmful.”

The strategy to target data collection and user settings is a shift from a broader attempt last year to ban use of algorithms on anyone under the age of 18. If successful this year, legislators are already anticipating lawsuits to strike down new state laws. An age-appropriate design law passed in California that’s similar to one of the bills moving in Minnesota is in the midst of ongoing litigation, after a district court judge blocked its implementation.

“An unconstitutional law protects no one, including kids,” Amy Bos, director of state and federal affairs at NetChoice, a trade association representing tech companies, told lawmakers at a hearing earlier this year. “Under the threat of fines from misjudging what may be considered potentially harmful to children, many platforms will certainly default to taking down all content on entire subjects, which is likely to remove beneficial, constitutionally protected material along with anything genuinely harmful.”

So far, Minnesota lawmakers haven’t been deterred by the threat of litigation. They’re building on work started last year to criminalize certain deepfake images, video and audio created by artificial intelligence. This year they’re proposing to increase those penalties, as well as extend the state’s child labor laws into the digital space by blocking social media users from making money off videos featuring children under the age of 14.

But tech trade groups have focused their lobbying power on the broader regulations being considered by legislators. Rep. Zack Stephenson’s proposal, called the “Prohibiting Social Media Manipulation Act,” is the first of its kind in the nation. He said it would require certain privacy features by default and force tech companies to prioritize the user’s preferences about what they want to see on their feeds above the need to keep them engaged.

User engagement, driven by the algorithms that power social media feeds, has become an important source of revenue for tech companies.

“Right now they have every incentive just to prioritize stuff you will look at and keep your eyes engaged as long as possible,” Stephenson, DFL-Coon Rapids, said. “We want to put the user more in control.”

It’s been difficult, Stephenson said, for a part-time state legislator with limited staff to take on some of the biggest and richest companies in the world, but nothing meaningful has changed at the federal level.

“To the extent they’re even doing anything about social media, it’s trying to ban TikTok,” he said. “That is such a distraction to the real issue, in my mind, which is — what is this doing to all of our mental health, particularly kids.”

Multiple studies have raised the alarm about the link between social media and increases in depression and anxiety, especially among children. After bills stalled at the Capitol last year, lawmakers asked the attorney general’s office to study the impact of social media platforms on Minnesotans.

The report, released in February, found there’s “clear evidence that the specific choices of many technology platforms have caused harmful experiences for a substantial number of individuals, especially youth” and makes recommendations that are incorporated into Stephenson’s bill.

The Minnesota Age-Appropriate Design Code Act, which would limit tech companies’ data collection on children, comes with a cost to the state, which could jeopardize its chances of passage in a non-budget year in St. Paul. It’s also facing criticism from trade groups representing tech companies, which say the requirement that companies act in the “best interests of children” will lead to subjective and diverging interpretations.

“Different companies, even parents in one household, will have very different interpretations of what is and isn’t in the ‘best interests’ of children, or attorneys general in different states who enforce this law,” the trade group TechNet submitted in a letter in opposition to the bill.

Maye Quade, who is sponsoring the measure, said she knows it’s “hard to draw lines around the state in regards to the internet.” But she said tech companies should take note of the bipartisan support in Minnesota to do something about the lack of regulations around these companies.

“There are physical world protections for things that kids use, we just don’t have those for the internet,” she said. “As lawmakers, there’s a very urgent imperative to take action.”

about the writer

Briana Bierschbach

Reporter

Briana Bierschbach is a politics and government reporter for the Star Tribune.