‘Vermont Kids Code’ would ban big tech from collecting children’s data in harmful ways

The proposed Senate bill aims to protect Vermont kids from big tech companies. Photo courtesy Centers for Disease Control and Prevention

by Brooke Burns, Community News Service

Documents in Vermont Attorney General Charity Clark’s lawsuit against Meta reveal that even with around 80% of Vermont teens using Instagram in 2020, one of the highest rates in the country, the company was still working on ways to get those teens to spend more time per day on the app.

It’s amid those revelations that a new Senate bill aims to protect Vermont’s kids from predatory data collection and online content designed to take advantage of their vulnerabilities.

Colloquially called the Vermont Kids Code, the bill, S.289, would ban big tech companies from collecting kids’ data, or designing their products, in ways that would create a reasonable risk of material physical harm, severe emotional distress, financial harm or a highly offensive intrusion on the expectation of privacy, or in any way that would discriminate based on a protected class.

The ban would apply to people or companies that collect people’s data, or have it collected; operate in Vermont; and meet at least one of three criteria: having more than $25 million in gross annual revenue; annually buying, receiving, selling or sharing the data of more than 50,000 households, devices or people; or deriving more than 50% of annual revenues from user data sales.

But companies like Meta and TikTok would be responsible for assessing their data protection policies under the bill — and for determining whether they are in compliance with the law.

Lead sponsor Sen. Kesha Ram Hinsdale, D-Chittenden Southeast, said in an interview that the bill is modeled after legislation in the European Union and United Kingdom that has proven successful. 

“Much of the bill is modeled after the policy in the United Kingdom, and the reason for that is not because we are assuming that these companies are inherently out to get our kids, but that they are already meeting a lot of these standards in other countries, in the EU, in the UK,” Ram Hinsdale said. “So as much as they fight these design and code principles here, they’re meeting them elsewhere, so they know exactly what we’re asking for.”

Ram Hinsdale also said it would not be difficult for the state’s attorney general to recognize noncompliance, though there may be delays in assessing the safety of new features companies add to their products.

“It’s become pretty obvious what it looks like to be dealing with a healthy and safe platform for a child,” she said. “If they come up with a new feature, it might take researchers and scientists time to deem that it’s not in the best interest of a child. But once that happens there are plenty of pediatricians and pediatric researchers around the country who have highlighted some of the most harmful features and techniques used by these platforms.”

Ram Hinsdale introduced the bill in a Feb. 15 meeting in the Senate Committee on Economic Development, Housing and General Affairs. 

“We were told this genie is out of the bottle, there’s nothing you can do. We can’t accept that answer anymore,” said Ram Hinsdale in the meeting. “We have to understand what we’re up against, and we have to protect our kids, especially, from that and protect their privacy and protect their mental health.”

Ram Hinsdale emphasized the language in the bill is tailored to go after the massive companies operating out of Silicon Valley that affect children in Vermont, not local tech businesses. 

“There’s some new draft language that really helps put them at ease,” Ram Hinsdale said of local firms.

Along with the required assessments, covered entities would need to include a plan to ensure that existing and future products are designed to prioritize the best interests of children.

Pediatrician Heidi Schumacher, an assistant professor at the University of Vermont’s medical school, testified in the Feb. 15 meeting in support of the bill, saying she spoke on behalf of her profession.

“There is no doubt that social media can play a really positive role in our lives, expanding our social networks and allowing marginalized youth in particular to thrive,” said Schumacher. “But in many cases, young people themselves believe that they are spending too much time on social media and find themselves unable to unplug because of the features intentionally designed to keep them plugged.” 

“While spending so much time online,” she said, “they are being regularly exposed to dangerous content and unhealthy habits that pose a direct risk to their health and wellbeing.”

The bill follows similar legislation enacted in California in 2022. That state’s Age Appropriate Design Code was blocked in September 2023 by a federal court judge following a lawsuit brought by NetChoice, a tech trade group. The judge agreed with NetChoice that the California law violated the First Amendment by targeting specific speakers: for-profit entities. The ruling has been controversial with children’s advocacy groups. 

“These companies aren’t making principled or nuanced First Amendment arguments,” said Meetali Jain, founder of the Tech Justice Law Project, a Washington, D.C., group advocating for policy frameworks fit for the digital age, in the Feb. 15 meeting. “They’re making crude arguments to avoid accountability. And at the center of their strategy is an attempt to manipulate the First Amendment by saying that all corporate business activity, including collecting data on kids or how they design their products, is First Amendment speech that cannot be regulated or reined in.”

But the Vermont bill could prove more legally airtight, said Marisa Shea, senior policy manager at 5Rights, a London-based nonprofit focused on kids’ digital safety, at the Feb. 15 meeting.

“This bill contains a narrowly tailored definition for best interest that does not mandate the platform to sort of go above and beyond to provide kids with the best online experience possible, because we agree that that would be too subjective,” said Shea. “Rather, it has been carefully defined to guard against well-established harms recognized for decades by our legal system.”

The Community News Service is a program in which students work with professional editors to provide content for local news outlets at no cost.
