New York was the first state to pass a law requiring the use of seat belts, in 1984, against ferocious opposition. Several other states followed, and now nearly 90% of all drivers use seat belts, saving tens of thousands of lives a year.
Today, we face another crisis, and the death count is rising. Children are developing eating disorders after being pushed addictive body-image content, taking their own lives after being pushed suicidal content, and dying in choking or subway-surfing challenges deliberately fed to them by product design.
In every case, children are hurt by dangerous product design: they are profiled by algorithms that use highly sensitive personal data to target them.
A simple legal fix — as straightforward as requiring a seat belt — could save lives. New York state Sen. Andrew Gounardes and Assemblywoman Nily Rozic have introduced a bill that would turn this one design feature off by default, to be switched on only with affirmative parental consent. Without that dangerous feature, social media feeds would look like they did in 2010 — populated by content chosen by the user.
A few years ago, a healthy teenage boy went on TikTok. His search history shows he looked up "Batman," "gym motivation," and "bench press tips." He did not search for suicide content. Nevertheless, TikTok's recommendation engine targeted him with more than 1,000 pro-suicide videos. He threw himself in front of a train in November 2022. If the proposed law had been in place two years ago, he would have turned 18 this spring.
Or think about the 11-year-old girl who opened an Instagram account and was immediately profiled and directed to anorexia and self-harm content. Her eating disorder became so severe that she has lifelong impairment, needs ongoing medical attention, and has been in and out of inpatient and outpatient services. If the proposed law had been in place when she signed up for Instagram, she might have been spared a lifelong struggle.
The lawyer representing the families of these children, Laura Marquez-Garrett, is one of six lawyers with the Social Media Victims Law Center, a new law firm that started just two and a half years ago and has been inundated with horror stories. It now represents more than 2,000 families whose children have been sexually assaulted, bullied, and connected with drug users through targeted algorithms.
When I met Marquez-Garrett recently at a conference, she showed me the TikTok feed of the boy who took his own life. She monitors the feed, with his parents' permission, to show the world what children actually experience online. At 9:13 p.m. on the night we met, two years and two months after his death, TikTok was still feeding his account targeted content about suicide, reel after reel of short clips. "What if the only way not to feel bad," one woman asked, "is not to feel anything at all?"
Recommendation systems using personal data may not sound like big metal boxes on wheels screaming down a highway, but they can be just as deadly.
I recently went on TikTok, entered my birthdate as 1/14/10, shared no other information about myself, and conducted three searches: "Caitlin Clark," "How to Make Friends," and "How to Be Popular." Based on the data profile TikTok built of my fictional 14-year-old self, it sent me reel upon reel of teenage girls with enormous bottoms; a reel of a young woman gulping water and saying she had picked up a homeless man and gotten him to take her to a bar; and a reel of a girl complaining that her bottom was too flat.
If I were a 14-year-old girl on TikTok without algorithmic targeting, I would be in control, following my own passions instead of being bombarded by content an algorithm picked for me.
The industry's opposition to the proposed laws is, to put it mildly, at tsunami levels. Companies want to protect their profits, so they have plenty of excuses for why they need invasive recommendation systems. Their big argument lately is that removing recommendation engines would "break the internet." For lawmakers who don't feel confident in the technological details, that can sound daunting. But the truth is, these laws would un-break the internet.
It is time for New York to lead again. Turning off these recommendation engines by default is the seat belt children need right now. We owe it to young people — and to the families who have already lost children and watch with horror as the same technology finds new victims — to protect them.
Teachout is a professor at Fordham Law School and the Sidley Austin-Robert D. McLean ’70 Visiting Professor of Law at Yale Law School.