'Generation Swipe': What a new era of social media regulation means for young people
If your last social media scroll was within the past 24 hours, you're among the more than half of Americans who say they use social media on a regular basis, according to a study released in January by the Pew Research Center.
Whatever your social media of choice may be, a laundry list of platforms is stitched into the fabric of how Americans work, date, think and live. And young people, whose brains are not yet fully developed, are no exception. Ninety-five percent of teens have social media accounts, and one in three report using social media "almost constantly," the Pew Research Center further reports.
Yet despite 13 being the minimum age to sign up for most social media platforms, a 2023 advisory issued by U.S. Surgeon General Vivek Murthy found that 40% of children ages 8 through 12 are active social media users.
The report cited data that when teens spend more than three hours per day on social media, they become more likely to develop sleep disorders, anxiety and depression.
"A lot of my social interaction happens on social media … like, my whole life revolves around it," Juniper Galvani, a 17-year-old student from Vermont advocating for the Vermont Kids Code, told ABC News.
The Vermont Kids Code is described on its website as "a consumer protection bill that would require online products reasonably likely to be accessed by children under 18 to be age-appropriate, institute privacy by design and default, and be designed with kids' best interests in mind."
Vermont is one of many states working to pass similar legislation, with the stated goal of making social media safer for kids by holding tech companies responsible for age verification and privacy protection, and for removing videos that could harm minors.
A bill that mandates similar rules for big tech, known as the Maryland Kids Code, was signed into law by Gov. Wes Moore earlier this month. It requires social media sites to review their data privacy policies, and if a child is deemed likely to use a site or platform, the company must implement safeguards for parents and stricter privacy protections for children by October 2024.
During the signing ceremony, Todd and Mia Minor, who lost their son to a social media challenge, stood next to the Maryland governor, witnessing a moment they'd spent the last five years fighting for: legislation designed to protect kids online.
It was on March 7, 2019, when Todd and Mia's son, Matthew Minor, asked to play on his computer after dinner. "You have one hour," his father, Todd, told him, before he had to get back to his homework.
That hour would change Todd and Mia's lives forever. Matthew's older brother found him unresponsive.
"He yelled down … come upstairs quick," Todd Minor told ABC News' Elizabeth Schulze. "We saw Matthew had something around his neck. We didn't know what was going on with that, but we got that from around his neck."
"All I could think was while I was doing the CPR was, I was asking God to take me instead of my child," the Maryland father and U.S. Army veteran added.
"He had the biggest dimples. He always smiled," said Mia Minor. "We haven't done anything different to his room. I mean, it's exactly pretty much the same."
Matthew was known at school for his charisma and energy, and for standing up for children who were bullied, his parents say. He was also an ambassador at his school, welcoming new students and showing them how and where to find their classes.
And after looking through Matthew's devices, police detectives told the Minor family about certain challenges circulating on social media.
Children and teens online were calling it "the choke-out challenge," or "choking game" – a dangerous viral trend in which people on social media would intentionally try to choke themselves to enter a brief state of euphoria.
"It was very cartoonish, fun and playful. And so from a kid's standpoint, it was just, 'Why not?'" Todd Minor said of his reaction to searching these videos after Matthew passed away. "They have their electronics in their pockets. They're getting hit with social media on a regular: 'Try it, try it. You must try it. You must try it.'"
It was at Matthew's memorial that many of his classmates revealed to Todd and Mia that they, too, had tried the challenge.
"That's when we were starting to think something needed to be done," said Todd.
Turning their pain into purpose, Todd and Mia launched the Matthew E. Minor Awareness Foundation to highlight the risks of viral challenges on apps and websites like TikTok, YouTube, and Instagram.
"You know, because Matthew can't speak for himself and the other kids that are out there that can't speak for themselves and, and the kids that have reported things in the big tech companies haven't done anything about it," Todd said of their reasons for starting the foundation.
Todd and Mia Minor have now made it their life's mission to share Matthew's story whenever and wherever they can, from churches to classrooms to Capitol Hill. In January, they sat just rows behind the CEOs of five of the biggest tech companies during a Senate hearing about child safety online.
Midway through the hearing, Meta Chief Executive Officer Mark Zuckerberg – whose company owns the online social platforms Facebook, Instagram, WhatsApp and Threads – stood, turned around and apologized directly to the Minors and other parents who had lost their children because of social media.
"I'm sorry for everything that you've gone through, it's terrible. No one should have to go through the things that your families have suffered," Zuckerberg said.
Todd Minor told ABC News he was "happy to see" the tech giant CEO "say something."
"I started looking at it through Matthew's eyes. And I started feeling sorry for Mark Zuckerberg, to be honest, because I could see things churning in his head," Minor added.
Snapchat, Meta, YouTube and TikTok sent statements to ABC News, each highlighting features added to its platforms that it said make the user experience safer for kids.
"If you had a car seat and 50 kids broke their arm because of a manufacturing defect in that car seat, we would recall that car seat," Frances Haugen told ABC News. "And yet, every year, thousands of kids in the United States alone are seriously harmed by social media, and yet we don't that."
Haugen had been working for Facebook for about two years when she left the company in 2021 and subsequently disclosed tens of thousands of internal documents that, she said, showed Facebook and Instagram ignoring risks to young users and prioritizing profits instead.
"Facebook had choices to make their products safer for kids. And yet they repeatedly chose not to. All the while lying to the public," Haugen said.
Her Capitol Hill testimony in October 2021 prompted swift pushback from Zuckerberg, who posted on Facebook: "I've spent a lot of time reflecting on the kinds of experiences I want my kids and others to have online, and it's very important to me that everything we build is safe and good for kids."
After Haugen leaked the documents, and in response to the surgeon general's advisory issued that May, a bipartisan group of 33 state attorneys general filed a major lawsuit against Meta in October 2023. The suit cited an internal company email estimating that "the lifetime value of a 13-year-old teen is roughly $270" – a measure of how much each teen was worth in revenue – and accused the tech giant of intentionally designing algorithms and notification features that encourage compulsive use.
In response to the lawsuit, Meta said, "We share the attorneys general's commitment to providing teens with safe, positive experiences online, and have already introduced over 50 tools to support teens and their families."
Dr. Aliza Pressman, director of Mount Sinai's Parenting Center, emphasized to ABC News the timeline of adolescent brain development, noting that teens "are not going to have all the skills that are really beneficial to being able to work well with social media."
Meanwhile, states across the country are working to pass stricter legislation intended to foster social media transparency, accountability and safety for young users.
Florida Gov. Ron DeSantis signed one of the strictest social media bills into law in March, banning anyone under the age of 14 from opening or having a social media account. Teens who are 14 and 15 would need parental consent to use social media. Tech companies could face fines for not removing underage users, and parents could sue them for up to $10,000.
"We see adults who can't regulate themselves on social media. But we're expecting, you know, children 13, 14, 15, even younger to be able to regulate," Florida Democratic Rep. Michele Rayner, who spearheaded the legislation, told ABC News. "We are putting the onus back on the platforms to comply with Florida law."
Critics have said that the Florida law, which is set to take effect in January 2025, would put Florida children and teenagers who have built successful businesses and communities on social media at a disadvantage.
"There's two sides to this. I know there's kids who just like using, you know, scroll for hours, like all day. But then there's other kids like me who want to, like, promote themselves," said Marley Desinord, a 13-year-old DJ for the Miami Heat. "So it's like if it's kind of like a gray area there."
Desinord started posting mixes on social media at 7 years old. That success made her the youngest DJ in the NBA, something she says "wouldn't be possible without social media."