WASHINGTON — Seventeen-year-old Julian Pagulayan started using social media when he was in the fifth grade.
“It was moving and seeing what other people were doing,” Pagulayan told CBS News.
However, under a bipartisan bill introduced this week, children under 13 would be barred from using social media, while those between 13 and 17 would need parental consent to create an account. Social media companies would also be prohibited from using algorithms to recommend content to users under 18.
The Protecting Kids on Social Media Act was co-sponsored by Republican Sen. Tom Cotton of Arkansas and Democratic Sen. Brian Schatz of Hawaii, both parents.
“My kids are young enough that it’s not a concern yet, but I’m very worried about it,” Cotton told CBS News.
Both Cotton and Schatz believe such a bill could be successfully implemented.
“There are a lot of mechanisms for a more robust age verification system,” Cotton said. “The age verification they’re doing now is basically asking a 12-year-old, ‘Are you 18?’ And they click, ‘I’m 18,’ and now they’re online.”
Schatz argues that the bill would give the Federal Trade Commission and individual state attorneys general the power to enforce age limits.
“We’ve made a decision as a society that you have to wait until a certain age to, say, buy alcohol or buy tobacco,” Schatz said. “We’re not so naive that we don’t think teenagers have ever smoked cigarettes or drank beer. But that doesn’t mean you just throw up your hands, there’s no solution.”
The two senators pointed to several studies that suggest a possible link between social media use and mental health, including a survey released in February by the U.S. Centers for Disease Control and Prevention that found 57% of high school girls and 29% of high school boys reported feeling persistently sad. The survey also found that 22% of all high schoolers reported that they had seriously considered suicide.
Pagulayan believes that kids his age should be able to make their own decisions about social media use.
“It’s very relevant now,” Pagulayan said. “And if a parent doesn’t see that, I think they’re blocking that opportunity for their child by not allowing them to.”
Some social media platforms told CBS News they are reviewing the legislation and noted they already have safeguards in place.
Antigone Davis, global head of safety at Meta, the parent company of Facebook and Instagram, told CBS News in a statement that the company has “developed more than 30 tools to support teens and families.”
When teens create an Instagram account, it’s automatically set to private, and teens get “notifications encouraging them to take regular breaks,” according to Davis.
“We don’t allow content that promotes suicide, self-harm or eating disorders, and of the content we remove or take action on, we detect more than 99% of it before it’s reported to us,” Davis said. “We will continue to work closely with experts, policymakers and parents on this important issue.”
A spokesperson for Snapchat’s parent company Snap told CBS News in a statement that it has “built safety and privacy into the architecture of our platform” and has additional protections for users ages 13 to 17.
“We are already working with industry peers, regulators and third-party technology providers on possible solutions and look forward to continuing this productive conversation with the co-sponsors of this legislation,” the spokesperson said.
A TikTok spokesperson pointed to the platform’s privacy and parental controls, including restrictions on features such as direct messaging for younger teens, and barring users under 18 from sending or receiving virtual gifts or livestreaming.
Nicole Killion