
Parental concern over social media use unites unlikely allies in Congress

A young person lying down opens TikTok on a phone. (phBodrova/Shutterstock)

Earlier this year, U.S. Surgeon General Vivek Murthy published a public call to action on social media and youth mental health, urging technology companies, researchers, families and young people to gain a better understanding of the full impact of social media use and create safer, healthier online environments. 

“The most common question parents ask me is: ‘Is social media safe for my kids?’” Murthy said in a press release. “The answer is that we don’t have enough evidence to say it’s safe, and in fact, there is growing evidence that social media use is associated with harm to young people’s mental health. Children are exposed to harmful content on social media, ranging from violent and sexual content to bullying and harassment.”

Murthy cited research showing that adolescents who spend more than three hours daily on social media face double the risk of poor mental health outcomes such as depression and anxiety, and other research showing that most teens already exceed that threshold, with daily use currently averaging around four to five hours.

A nationally representative 2022 Pew Research Center survey found that nearly half of parents reported being “highly worried” that their teens would see explicit content on apps like Snapchat, Instagram and TikTok.

Parents’ broad worry over social media use has brought together unlikely allies in the Senate, which Commerce Committee Chair Sen. Maria Cantwell, D-Wash., has said is on track to pass a package of child online safety measures this year.

Pending child online safety legislation includes the Kids Online Safety Act, which progressive Democratic Sen. Richard Blumenthal of Connecticut and conservative Republican Sen. Marsha Blackburn of Tennessee introduced in May.

If passed, the Kids Online Safety Act would require social media platforms to give minors options to protect their information, disable addictive product features and opt out of algorithmic recommendations. It also calls for tech companies to undergo an annual independent audit assessing the risks to minors, compliance with the legislation and whether the platforms are taking meaningful steps to prevent those harms.

In the press release announcing the bill, Blumenthal laid out a broad indictment against apps like TikTok, Instagram and Snapchat.

“Record levels of hopelessness and despair — a national teen mental health crisis — have been fueled by black box algorithms featuring eating disorders, bullying, suicidal thoughts, and more. Kids and parents want to take back control over their online lives,” he said.

In an email earlier this year, Snapchat said that it does not use algorithms to promote unvetted content to its community, and that teens must be mutual friends before they can start communicating with each other. In 2020, the app launched Here For You, a proactive in-app support system for Snapchatters who may be experiencing a mental health or emotional crisis.

“As a messaging service for real friends, we applaud the Surgeon General’s principled approach to protecting teens from the ills of traditional social media platforms,” a Snapchat spokesperson said. “Snapchat doesn’t encourage perfection, popularity or stranger contact, and we don’t broadly distribute unmoderated public content, which helps prevent the promotion and discovery of potentially harmful material.”

And TikTok, long a target of criticism from U.S. lawmakers because its parent company is based in mainland China, announced earlier this year that every account belonging to a user under 18 would automatically be set to a 60-minute daily screen time limit. When the 60-minute limit is reached, teens are prompted to enter a passcode to continue watching. For users under 13, a parent or guardian must set or enter an existing passcode to enable 30 minutes of additional watch time.

The Electronic Frontier Foundation, a nonprofit that describes itself as defending digital privacy and free speech, said the proposed legislation would lead to censorship and privacy invasions.

“The common critiques of the bill are that it requires surveillance of minors by parents, that it would lead to huge holes in what information and platforms are accessible by young people online, and that it would force all users to upload their IDs to verify their ages,” EFF said in a press release, decrying “one-size-fits-all legislation.”

Researchers and activists have also stepped up to share their findings on social media’s impact on kids. Alex Karydi, director of state and community initiatives for the Education Development Center’s Suicide Prevention Resource Center project, said organizations are using the platforms to raise awareness of and reduce stigma around mental health issues.

She said focus groups with young people reveal that social media may be a protective factor, a place where they can access information and find support.

“I think one of the biggest issues with suicide is that it’s really the depth of loneliness or a lack of belonging, right?” she said. “And so, we really want to do all we can for the person to get a sense of belonging, to feel like they’re part of a community, that they don’t feel so alone. And in many instances, social media is a community. It has people in it that can kind of uplift you or support you or love you or encourage you.”

Even the Surgeon General’s advisory acknowledges the positive effects of social media use for youth, noting that nearly 60% of adolescents report that the platforms help them feel more accepted and 67% say they have people who can support them through tough times. Online social support may be especially important for racial and ethnic minorities and LGBTQ youth, the advisory notes.

The problems associated with social media largely depend on how it is used, said Pamela Rutledge, director of the Media Psychology Research Center at Fielding Graduate University in Santa Barbara, California. She cited research showing kids who were active on social media during the COVID-19 lockdowns had a “much better mental health outcome than the kids who were totally isolated.”

She said much of the evidence regarding teen mental health and social media use is correlational. It’s just as likely that a depressed person will go on social media as it is that social media is making a person depressed, she said.

“I don’t doubt that the researchers who are standing up for this have good intentions,” she said. “But I think it’s the wrong way to go about it if you consider it from a developmental perspective.

“The whole job of a parent raising your kids is to try and give them the skills that they’re going to need to successfully navigate the world,” she added. “So, you want to teach them how to set boundaries, you want to teach them how to make friends, you want to teach them how to do all these things. You don’t do that by restricting the major social portal of their life because you’re worried.”

Jacqueline Nesi, an assistant professor of psychiatry and human behavior at Brown University in Providence, Rhode Island, said that even if the research is not definitive, there is some indication that social media may be causing harm to kids.

“That’s enough to encourage us to really take this safety-first approach, meaning that we are taking reasonable steps to minimize the risk of harm on social media for young people and to protect them when they’re in these spaces,” she said.

Restricting access to apps that have become integral parts of young people’s lives could make these platforms more enticing, said Deborah J. Cohan, a professor of sociology at the University of South Carolina Beaufort. She compared it to telling a teenager not to have sex or drink alcohol.

“A more honest, open and forthright approach is what’s usually necessary with these things, especially with young people,” Cohan said. “They totally benefit from adults who are willing to engage with them on difficult conversations about sensitive and complicated topics. When people take these sorts of rigid black and white rules and impose them, that’s when I think either young people shut down or they just miss it, or then they just start to do it anyway, and it’s riskier.”

***

Megan K. Scott is a Delray Beach, Florida, journalist who covers health, education and social issues.
