Experts sound the alarm over social media as kids grapple with declining mental health

Scientific data, health experts and lawsuits are all sounding the alarm louder than ever about the dangers social media poses to kids.

Many are putting the blame squarely on social media companies for fueling mental health disorders in a generation of children. Teenagers today have endless possibilities right in the palms of their hands.

"My friends are snapping someone or scrolling through TikTok," said Sydney Willis.

In that world, the perception of perfection is now constant and relentless. It’s hard for teens not to fall into the trap of social comparison.

"The ‘that’ girl—the ‘it’ girl—that perfect person that wakes up all pretty," Willis said.

"I also saw my friends hanging out without me, so that kind of controlled my mind a little bit," Zoe Townsend said.

Both girls say their top concern right now is cyberbullying, even when the hurt may be unintentional.

"I think there [are] so many versions of cyberbullying," Willis said.

"It brings down self-confidence, which brings anxiety up," Townsend said.

You would think that feeling bad would keep people away. But, according to Harvard’s School of Public Health, negative emotions are the strongest emotions, and they keep us on the screen longer.

Experts at Harvard say platforms like Instagram are exploiting those emotions, using algorithms to keep users in a loop.

For example, a friend posts a picture of their pet. You would probably like it and move on.

But say you see a post that upsets you—you leave an angry emoji, maybe a comment, and even share the post with your friends to show them how upsetting it is. Just like that, you have tripled your engagement and spent much more time on the platform because you were upset.

The more time you spend on the platforms, the larger the profit for those social media companies.

Harvard experts say they are in the business of selling your attention, and they monetize that time with ads.

That kind of manipulation of minds and emotions for money is so concerning that Seattle Public Schools and the Kent School District are suing Facebook, YouTube, Snapchat, Instagram and TikTok.

PREVIOUS COVERAGE: Seattle schools sue tech giants over social media harm

The lawsuits accuse the social media companies of substantially contributing to the rise in mental health disorders among kids.

When it comes to TikTok, SPS's lawsuit accuses the company of using powerful algorithms that funnel minors toward harmful content. The district says kids are being exposed to "endless spools of content about sex and drugs."

The filing also highlights a Wall Street Journal investigation in which journalists created bot accounts, some posing as users as young as 13 years old, to see what the algorithm fed them.

The investigation found that these bots were shown thousands of inappropriate and harmful videos. In one extreme case, TikTok allowed content ranging from descriptions of how to tie knots for sex to fantasies of rape to reach the bots.

Seattle Public Schools says that from 2009 to 2019, there was an average 30% increase in kids feeling sad and hopeless.

The district also says more kids are considering and attempting suicide.

There are more behavioral issues to handle on school grounds, creating a desperate need for more counseling and student support. The district thinks social media companies should foot the bill.

"I’m seeing a pretty big increase in both anxiety and depression, especially in teenagers," Megan Osterman said.

Osterman is a licensed marriage and family therapist with Anchor Light Therapy Collective in Seattle.

She says social media is one of the top issues that come up with families, and she believes kids under 13 are usually too young for it.

"I feel like social media companies do have some responsibility, but ultimately, kids are going to be exposed to dangers all over the place," Osterman said.

Likewise, Willis' and Townsend's moms are not sure exactly how a lawsuit would help, but they are well aware of the dangers of social media.

"The warnings are super scary, you know. I mean the data, that’s why we are trying to navigate, we have [this] horrible situation we want to keep our kids away from," said Lisa Willis, Sydney’s mom.

That’s why in their homes, they have rules for their middle schoolers.

Sydney is not allowed to have TikTok or Instagram. She gets 45 minutes a day on Snapchat, and anything more needs to be approved by her parents.

For Zoe, it’s no TikTok or Snapchat.

"She knows the rules; if we are consistently enforcing them, it’s easier," said Tara Townsend, Zoe’s mom.

One big rule in both houses—their phones stay outside their rooms when it’s time for bed.

"Mom and dad can go on, we can look at any[thing] and everything. We have a family code," said Tara.

"Communication with me is really important," said Lisa.

Both recognize the pros and cons of social media, and their approach is an example of what experts say all parents should be doing.

"Really foster open communications; that is probably the most important thing you can do," Osterman said.

RELATED: Seattle Public Schools' lawsuits over social media harm face tough legal road


A Snapchat spokesperson released this statement:

"Nothing is more important to us than the wellbeing of our community. At Snapchat, we curate content from known creators and publishers and use human moderation to review user generated content before it can reach a large audience, which greatly reduces the spread and discovery of harmful content. We also work closely with leading mental health organizations to provide in-app tools for Snapchatters and resources to help support both themselves and their friends. We are constantly evaluating how we continue to make our platform safer, including through new education, features and protections."

Google, which owns YouTube, released this statement to FOX 13 News about the lawsuits:

"We have invested heavily in creating safe experiences for children across our platforms and have introduced strong protections and dedicated features to prioritize their well being. For example, through Family Link, we provide parents with the ability to set reminders, limit screen time and block specific types of content on supervised devices."

TikTok says it cannot comment on litigation, but says it prioritizes the safety and wellbeing of teens. The company highlighted its age-restriction features and parental controls.

The platform says it has also put limits in place on direct messaging and live streams.

FOX 13 did not hear back from Meta, which owns Instagram and Facebook.
