WATCH: How social media platforms impact kid and teen mental health – PBS NewsHour

As more reporting emerges over how platforms like Facebook, TikTok, Instagram and Snapchat handle young users, and how the content they consume affects their mental health, focus has also turned to the role parents and caregivers play in keeping their children safe while engaging with the apps.

WATCH: Kids’ mental health, safety in the spotlight as social media execs face Congress

What responsibility are the companies taking to remedy these concerns and what do parents and caregivers need to know about how the use of these platforms may impact young children and teens?

Washington Post technology reporter Heather Kelly and Hartford HealthCare child and adolescent psychiatrist Dr. Paul Weigle joined PBS NewsHour’s Nicole Ellis on Friday, Oct. 29 to answer viewer questions on what this means for our understanding of social media and its effects on kids.

Watch the conversation in the player above.

Below are highlights from the live discussion.

The role of parental controls

Social media companies have pointed to parental controls as a preventative measure. But experts explain that they are not an infallible solution.
One key consideration is age, Weigle said.

“Parental controls are especially important for younger kids, [rather than giving] them unrestricted access to the internet. It doesn’t mean they’re not useful. They just don’t replace other traditional forms of supervision,” Weigle said.

But excessive vigilance and use of parental controls can also be counterproductive, Kelly said.

“Kids might be exploring their own sexuality or asking questions they don’t want to ask you. Even when you do use these tools you have to step back and let them breathe a little,” Kelly said.

Why social media rewards ‘more extreme ideas’

Social media sites, including Facebook, use algorithms to select which posts are seen by which users. These algorithms have come under scrutiny from experts and lawmakers over how they affect behavior and public discourse.

“There’s always going to be people posting bad things and good things and the algorithms are literally deciding which of those things you view,” Kelly said.

Kelly said social media companies such as Facebook have tried to demote some “problematic content” in their algorithms. However, users are still rewarded for driving engagement, even when the content is extreme.

“There’s also an incentive to get views, to get likes and comments and engagement and be a creator, and I think …”
