YouTube is Limiting Some Video Recommendations to Protect Teenagers

YouTube released new product updates a week after a lawsuit accused Meta of contributing to a youth mental health crisis. Starting November 2nd, the app will limit how often it recommends repeated videos on sensitive topics such as body image to teenagers, the company announced Thursday.
These new safeguards are the result of YouTube’s partnership with its Youth and Families Advisory Committee, a group of psychologists, researchers, and experts in child development, children’s media, and digital learning. For years, the committee has advised YouTube on how repeated exposure to certain online content can negatively affect teenagers’ mental health.
Allison Briscoe-Smith, a clinician, researcher, and member of the Youth and Families Advisory Committee, emphasizes that repeated exposure to content promoting unhealthy standards or behaviors can amplify potentially problematic messages, which in turn can shape how some teenagers see themselves. In a press release, Briscoe-Smith explains that guardrails can help teenagers maintain healthy patterns as they naturally compare themselves to others and decide how they want to present themselves in the world.
YouTube collaborated with the advisory committee to determine categories of videos that might be concerning if watched repeatedly. As a result, teenage viewers will no longer receive repeated video recommendations for content that involves comparing physical features, promoting specific body types or fitness levels, or displaying social aggression through non-contact fights and intimidation.
YouTube has introduced additional product updates aimed at promoting teenagers’ well-being, including more frequent and more prominent “take a break” and bedtime reminders. The company has also expanded its crisis resource panels, which connect users searching for terms like “eating disorders” with live support from crisis service partners, into a full-page experience. The panels will now prominently feature third-party crisis hotlines and attempt to redirect related searches by suggesting topics such as “self-compassion” or “grounding exercises.”
Collaboration for Online Safety: YouTube’s Educational Resources and Legal Challenges
YouTube is collaborating with the World Health Organization (WHO) and Common Sense Networks to create educational resources for parents and teenagers. These resources will offer guidance on safe and empathetic online video creation and how to respond to comments, among other topics.
These updates are likely also intended to shield YouTube itself from legal scrutiny, especially after numerous states filed a lawsuit against Meta last week alleging that it contributed to a youth mental health crisis. In the lawsuit, the states claim that Meta knowingly introduced features that promote harmful behaviors and failed to remove content related to disordered eating and bullying.
Meta is not the sole social network facing legal issues this year. In June, a Maryland school district filed a lawsuit against Meta, along with Google, Snap, and ByteDance, the owner of TikTok. The lawsuit claims that these companies have contributed to a “mental health crisis” among students.
The lawsuit alleges, “Over the past decade, the Defendants have pursued a growth-at-all-costs strategy, disregarding the impact of their products on children’s mental and physical health. In a race to capture the ‘valuable but untapped’ market of tween and teen users, each Defendant designed product features that encourage repetitive and uncontrollable use by kids.”
YouTube will start restricting repeated video recommendations to teenagers in the United States from November 2nd, with plans to expand these restrictions to other countries in 2024.