Meta, TikTok and Snap are participating in an online safety ratings system

Published: February 10, 2026 at 11:29 AM EST
Source: Engadget

Overview

Several major social platforms, including Meta, YouTube, TikTok, and Snap, have announced they will submit to a new external grading process that scores social platforms on how well they protect adolescent mental health. The program is part of the Mental Health Coalition's Safe Online Standards (SOS) initiative, which comprises roughly two dozen standards covering areas such as platform policy, functionality, governance and transparency, and content oversight. The SOS initiative is led by Dr. Dan Reidenberg, Managing Director of the National Council for Suicide Prevention.

Ratings System

The Mental Health Coalition explains that SOS “establishes clear, user‑informed data for how social media, gaming, and digital platforms design products, protect users ages 13–19, and address exposure to suicide and self‑harm content.” Participating companies will voluntarily submit documentation on their policies, tools, and product features, which will be evaluated by an independent panel of global experts.

After evaluation, platforms receive one of three ratings:

  • Use Carefully (highest) – Blue badge indicating compliance. Requirements include accessible reporting tools, clear privacy and safety settings for parents, and platform filters that help reduce exposure to harmful or inappropriate content.
  • Partial Protection – Some safety tools exist but can be hard to find or use.
  • Does Not Meet Standards – Filters and content moderation do not reliably block harmful or unsafe content.

Participating Companies

  • Meta (including Instagram and Facebook)
  • YouTube
  • TikTok
  • Snap
  • Roblox – recently faced accusations regarding child wellbeing.
  • Discord – has strengthened age‑verification processes amid child‑endangerment concerns.

Background on the Mental Health Coalition (MHC)

  • Founded: 2020.
  • Early Partnerships: Listed Facebook (now Meta) as a partner from the organization's inception.
  • 2021: Announced collaboration with “leading mental health experts” and Meta/Instagram to destigmatize mental health during the COVID‑19 pandemic.
  • 2022: Published a case study with “support from Meta” showing that mental‑health content on social media can reduce stigma and increase the likelihood of seeking resources.
  • 2024:
    • Launched the Time Well Spent Challenge (in partnership with Meta) encouraging parents to have meaningful conversations with teens about healthy social‑media use.
    • Introduced Thrive, a program allowing tech companies to share data on material that violates self‑harm or suicide‑content guidelines; Meta is listed as a “creative partner.”

Recent Controversies Involving Meta

  • Project Mercury (2020): Internal research allegedly showing the ill effects of Meta’s products on users’ mental health.
  • Legal Actions: Meta is on trial in California over allegations that its addictive products harm children, the first of several pending lawsuits.
  • Mitigation Efforts: Introduced Instagram teen accounts and other limited measures to address mental-health concerns.
