Measuring progress to drive impact

With over a billion children using our members’ platforms, every safety measure matters. While prevention is difficult to quantify, we track the actions that reduce risk and strengthen protections.

Industry’s child safety metrics

These metrics and insights are self-reported and collected through our annual member survey, then aggregated to provide transparency into how industry is working to combat online child sexual exploitation and abuse (OCSEA).

Latest update: 2024 Annual Report.

Advancing member mileposts

Together with our members, we set and track annual mileposts — voluntary advancements across the categories of:

  • Reporting: Provide actionable reports to the appropriate authorities.
  • Transparency: Publish regular transparency reports.
  • Tooling & Investigations: Strengthen tools and enforcement processes.
  • Wellness: Develop resources to support trust and safety teams.
  • Prevention: Implement safeguards like age assurance and deterrence.
  • Detection: Deploy technologies to detect harmful content and activity.
  • User Reporting: Enable users and third parties to report OCSEA.

Members advancing at least one milepost in 2024

In 2024, 44 of 45 members advanced at least one milepost, for a total of 122 mileposts advanced across the membership.

Member alignment with our Trust Framework

The Tech Coalition’s Trust Framework sets a clear, consistent standard for how members report on their child safety efforts.

We track how members align with this framework each year, promoting greater transparency, accountability, and shared progress in tackling online child sexual exploitation and abuse.

Members aligned to the Trust Framework in 2024

For the past three years, over half of Tech Coalition member companies have at least partially aligned their transparency reporting with the Trust Framework.

Reporting to relevant authorities

When member companies find evidence of child exploitation, they report this activity to the relevant authorities, either as required by law or voluntarily.

Members can submit reports manually through web forms or integrate with a web service application programming interface (API) for a more streamlined reporting experience.

Types of reporting used by members in 2024

44 of 45 members submitted manual reports, API reports, or both in 2024.
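
To make the distinction between the two channels concrete, the sketch below shows what an automated, API-based submission could look like. It is a minimal illustration only: the endpoint, token, and payload fields are hypothetical placeholders, and real reporting services such as NCMEC's CyberTipline define their own schemas and authentication.

```python
# Hypothetical sketch of an API-based report submission (Python standard library).
# The endpoint URL, token, and payload fields below are illustrative placeholders,
# not the schema of any real reporting service.
import json
import urllib.request

REPORT_ENDPOINT = "https://reports.example-authority.org/v1/reports"  # placeholder
API_TOKEN = "REPLACE_WITH_ISSUED_TOKEN"                                # placeholder

def submit_report(incident: dict) -> int:
    """POST a structured incident report and return the HTTP status code."""
    request = urllib.request.Request(
        REPORT_ENDPOINT,
        data=json.dumps(incident).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

if __name__ == "__main__":
    status = submit_report({
        "report_type": "suspected_ocsea",
        "platform": "example-platform",
        "content_reference": "internal-content-id-123",
        "detected_at": "2024-06-01T12:00:00Z",
    })
    print("Submission status:", status)
```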

Use of hash-based image detection technologies

Hashing algorithms create unique digital fingerprints of confirmed child sexual abuse material (CSAM), which can be shared so other companies can detect and act on the same content.

Many members use multiple hashing technologies in combination to improve detection and accuracy, often supplementing industry-standard tools with proprietary in-house solutions.

Members' use of image hashing technologies in 2024

For the third year in a row, more Tech Coalition members adopted hashing technologies to detect CSAM images on their platforms.
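
As a minimal illustration of the hash-matching step described above, the sketch below computes a file digest and checks it against a locally cached list of known hashes. It assumes a simple newline-delimited list of hex digests and uses a cryptographic hash (MD5) for exact matching; perceptual technologies such as PhotoDNA and PDQ, which tolerate resizing and re-encoding, work differently and are not reproduced here.

```python
# Minimal sketch of exact-match hash detection against a shared hash list.
# The hash list format (one lowercase hex digest per line) is an assumption
# made for illustration; real repositories define their own distribution formats.
import hashlib
from pathlib import Path

def load_known_hashes(path: str) -> set[str]:
    """Load a newline-delimited list of known MD5 digests."""
    return {line.strip().lower()
            for line in Path(path).read_text().splitlines()
            if line.strip()}

def md5_digest(file_path: str) -> str:
    """Compute the MD5 digest of a file, reading in chunks to bound memory use."""
    digest = hashlib.md5()
    with open(file_path, "rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(file_path: str, known_hashes: set[str]) -> bool:
    """True if the file's digest appears in the shared hash list."""
    return md5_digest(file_path) in known_hashes
```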

Use of hash-based video detection technologies

Video hashing creates unique digital fingerprints for known CSAM videos.

In 2024, Thorn’s Scene-Sensitive Video Hashing (SSVH) was the most adopted perceptual tool among Tech Coalition members, with YouTube’s CSAI Match valued for its free, API-based access. MD5 remained the leading cryptographic hash. Many members also operate proprietary systems alongside these tools.

Members' use of video hashing technologies in 2024

23 members used at least one video hashing technology in 2024, an increase from 21 in 2023.
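
The sketch below extends the exact-match idea to video by sampling decoded frames and comparing their digests against a known set. It is a toy simplification for intuition only; SSVH and CSAI Match rely on robust perceptual representations rather than cryptographic digests of raw frames, and the sampling interval and match threshold here are arbitrary assumptions.

```python
# Toy illustration of frame-level video matching, assuming OpenCV is installed.
# Real tools use robust perceptual hashes; hashing raw frame bytes as done here
# only matches identical decodes and is shown purely for intuition.
import hashlib
import cv2

def frame_digests(video_path: str, every_n_frames: int = 30) -> list[str]:
    """Sample every Nth decoded frame and return an MD5 digest per sampled frame."""
    capture = cv2.VideoCapture(video_path)
    digests, index = [], 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % every_n_frames == 0:
            digests.append(hashlib.md5(frame.tobytes()).hexdigest())
        index += 1
    capture.release()
    return digests

def matches_known_scenes(video_path: str, known: set[str], threshold: int = 3) -> bool:
    """Flag the video if enough sampled frames match digests in a known set."""
    hits = sum(1 for digest in frame_digests(video_path) if digest in known)
    return hits >= threshold
```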

Use of hash & keyword repositories

Tech Coalition members rely on shared repositories of known CSAM hashes, exploitative content, and keywords to improve cross-platform detection of OCSEA.

Typically maintained by trusted NGOs, hotlines, and civil society partners, these resources help companies quickly identify and remove harmful material.

Members' use of hash and keyword repositories in 2024

In 2024, ten members adopted NCMEC’s new Take It Down repository, a free tool to help individuals remove or block explicit images of themselves shared online.

Use of classifiers to detect OCSEA

Classifiers are machine learning tools that analyze patterns in images, videos, or text to help detect CSAM and signs of exploitation, such as grooming or sextortion.

In 2024, members advanced image, video, and text classifiers with tools like Google’s Content Safety API, Thorn’s Safer Predict, and proprietary systems to improve detection. Many also introduced or enhanced text classifiers to spot grooming and CSAM-related content in conversations, searches, and user reports.

Members' use of classifiers in 2024

Building on the increase seen in 2023, members further expanded their use of classifier technologies to detect CSAM in 2024.
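
As a rough sketch of where a classifier sits in a moderation pipeline, the snippet below shows a score-and-route pattern in which high-risk content is escalated to trained human reviewers. The grooming_risk_score function and the 0.8 threshold are hypothetical placeholders; tools such as Google's Content Safety API and Thorn's Safer Predict expose their own interfaces and calibration.

```python
# Sketch of routing a text classifier's risk score to human review.
# grooming_risk_score stands in for a vendor or in-house model; its interface
# here is a hypothetical placeholder, not any real SDK.
from dataclasses import dataclass

REVIEW_THRESHOLD = 0.8  # assumed operating point; tuned per platform in practice

@dataclass
class Decision:
    score: float
    send_to_human_review: bool

def grooming_risk_score(message: str) -> float:
    """Placeholder for a trained text classifier returning a risk score in [0, 1]."""
    raise NotImplementedError("Integrate a vendor or in-house model here.")

def triage(message: str) -> Decision:
    """Score a message and route high-risk content to trained reviewers."""
    score = grooming_risk_score(message)
    return Decision(score=score, send_to_human_review=score >= REVIEW_THRESHOLD)
```

Routing above-threshold content to human review, rather than acting automatically, is a common way to balance recall against the cost of false positives.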

Use of age assurance

Age assurance plays a critical role in reducing children’s exposure to online risks.

Tech Coalition members use a variety of approaches — from AI-driven tools to verified ID checks — to help create safer, age-appropriate digital spaces.

Members' use of age assurance in 2024

In 2024, several members partnered with third-party vendors to enhance age assurance processes.

Use of Safety by Design approaches

Safety by Design focuses on the ways tech companies can minimize online threats by anticipating, detecting, and eliminating online harms before they occur.

Members' use of Safety by Design approaches in 2024

Use of deterrence measures

In addition to proactive detection measures, Tech Coalition members implement a range of deterrence strategies to discourage and prevent OCSEA on their platforms.

These measures aim to encourage minor victims to seek support and to provide appropriate resources that discourage potential offending.

Members' use of deterrence measures in 2024

Wellness programs for teams

Given the sensitive nature of triaging OCSEA, comprehensive wellness initiatives are essential to support team members’ long-term well-being and overall sustainability.

Tech Coalition members are committed to the safety of their teams. Interventions in 2024 included expanded wellness programs and resources, content moderation support and exposure reduction, dedicated mental health support for moderators, structural and policy adjustments for well-being, and continuous improvement with tailored support.

Members with wellness programs for teams in 2024

41 of 45 members provide wellness programs and resources for their teams combating OCSEA.