
Advancing child safety through signal sharing

Every day, the world’s leading tech companies share critical intelligence on threats to child safety through Lantern, using that information to make their platforms safer by removing abusive material and bad actors.

Why Lantern was created

Bad actors can use multiple platforms in their attempts to distribute abusive imagery and exploit children online. Through Lantern, tech companies securely and responsibly share intelligence and threat indicators related to child sexual exploitation and abuse, helping them detect and address harm that might otherwise have gone unnoticed on their platforms.

Before Lantern, no reliable framework existed to coordinate industry efforts against predators exploiting multiple platforms to evade detection. Lantern now bridges that gap, strengthening collective defense across the sector.

How Lantern works


When a company detects online child sexual exploitation and abuse (OCSEA) on its platform, it can securely share a signal – such as a hash, URL, or username – through Lantern. Other participating companies can use that signal to independently review OCSEA-related activity or content on their own platforms. Based on their own policies, they can then take appropriate action to uphold their child safety standards.
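In code terms, the flow looks roughly like the sketch below. The schema and names are hypothetical (Lantern’s actual data model is not public); the point is the share-then-independently-review pattern, not the exact fields.

```python
from dataclasses import dataclass

# Hypothetical signal record -- illustrative only; Lantern's actual
# data model is not public.
@dataclass(frozen=True)
class Signal:
    signal_type: str  # e.g. "hash", "url", "username"
    value: str        # the indicator itself
    shared_by: str    # the participating company that uploaded it
    context: str      # why the sharer believes it is OCSEA-related

# Stand-in for the shared exchange: participants upload signals here...
exchange: list[Signal] = []

def share_signal(signal: Signal) -> None:
    """Uploading a signal makes it visible to other participants."""
    exchange.append(signal)

def ingest_signals(own_company: str) -> list[Signal]:
    """...and each participant pulls signals shared by the others, then
    reviews any matches against its *own* policies before acting."""
    return [s for s in exchange if s.shared_by != own_company]

share_signal(Signal("url", "https://example.com/abusive-page", "Company A",
                    "URL confirmed to host CSAM; reported to authorities"))
for signal in ingest_signals("Company B"):
    print(f"Queue for review under Company B policy: {signal.signal_type}")
```

Note that ingesting a signal triggers a review, not an automatic enforcement action – a distinction the next paragraph unpacks.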

Signals are not, on their own, proof of wrongdoing. Rather, they are pieces of information that, after careful assessment of their relevance to a violation of a company’s own policies, can help companies identify new forms of abuse or provide additional context to better understand the full picture.

These signals might even serve as the crucial piece that completes the puzzle and uncovers an active threat to children.

What are signals?

Participating companies choose which signals to share and ingest based on their policies, platform relevance, legal obligations, and use cases. 

Lantern supports two primary categories of signals: content-based and incident-based.

Content-based signals

These relate to material being shared or discussed – including known child sexual abuse material (CSAM), grooming manuals, and other illegal content in image, video, audio, or text form.

Examples include:

  • Hashes of known CSAM used to detect and prevent its redistribution
  • URLs linking to web pages that host OCSEA content
  • Keywords or codewords used by offenders to evade detection when sharing or engaging with CSAM

Content-based signals play a crucial role in preventing the rapid dissemination of harmful content, particularly CSAM, across multiple platforms.
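As a rough illustration of that hash-based workflow, the sketch below checks an upload against a set of ingested hashes. It uses an exact cryptographic hash (SHA-256) for simplicity; real deployments typically also rely on perceptual hashes such as PhotoDNA or PDQ, which match visually similar media rather than identical bytes. The hash values shown are placeholders.

```python
import hashlib

# Hashes ingested from Lantern signals (placeholder values, not real data).
shared_csam_hashes: set[str] = {
    "placeholder-digest-1",
    "placeholder-digest-2",
}

def check_upload(file_bytes: bytes) -> bool:
    """Return True if the upload matches a shared hash and should be
    quarantined pending review under the platform's own policies."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in shared_csam_hashes

if check_upload(b"uploaded file bytes"):
    print("Match found: quarantine and escalate for human review")
```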

For instance, predatory actors may store CSAM with hosting providers and share URLs to that content on social media apps.

When a platform identifies a URL hosting CSAM, it first reports the URL to the relevant authorities and can then alert other participating companies through Lantern. This allows hosting providers and other platforms to act quickly and remove the content, helping stop the spread of abuse across the internet.
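A minimal version of that URL check might look like the following; the normalization step (lowercased scheme and host, fragment dropped) is an assumption about how a platform might canonicalize URLs before comparing them against ingested signals.

```python
from urllib.parse import urlsplit, urlunsplit

# URL signals ingested from Lantern (hypothetical entry).
shared_csam_urls: set[str] = {"https://files.example.net/abc123"}

def normalize(url: str) -> str:
    """Canonicalize a URL so trivially different forms still match."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       parts.path, parts.query, ""))  # drop the fragment

def is_flagged(url: str) -> bool:
    return normalize(url) in shared_csam_urls

print(is_flagged("HTTPS://FILES.EXAMPLE.NET/abc123#section"))  # True
```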

Incident-based signals

These relate to behaviors or actors involved in child safety violations. Examples include:

  • Accounts engaged in grooming or in soliciting minors to create explicit content
  • Indicators of financial sextortion
  • Patterns suggesting coordinated abuse across services

By sharing these signals, companies help one another detect evolving threats and respond more effectively – closing the gaps predators can exploit between platforms.
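Incident-based signals generally feed review queues rather than automated takedowns. A minimal sketch, with hypothetical names throughout: match local accounts against shared indicators, then route hits to human reviewers who apply the platform’s own policies.

```python
# Account indicators ingested from Lantern (hypothetical values).
shared_account_indicators: set[str] = {"badactor123", "mule_account_7"}

# Accounts on this platform (hypothetical).
local_accounts = ["alice", "badactor123", "bob"]

# Matches are escalated for human review, not banned automatically:
# each company decides what action its own policies require.
review_queue = [acct for acct in local_accounts
                if acct in shared_account_indicators]
print(review_queue)  # ['badactor123']
```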

Lantern’s impact

Signals shared in Lantern are producing tangible outcomes and helping to protect children from cross-platform abuse.

In 2024:

  • 296,336 new signals were uploaded into Lantern
  • 1,064,380 signals have been uploaded cumulatively
  • 102,082 accounts received enforcement actions as a result of signals shared in Lantern
  • 135,077 CSEA URLs were blocked or removed
  • 81 instances of contact offenses were flagged
  • 45 trafficking instances were flagged

FAQs

Who is eligible to apply to Lantern?

Any tech company that meets the specified participation criteria and demonstrates a firm commitment to combating OCSEA is eligible to apply. Companies that already have established detection capabilities on their platforms are best positioned to join.

Technology vendors, NGOs, researchers, law enforcement, governments, or other entities are not eligible to become participants in Lantern. This decision aligns with the program’s primary objective of aiding industry in voluntary efforts to help keep their platforms and users safe.

How do I apply to Lantern?

Please fill out our interest form with your information, and we will be in contact after reviewing your eligibility.

How much does it cost to join Lantern?

Lantern is free for all participants. The program is fully funded by the Tech Coalition and its member companies.

Where is the information stored, and who hosts the tech for Lantern?

Lantern is hosted on the ThreatExchange platform, which Meta developed so that organizations can share threat information securely and in a privacy-compliant way. Meta has implemented comprehensive security measures to protect the confidentiality, integrity, and availability of all data stored on the ThreatExchange platform.

ThreatExchange was selected for use in the Lantern program after thorough review by various working groups within the Tech Coalition, as well as assessments related to security and privacy.
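For readers curious about the plumbing: ThreatExchange is exposed through Meta’s Graph API. The sketch below shows roughly what a descriptor query looks like; the API version, parameters, and response layout are assumptions to be checked against Meta’s current ThreatExchange documentation, and this is not a statement of how Lantern participants actually integrate.

```python
import requests  # third-party HTTP client: pip install requests

# Placeholder credentials -- a real ThreatExchange app access token is
# required; see Meta's documentation for how to obtain one.
ACCESS_TOKEN = "APP_ID|APP_SECRET"
URL = "https://graph.facebook.com/v19.0/threat_descriptors"

resp = requests.get(URL, params={
    "access_token": ACCESS_TOKEN,
    "text": "example search term",  # free-text search over descriptors
    "limit": 25,
})
resp.raise_for_status()
for descriptor in resp.json().get("data", []):
    print(descriptor)  # field layout varies; consult the ThreatExchange docs
```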

Lantern participants

30 companies currently participate in our Lantern program, including:

[Participant logo grid: Dropbox, OpenAI, Quora, and other participating companies.]
Get in touch

“Lantern has improved the way we combat child exploitation by enabling us to see the bigger picture across platforms. Through this coordinated effort, we’ve been able to take decisive action against harmful accounts and behavior more efficiently and at a larger scale. By sharing threat intelligence with other platforms, we’re not just protecting Discord – we’re part of a cross-platform collaboration that’s creating a safer internet as a whole.”

Jud Hoffman

Vice President of Trust & Safety

Discord