Thorn’s cover photo
Thorn

Thorn

Non-profit Organizations

Manhattan Beach, CA 32,871 followers

About us

We are Thorn. Our mission of defending children from sexual exploitation and abuse is deeply embedded within our core—a shared code that drives us to do challenging work with resilience and determination. Here, you’ll work among the best hearts and minds in tech, data, and business, creating powerful products that protect children’s futures. Unleash your own formidable talents while learning among peers and growing every day. All in a supportive environment of wellness, care, and compassion. Build your career as we help build a world where every child can be safe, curious, and happy.

Website
http://www.thorn.org
Industry
Non-profit Organizations
Company size
51-200 employees
Headquarters
Manhattan Beach, CA
Type
Nonprofit
Founded
2012
Specialties
technology innovation and child sexual exploitation

Locations

Employees at Thorn

Updates

  • Thorn

    Investigators in the Internet Crimes Against Children (ICAC) Task Force are on the frontlines of the fight against child sexual exploitation—identifying abuse, finding victims, and stopping perpetrators. With 61 task forces and more than 5,400 officers nationwide, their mission is relentless. They sift through staggering amounts of child sexual abuse material (CSAM), searching for critical evidence. Some cases take weeks, even months. Behind every image or video they analyze is potentially a child in danger. They need every advantage. That’s why many turn to Thorn’s CSAM Classifier—our AI-powered tool that helps law enforcement identify which files are likely to be CSAM. This means ICAC investigators can identify victims faster while reducing their exposure to the traumatic material. These investigators are heroes, and we’re making sure they have the tech and support to work smarter and bring justice quicker.

  • Thorn

    🚨 1 in 4 young people report being offered something of value in exchange for a sexual interaction online before turning 18. Minors are being presented with gamification-like incentives for sexual photos, chats, or activities, and this rise of commodified sexual interactions is redefining how we think about online exploitation. Today’s kids are growing up in digital environments where likes, followers, and virtual rewards hold real or perceived value, making the risks of sexual interactions harder to recognize.

    Thorn’s latest research found that among minors who engaged in a transactional sexual exchange:
    🔺 58% received money
    🔺 33% received social opportunities (online followers, party invitations, etc.)
    🔺 28% received material goods (clothing, beauty products, etc.)
    🔺 9% received gaming currency

    Digital spaces weren’t designed with the social development needs of young people in mind, yet today’s teens navigate their formative years in these environments. What can we do to safeguard children? Develop collaborative solutions that bring together technology platforms, policymakers, parents, and others across the child safety ecosystem to address these evolving dynamics and prevent harm. Everyone has a role to play in protecting kids growing up in our digital-first world. Read more about the research: https://lnkd.in/ePwxhZKx

  • Thorn

    1 in 8 teens (aged 13-17) know someone who has been targeted by AI-generated deepfake nudes, a number that’s only growing as these tools become even more accessible. For victims, the impact is devastating and lasting. Fear, shame, and a loss of control don’t disappear when the screen turns off. Yet laws and platform protections are struggling to keep up with evolving threats and shifting tactics, leaving too many without help. Thorn is committed to exposing these dangers and driving strategies to protect children online. This is a crisis that demands attention from policymakers, tech platforms, and supportive adults alike. A recent USA TODAY article tells the story of a teenage girl victimized by deepfake nudes created by a classmate. Read the full story linked below.

  • Thorn

    We know that words shape how we defend children from sexual abuse and exploitation in the digital age. And the wrong words can cause real harm. That’s why we are proud to support ECPAT International's release of the Second Edition of the Terminology Guidelines for the Protection of Children from Sexual Exploitation and Sexual Abuse. Outdated terms, such as “child pornography,” not only fail to capture the reality of the situation but also reinforce stigma, shift blame onto children, and obscure the truth about abuse. The new Guidelines serve as a valuable resource for anyone involved in writing, reporting, legislating, or working to protect children. Learn more here: https://lnkd.in/eJCZFGiK

  • Thorn

    Every parent should consider: how are my kids interacting online, particularly in gaming environments?

    Last week, our VP of Research & Insights, Melissa Stroebel, joined a powerful conversation hosted by World Childhood Foundation USA to discuss children's experiences in online gaming environments. As part of the Inaugural Spring Luncheon panel, alongside child safety ecosystem partners Ryan Ellis, Nicki Reisberg, and moderator Juju Chang of ABC News, Melissa shared what we’re hearing from kids about the threats they face in gaming and online social spaces, what parents need to know, and what we can all do to help keep children safe. 🙏 A special thank you to World Childhood Foundation USA for hosting this event, including Melissa and Thorn's perspective, and encouraging important dialogue so everyone is better equipped to do their part.

  • Thorn

    Sextortion is North America’s fastest-growing cybercrime targeting children, especially boys aged 13 to 17, and it has claimed the lives of at least 30 teen boys by suicide since 2021. USA TODAY recently shared the stories of Braden, Carter, Evan, Gavin, James, Jordan, and Riley, who were manipulated, shamed, and blackmailed by online predators. Within hours of being targeted, they felt they had no way out. Reporting on these tragedies is important. Each victim has family, friends, and a community who deeply care about them and are heartbroken by the cruelty of these crimes. Awareness is the first step in protecting kids from the devastation of sextortion, and it’s up to supportive adults in a variety of roles to do their part.

    Three things you can do right now:
    ✔ Talk to your teens. Make sure they know they can come to you, no matter what.
    ✔ Break the stigma. Being targeted is never the victim’s fault.
    ✔ Educate others. Inform youth about the tactics that online predators may use.

    Read the article to understand what these boys faced. https://lnkd.in/ee2cUP6Z

  • Thorn reposted this

    View profile for John Buckley

    Child Safety Professional

    It's nearly a year since Google publicly committed to Thorn's genAI principles (thank you Dr. Rebecca Portnoff for the continued collaboration!). I'm really proud of the work the team has done on genAI: building, testing, and protecting models with the prevention of CSAE at the fore of our minds. Google has published a whitepaper on "Responsible AI and Child Sexual Abuse and Exploitation Online" detailing how we take the Thorn principles and apply them. This is a good start on being transparent about safety measures, with more transparency required to build trust and share best practices. The most challenging modalities for CSAE risk are not widely launched yet (e.g. image editing), so we must continue to push for children's rights and safety to be at the core of our work. Huge thanks to the team behind this report: Fernanda Bernaldo, Alexis R., Lauren Rock. And a huge thank you to those leading on implementing this at Google, you do such critical work: Claire Lilley, Ashley F., Melissa Benjamin, Akshay J., Griffin Hunt, Grigory Rozhdestvenskiy, Snow S, Enrique Bravo Velázquez, Emily Cashman Kirstein. https://lnkd.in/ebhgPV8h

  • Thorn

    Thank you to everyone who participated in last week’s TOFU Challenge Marketing Hackathon in NYC, graciously hosted by @Suru Labs! The creativity, collaboration, and passion for our mission made this a meaningful and fun experience. Defending children from sexual abuse and exploitation isn’t easy. Helping the world understand and support Thorn’s work is critical, but it’s no simple task. This meeting-of-the-minds brought fresh perspectives that will surely lead to important conversations that help push our efforts forward. Thank you to everyone involved for being part of this journey with us!

    Suru Labs

    "Alone we can do so little; together we can do so much." – Helen Keller

    Yesterday, we experienced the magic that happens when 50 brilliant marketing minds come together with a shared purpose: to help Thorn spread awareness of online child sexual abuse and exploitation. The TOFU Challenge Hackathon wasn’t just about generating ideas; it was about collaboration, creativity, learning, and using marketing skills to make a real impact in the world. 8 teams of marketers with a variety of backgrounds and experience gathered in New York City to share their knowledge, collaborate, and craft marketing campaigns that amplify Thorn's mission. From the first brainstorm to the final pitches, the energy in the room was electric. ⚡ And the level of creativity and passion was inspiring. A huge THANK YOU to all the participants who spent the day pouring their talent into this challenge. 👏 Thank you to our judges Cassie Coccaro, Caryne Say, Lynn P., Emir Atli, and Joe Lazer (FKA Lazauskas), who brought invaluable insights while vetting each team's solution. And to our partners HockeyStack, PlayPlay, Digital Marketers NYC and Thorn, who made this event possible. This was more than a networking event; it was proof of what happens when diverse perspectives come together to solve meaningful problems. This is the power of collaboration, and we are incredibly grateful and honored to be able to help facilitate it. 💜 #ThornHackathon #Collaboration #MarketingForGood #TOFUChallenge

  • Thorn

    Generative AI has made content creation faster and easier than ever, but bad actors are exploiting it to produce child sexual abuse material (known as CSAM) at an alarming rate. The Internet Watch Foundation (IWF) reports a sharp rise in AI-generated CSAM (known as AIG-CSAM), with 99% found on public sites rather than hidden on the dark web. Worse, 78% of reports came from people who stumbled upon this content via forums and AI galleries. That means the vast majority of this criminal imagery is being encountered accidentally by the general public rather than through dedicated investigative efforts.

    AIG-CSAM:
    🔎 Overwhelms investigators – Law enforcement faces an uphill battle as the sheer volume of AI-generated material makes it harder to identify real victims.
    🚨 Enables new forms of exploitation – Bad actors can manipulate or create CSAM with AI, using it for grooming, blackmail, and abuse.
    🚧 Lowers barriers to harm – The ease of generating this content without a real child present fuels the misconception that it’s “harmless,” which perpetuates abuse.
    📢 Spreads dangerous knowledge – AI models can support predators by offering explicit instructions on abuse tactics, making it easier to exploit children.

    AI must have safeguards to prevent harm. We must ensure that ethical principles and mitigations are built into AI systems to protect kids.

  • Thorn

    🚨 Canadian Centre for Child Protection Inc.’s Project Arachnid receives an average of seven reports of sextortion every day. Tips like these can help young people stay on the lookout for bad actors across social platforms. In the news segment, Camillia Layne of Project Arachnid shares important tips for identifying suspicious social media accounts that may be targeting youth for sextortion. ▶️ Watch the video to know what to look for.

Affiliated pages

Similar pages

Browse jobs

Funding

Thorn: 2 total rounds

Last round: Grant, US$345.0K

See more info on Crunchbase