
What is Sextortion?

Last reviewed by Moderation API

Sextortion is coercive blackmail built around intimate imagery. An attacker obtains, fabricates, or deepfakes sexually explicit content of the victim, then threatens to publish it to family, classmates, employers, or the wider internet unless the victim sends money or produces more explicit material.

It is one of the fastest-growing online harms of the 2020s. Its financial form, which overwhelmingly targets teenage boys, has become a documented driver of youth suicide.

Two very different crimes under one name

Practitioners usually split sextortion into two dominant patterns.

Financial sextortion is a high-volume, low-dwell-time scam aimed overwhelmingly at teenage and young adult boys. A fake female persona on Instagram, Snapchat, or Wizz starts a flirty chat, quickly trades an explicit image, captures the victim's response, and within minutes threatens to send it to every follower and family member unless they pay via Cash App, PayPal, gift card, or crypto. The entire attack can escalate from first message to threat in under an hour.

Coerced or adult sextortion is slower and more targeted. A former partner, a hacker who compromised a webcam or cloud account, or a long-term groomer uses intimate material as a lever to demand more content, sexual acts, or silence, often over months.

Scale, harm, and the financial sextortion crisis

The FBI, NCMEC, and Homeland Security Investigations issued a joint public safety alert in 2022 warning of an "explosion" in financial sextortion cases targeting minors. NCMEC's CyberTipline recorded a dramatic jump in sextortion reports, from roughly 139 per month in 2021 to more than 800 per month by 2023.

The FBI has publicly linked financial sextortion to more than 20 teen suicides in the United States, with additional deaths reported in Canada, the UK, and Australia.

Much of this activity has been traced to organized crews in West Africa, particularly Nigerian "Yahoo Boys" networks, whose tradecraft has been studied by the Network Contagion Research Institute and Thorn.

Deepfakes and the TAKE IT DOWN Act

Generative AI has made sextortion cheaper and scarier. Attackers no longer need a real image. A single school photo scraped from Instagram can be turned into convincing synthetic nudes using open-source "nudify" tools. In response, the US Congress passed the TAKE IT DOWN Act in 2025, which criminalizes the nonconsensual publication of intimate imagery, including AI-generated content, and requires covered platforms to remove reported NCII within 48 hours of a valid notice. The UK's Online Safety Act and the EU Digital Services Act impose parallel obligations.

Detection and defense

Technical defenses typically combine several layers:

  • Grooming detection models that flag rapid escalation from small talk to sexual content, especially across age or geographic gaps
  • Nudity and CSAM classifiers running on uploads and DMs, with PhotoDNA and NCMEC hash matching to catch known material
  • Financial pattern detection for gift-card and crypto handle exchanges shortly after explicit content
  • In-product warnings and friction when minors receive messages from unconnected adults
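In practice these layers feed a single decision. A minimal sketch of how the signals above might be combined into one risk score; all signal names, weights, and the threshold logic are illustrative assumptions, not a real platform's ruleset:

```python
# Hedged sketch: combine hypothetical upstream moderation signals into a
# single sextortion risk score. Signal names and weights are illustrative
# assumptions, not a production configuration.
from dataclasses import dataclass


@dataclass
class ConversationSignals:
    rapid_sexual_escalation: bool     # grooming model: small talk -> sexual in minutes
    explicit_media_exchanged: bool    # nudity classifier fired on an upload or DM
    known_hash_match: bool            # PhotoDNA / NCMEC hash-list hit
    payment_handle_after_media: bool  # gift-card or crypto handle sent right after explicit content
    minor_with_unconnected_adult: bool  # age/connection gap between participants


def sextortion_risk(s: ConversationSignals) -> float:
    """Weighted sum of fired signals, clamped to [0, 1]."""
    weights = {
        "rapid_sexual_escalation": 0.25,
        "explicit_media_exchanged": 0.20,
        "known_hash_match": 0.40,
        "payment_handle_after_media": 0.30,
        "minor_with_unconnected_adult": 0.25,
    }
    score = sum(w for name, w in weights.items() if getattr(s, name))
    return min(score, 1.0)


# A conversation matching the financial-sextortion pattern described above:
# rapid escalation, media exchange, then an immediate payment demand.
high_risk = ConversationSignals(True, True, False, True, True)
print(sextortion_risk(high_risk))  # -> 1.0
```

A real system would weight the hash-match signal highest, as above, since a known-CSAM hit alone warrants escalation to human review and NCMEC reporting regardless of the other signals.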

For victims, the clearest starting point is NCMEC's Take It Down service (takeitdown.ncmec.org), which generates hashes of intimate images so participating platforms can block reuploads; Thorn's NoFiltr and StopNCII.org offer related support.
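The hash-matching mechanism behind Take It Down can be sketched as follows. Real deployments use perceptual hashes such as PhotoDNA, which survive resizing and re-encoding; the SHA-256 stand-in here only catches byte-identical copies, and all function names are hypothetical:

```python
# Hedged sketch of hash-based reupload blocking. The key property is that
# the victim submits only a hash, never the image itself. SHA-256 is a
# stand-in for a perceptual hash like PhotoDNA (proprietary), so this
# version only matches byte-identical copies.
import hashlib

blocklist: set[str] = set()


def hash_image(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()


def register_ncii(data: bytes) -> None:
    """Victim-side: add the image's hash to the shared blocklist."""
    blocklist.add(hash_image(data))


def allow_upload(data: bytes) -> bool:
    """Platform-side: reject any upload whose hash is on the blocklist."""
    return hash_image(data) not in blocklist


reported = b"<intimate image bytes>"
register_ncii(reported)
print(allow_upload(reported))        # False: known NCII is blocked
print(allow_upload(b"other image"))  # True: unrelated content passes
```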

The FBI's advice is unambiguous: stop responding, do not pay, preserve the evidence, and report to the CyberTipline and local law enforcement. Paying almost never ends the threat.
