Self-Harm Detection

Last reviewed by Moderation API

Self-harm detection is the identification of user-generated content that expresses suicidal ideation, self-injury, or eating disorder behaviors. Best practice is to route such content to crisis support resources and trained human reviewers rather than issuing blunt removals, which can compound harm for vulnerable users.
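The routing described above can be sketched in a few lines. This is a minimal, hypothetical example, not Moderation API's implementation; the function name, labels, and threshold are all illustrative assumptions.

```python
# Hypothetical sketch of routing flagged self-harm content toward
# support flows instead of blunt removal. All names and the 0.8
# threshold are illustrative, not a real API.

def route_self_harm(label: str, score: float, threshold: float = 0.8) -> dict:
    """Map a classifier's label and confidence score to a moderation action."""
    if label == "self-harm" and score >= threshold:
        # High confidence: surface crisis resources to the user and
        # queue the content for a trained human reviewer; do not remove.
        return {"action": "show_crisis_resources", "queue": "trained_reviewers"}
    if label == "self-harm":
        # Lower confidence: no user-facing action, but still route to
        # human review so vulnerable users are not missed.
        return {"action": "none", "queue": "trained_reviewers"}
    # Not self-harm content: no action, no review queue.
    return {"action": "none", "queue": None}
```

Note that neither branch removes the content outright; the design keeps a human reviewer in the loop for every self-harm flag, which is the practice the definition recommends.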