What is COPPA?

Last reviewed by Moderation API

The Children's Online Privacy Protection Act (COPPA) is the foundational US federal statute governing how online services can handle the personal information of children under 13. Signed into law in 1998 and enforced by the Federal Trade Commission, COPPA has quietly shaped the architecture of almost every consumer product that touches minors, from age gates and separate "kids" app variants to the ban on behavioral advertising in children's content. It now sits at the center of a much broader youth privacy debate.

Scope and core obligations

COPPA applies to operators of commercial websites, apps, connected toys, and online services that are either directed to children under 13 or that have actual knowledge they are collecting personal information from a child under 13. "Directed to children" is assessed holistically, considering subject matter, visuals, music, language, the presence of child models or celebrities, and advertising practices.

The definition of personal information is broader than many operators assume. It covers:

  • first and last name
  • home or other physical address
  • email address and telephone number
  • Social Security number
  • persistent identifiers, such as cookies and device IDs, used to track a user across services
  • geolocation data precise enough to identify a street name and a city or town
  • photos, videos, and audio recordings containing a child's image or voice
  • any other identifier that could permit physical or online contact with a child
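To make the breadth of that definition concrete, here is a minimal sketch of how an operator might audit the fields a signup or telemetry flow collects against COPPA's personal-information categories. The category names and the function are illustrative assumptions, not terms from the statute itself.

```python
# Illustrative category names; the statute's own list is broader and
# phrased differently (see 16 CFR Part 312).
COPPA_PERSONAL_INFO = {
    "full_name", "home_address", "email", "phone_number", "ssn",
    "persistent_identifier",  # cookies, device IDs used for cross-service tracking
    "precise_geolocation",    # street name plus city or town
    "photo", "video", "audio_recording",
}

def fields_requiring_parental_consent(collected_fields):
    """Return the collected fields COPPA treats as personal information,
    and which therefore sit behind the notice-and-consent requirements."""
    return sorted(set(collected_fields) & COPPA_PERSONAL_INFO)

print(fields_requiring_parental_consent(["email", "favorite_color", "photo"]))
# ['email', 'photo']
```

Note that seemingly innocuous analytics fields like device IDs land in scope, which is exactly what caught YouTube in 2019.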

Before any of that can be collected, an operator has to provide clear notice, obtain verifiable parental consent, give parents access to review and delete data, and implement reasonable security. Acceptable consent methods have historically included signed consent forms, credit card verification, video conferencing with trained personnel, government ID checks, and knowledge-based authentication, with the FTC regularly approving new mechanisms.
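The consent-before-collection ordering above can be sketched as a simple gate: no data collection proceeds until a verifiable consent record exists for the child. The `ConsentStore` class and its method names are hypothetical scaffolding for illustration, not a real compliance API.

```python
# FTC-recognized verifiable consent methods, per the article above.
VERIFIABLE_METHODS = {"signed_form", "credit_card", "video_conference",
                      "government_id", "knowledge_based_auth"}

class ConsentStore:
    """Hypothetical store gating collection on verifiable parental consent."""

    def __init__(self):
        self._records = {}  # child_id -> consent method used

    def record_consent(self, child_id, method):
        if method not in VERIFIABLE_METHODS:
            raise ValueError(f"{method!r} is not a recognized consent method")
        self._records[child_id] = method

    def can_collect(self, child_id):
        # Collection is blocked until a consent record exists.
        return child_id in self._records

store = ConsentStore()
assert not store.can_collect("child-1")        # no consent yet: block collection
store.record_consent("child-1", "signed_form")
assert store.can_collect("child-1")            # collection may now proceed
```

The parental access and deletion rights would extend this with read and delete paths over the same records.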

The big enforcement actions

COPPA has teeth. A handful of headline settlements define the modern enforcement landscape:

  • YouTube and Google, 2019, $170 million, the largest COPPA penalty at the time, for collecting persistent identifiers from viewers of child-directed channels without parental consent and using them for targeted advertising
  • TikTok (Musical.ly), 2019, $5.7 million, then the largest COPPA fine ever, for knowingly collecting information from users under 13
  • Epic Games, 2022, $275 million, the largest COPPA penalty on record, for COPPA violations in Fortnite, alongside a separate $245 million FTC settlement over dark patterns
  • Amazon Alexa, 2023, $25 million, for retaining children's voice recordings and geolocation data in violation of parental deletion requests

The 2025 rule update and the youth privacy wave

In January 2025 the FTC finalized the first major update to the COPPA Rule since 2013. The update expanded the definition of personal information to explicitly include biometric identifiers, required separate opt-in consent for disclosing children's data to third parties (including for targeted advertising), tightened data retention by mandating written retention policies and prohibiting indefinite storage, and strengthened the safe harbor program. Operators have a one-year transition period to comply.
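The retention change is the most mechanical of these requirements: every category of children's data needs a documented, finite retention period, so "keep forever" is no longer a representable state. A minimal sketch, with category names and periods as illustrative assumptions:

```python
from datetime import datetime, timedelta, timezone

# A written, per-category retention policy. Using timedelta makes an
# indefinite retention period impossible to express.
RETENTION_POLICY = {
    "voice_recordings": timedelta(days=30),
    "chat_logs": timedelta(days=90),
    "account_profile": timedelta(days=365),
}

def is_expired(category, collected_at, now=None):
    """True once a record has outlived its documented retention period
    and should be deleted."""
    now = now or datetime.now(timezone.utc)
    return now - collected_at > RETENTION_POLICY[category]

collected = datetime(2025, 1, 1, tzinfo=timezone.utc)
print(is_expired("voice_recordings", collected,
                 now=datetime(2025, 3, 1, tzinfo=timezone.utc)))  # True
```

A policy like this, enforced by a scheduled deletion job, is also what the Amazon Alexa settlement effectively mandated.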

COPPA is also no longer operating alone.

A sprawling "COPPA 2.0" wave has reshaped the regulatory terrain. The federal Kids Online Safety Act (KOSA) passed the Senate by an overwhelming margin in 2024 and remains active in Congress. California's Age-Appropriate Design Code Act imposes data minimization and default-private settings for users under 18, though it has been substantially enjoined on First Amendment grounds. Utah, Florida, Arkansas, and Texas have all enacted their own youth social media or age verification laws, several of which are also being litigated.

How platforms actually comply

In practice, compliance translates into recognizable product patterns: age gates at signup, entirely separate "kids" experiences such as YouTube Kids and Messenger Kids, bans on behavioral advertising and personalized recommendations in child-directed surfaces, shorter session times, and moderation tooling that blocks personal information sharing in chat. Moderation API and similar classifiers are commonly deployed to detect PII disclosure and adult-to-minor contact in children's products.
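The last pattern, blocking personal-information sharing in chat, can be sketched with two simple redaction rules. Production systems (including Moderation API's) rely on trained classifiers rather than regexes; the patterns below are a deliberately minimal stand-in for illustration.

```python
import re

# Stand-in detectors for two common PII types in children's chat.
# Real deployments use ML classifiers with far better coverage.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE_RE = re.compile(r"\b(?:\+?\d[\s.-]?){7,15}\b")

def redact_pii(message: str) -> str:
    """Replace detected emails and phone numbers before a message
    is delivered on a child-directed surface."""
    message = EMAIL_RE.sub("[removed]", message)
    message = PHONE_RE.sub("[removed]", message)
    return message

print(redact_pii("email me at kid@example.com or call 555-123-4567"))
# email me at [removed] or call [removed]
```

Redacting at send time, rather than merely flagging, is what keeps the operator from "collecting" the PII in the first place.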

The tension at the center of the current debate is clear: stronger child protection often requires knowing more about users, while age verification mandates collide with the privacy principles COPPA was written to defend.
