Legal
December 16, 2024 · 7 min read

The TAKE IT DOWN Act: What It Means for Victims

New federal law requires platforms to remove non-consensual intimate images within 48 hours. Here's everything you need to know about your new rights.

Sarah Chen
Content Protection Specialist
DMCA Process · Platform Policies

In 2025, the TAKE IT DOWN Act became federal law—the most significant legislation ever passed to protect victims of non-consensual intimate imagery. If someone shares your intimate images without consent, platforms now have 48 hours to remove them. Here's what this means for you.

What the TAKE IT DOWN Act Does

The TAKE IT DOWN Act (officially the "Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks") creates federal criminal penalties and platform requirements for non-consensual intimate imagery (NCII).

Key Provisions

  • 48-Hour Removal Requirement: Online platforms must remove reported NCII within 48 hours of receiving a valid request
  • Covers AI Deepfakes: The law explicitly covers AI-generated intimate imagery—if someone creates fake explicit content of you using AI, it's covered
  • Criminal Penalties: Publishing NCII without consent is now a federal crime with potential prison time
  • Duplicate Removal: Platforms must also remove duplicate copies of reported content
  • Minor Protection: Enhanced penalties for content involving minors

How This Changes Things for Victims

Before the TAKE IT DOWN Act

Previously, victims had to rely on a patchwork of state laws (which varied wildly), DMCA copyright claims (which required proving ownership), or platform terms of service (which were inconsistently enforced). Some platforms took weeks or months to respond. Others ignored requests entirely.

After the TAKE IT DOWN Act

Now there's a federal standard. Every major platform operating in the US must:

  1. Provide a clear process for reporting NCII
  2. Respond to valid reports within 48 hours
  3. Remove the content (if verified as NCII)
  4. Make reasonable efforts to remove duplicates

What Counts as "Non-Consensual Intimate Imagery"

The law covers:

  • Real photos or videos showing nudity or sexual activity, shared without consent
  • AI-generated "deepfake" content depicting someone in sexual situations
  • Digitally altered images (e.g., someone's face placed on explicit content)

Importantly, the law covers images that were originally taken consensually (like in a relationship) but shared without consent. The fact that you once consented to the creation doesn't mean you consented to distribution.

How to Use the TAKE IT DOWN Act

Step 1: Document the Content

Before reporting, screenshot the content and URL. Note the platform, date discovered, and any visible usernames. This documentation helps if you need to escalate or pursue legal action.

Step 2: Use the Platform's NCII Reporting Process

Major platforms have dedicated reporting flows for intimate imagery. Look for options like "Report," "Non-consensual intimate images," or "Intimate privacy violation." Don't use generic copyright or harassment reports—use the specific NCII option.

Step 3: Provide Required Information

Most platforms will ask for:

  • Links to the specific content
  • Confirmation that you are the person depicted (or authorized to report)
  • Statement that the content was shared without consent

Step 4: Wait (But Not Long)

Under the TAKE IT DOWN Act, platforms must remove reported content within 48 hours of receiving a valid request. If they don't:

  • Follow up with another report
  • Escalate to the platform's legal or trust & safety team
  • Consider filing a complaint with the FTC
  • Consult with a lawyer about the platform's non-compliance

Criminal Penalties for Perpetrators

The TAKE IT DOWN Act makes publishing NCII a federal crime. Penalties can include:

  • Fines
  • Up to 2 years in federal prison (more for repeat offenders or aggravating factors)
  • Up to 3 years if threats or harassment are involved
  • Enhanced penalties for content involving minors

These federal penalties are in addition to any state-level criminal charges, which exist in 49 states.

Limitations to Know About

Not Every Platform Complies Perfectly

While major US platforms are covered, enforcement varies. Some smaller or offshore platforms may be slower to respond or may not comply at all. The law gives you more leverage, but doesn't guarantee instant removal everywhere.

Search Engines Are Separate

Getting content removed from a website doesn't automatically remove it from Google. You'll still need to file separate removal requests with search engines. The good news: Google has its own NCII removal process that works quickly.

Encrypted Platforms Are Challenging

Platforms with end-to-end encryption (like Signal, or Telegram's secret chats) can't see the content on their servers, making removal complex. The law still applies, but enforcement is harder.

Resources

  • Cyber Civil Rights Initiative: 844-878-CCRI (2274) — 24/7 crisis helpline
  • StopNCII.org: Create hashes of images to prevent spread
  • Take It Down (NCMEC): Tool specifically for minors

Need Help with Removal?

Even with the TAKE IT DOWN Act, dealing with multiple platforms can be overwhelming. We can handle all the reporting and follow-up for you—quickly and confidentially.

Get Confidential Help →

About the Author

Sarah Chen
Content Protection Specialist

Sarah focuses on helping victims navigate the content removal process. She writes about digital rights, platform policies, and the legal landscape around non-consensual imagery.

DMCA Process · Platform Policies · Digital Rights