AI Deepfake Laws in 2025: Your Complete Legal Guide
From the TAKE IT DOWN Act to state-specific protections, here's everything you need to know about laws against AI-generated intimate imagery and your options for justice.
The legal landscape for AI deepfakes has transformed dramatically. What was once a gray area is now clearly illegal under federal law and in most states. Here's the complete breakdown of your legal protections and options in 2025.
The TAKE IT DOWN Act: Federal Protection
Signed into law on May 19, 2025, the TAKE IT DOWN Act (Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks) creates the first comprehensive federal framework for addressing non-consensual intimate imagery, including AI deepfakes.
What the Law Does
- Criminalizes publication: Knowingly publishing non-consensual intimate imagery, including AI-generated content, is now a federal crime, as is threatening to publish it
- 48-hour removal mandate: Platforms must remove reported content within 48 hours of a valid request
- Covers AI deepfakes explicitly: The law specifically addresses synthetic and digitally manipulated intimate imagery
- Duplicate removal: Platforms must make reasonable efforts to remove copies of reported content
- Minor protections: Enhanced penalties for content involving people under 18
Criminal Penalties
| Offense | Potential Penalty |
|---|---|
| Publication involving an adult | Up to 2 years federal prison + fines |
| Publication involving a minor | Up to 3 years |
| Threats to publish | Up to 18 months (adult victim) / up to 30 months (minor) |
| Repeat offenders | Increased sentences |
How to Use the TAKE IT DOWN Act
- Document the content: Screenshot URLs, note platforms, save any evidence
- Submit a removal request: Use the platform's NCII reporting mechanism
- Specify it's NCII: Clearly indicate the content is non-consensual intimate imagery
- Track the timeline: Platforms have 48 hours from a valid request to remove the content (a simple deadline-tracking sketch follows this list)
- Escalate if needed: If platforms don't comply, you can file with the FTC or consult an attorney
State-by-State Deepfake Laws
In addition to federal law, many states have enacted their own protections. These can provide additional remedies, including civil lawsuits for damages.
Leading States for Deepfake Protection
California
- AB 602 (2019): Creates civil cause of action for deepfake porn victims
- AB 1280: Extended protections and clarified AI-generated content coverage
- Damages: Minimum statutory damages of $1,500 per violation
- Attorney's fees: Prevailing plaintiff can recover legal costs
Texas
- SB 1361: Class A misdemeanor for creating deepfake porn without consent
- Penalty: Up to 1 year in jail + $4,000 fine
- Election deepfakes: Texas separately criminalizes deepfake videos intended to influence elections under SB 751 (2019)
Virginia
- First state to criminalize deepfake porn (2019)
- Class 1 misdemeanor: Up to 12 months jail + $2,500 fine
- Amended to explicitly cover all forms of synthetic intimate imagery
New York
- S.1719 (2023): Creates civil right of action for deepfake victims
- Damages: Actual damages plus statutory damages up to $30,000
- Also includes: Right to demand removal and prevent further distribution
Other Notable States
- Georgia: Felony for deepfakes depicting minors; misdemeanor for adults
- Florida: Expanded NCII laws to cover synthetic media
- Illinois: Civil remedies with potential for significant damages
- Washington: Gross misdemeanor + civil action available
- Minnesota: Criminal penalties + victim compensation fund
Criminal vs. Civil: Which Path to Take?
Criminal Prosecution
Pros:
- Perpetrator faces jail time, fines, criminal record
- Government handles the prosecution (no cost to you)
- Strong deterrent effect
- Public accountability
Cons:
- You don't control the case—prosecutors do
- High burden of proof (beyond reasonable doubt)
- Requires identifying the perpetrator
- Prosecutors may decline cases they deem difficult to prove
- You don't receive financial compensation directly
Civil Lawsuit
Pros:
- You control the case
- Lower burden of proof (preponderance of evidence)
- Can recover significant damages
- Statutory minimums in some states regardless of actual harm
- Attorney's fees often recoverable
- Many defendants settle to avoid public court records
Cons:
- Costs money upfront (though many attorneys work on contingency)
- Requires identifying the perpetrator
- Time-consuming process
- Damages are only collectible if the perpetrator has assets
Many Victims Pursue Both
Criminal and civil cases are separate. You can report to police AND file a civil lawsuit. The outcomes of one don't directly affect the other (though evidence from a criminal case can sometimes help a civil case).
Filing a Criminal Report
Local Police
- Bring all documentation (screenshots, URLs, evidence of the perpetrator's identity if known)
- Specifically cite your state's deepfake or NCII law
- Request a copy of the police report
- Follow up if you don't hear back
FBI IC3 (Federal)
File at ic3.gov. Federal investigation is more likely when:
- Perpetrator and victim are in different states
- Content was distributed widely online
- Extortion or threats were involved
- Multiple victims are involved
What If Police Won't Help?
Unfortunately, some law enforcement agencies still don't take these cases seriously. If you hit resistance:
- Ask to speak with a supervisor or specialized unit
- Contact the prosecutor's office directly
- Reach out to victim advocacy organizations (CCRI, RAINN)
- Consider hiring an attorney who can pressure law enforcement
- File with federal authorities (FBI IC3) as an alternative
Finding an Attorney
What to Look For
- Experience with NCII, deepfake, or cyber harassment cases
- Knowledge of your state's specific laws
- Understanding of platform takedown processes
- Willingness to work on contingency (common in these cases)
Resources
- Cyber Civil Rights Initiative: Maintains a list of attorneys who handle NCII cases
- Victim Rights Law Center: Provides free legal assistance to victims
- State bar associations: Lawyer referral services
Platform Obligations Under Law
The TAKE IT DOWN Act places specific obligations on covered platforms (which have one year from the May 2025 enactment to stand up their notice-and-removal process):
- Must have a clearly accessible process for reporting NCII
- Must respond to valid reports within 48 hours
- Must make reasonable efforts to remove duplicates
- Cannot require victims to provide unnecessary personal information
If a platform fails to comply, it faces potential FTC enforcement, since the Act treats violations as unfair or deceptive practices. A sketch of what a valid removal request must contain follows below.
International Considerations
If content is hosted overseas or the perpetrator is in another country:
- US platforms must still comply: Major platforms operating in the US follow US law
- Content can be delisted from search: Google and Bing will remove it from US search results
- EU protections: GDPR and the Digital Services Act provide additional remedies for EU citizens
- UK courts: Have shown willingness to issue orders against offshore content
Timeline Expectations
| Action | Typical Timeline |
|---|---|
| Platform removal (after reporting) | 48 hours (legally required) |
| Google search delisting | 1-4 weeks |
| Criminal investigation opened | Days to weeks |
| Criminal charges filed | Months (varies widely) |
| Civil lawsuit filed | Weeks (once you have an attorney) |
| Civil case resolution | Months to 1-2 years (often settles) |
Resources
- Cyber Civil Rights Initiative: 844-878-CCRI (2274) — 24/7 crisis helpline
- StopNCII.org: Create hashes of intimate images on your own device so participating platforms can block matching uploads (see the hashing sketch after this list)
- FBI IC3: ic3.gov — Federal crime reporting
- RAINN: 800-656-HOPE (4673) — Sexual assault support
- Victim Rights Law Center: victimrights.org — Free legal help
Need Help Navigating the Legal Process?
While you explore legal options, we can help get the content removed from platforms and search engines—quickly and confidentially. Many clients pursue both removal and legal action simultaneously.
Get Free Consultation →
About the Author
Sarah focuses on helping victims navigate the content removal process. She writes about digital rights, platform policies, and the legal landscape around non-consensual imagery.