New Law Bans Intimate Deepfake Images
President Donald Trump has officially signed the Take It Down Act into law, marking a significant step in the fight against non-consensual intimate images and deepfake content. The legislation targets both real and AI-generated "non-consensual intimate images" (NCII), criminalising their distribution and requiring their swift removal from social media platforms.
The law received broad support in Congress, with endorsements from major technology companies and advocates for victims of image-based abuse. However, critics warn that the law presents serious risks, including overreach and potential suppression of free expression.
Key Provisions of the Law
The law stipulates that:
- Publishing NCII, whether genuine or AI-generated, is criminally punishable by up to three years in prison and fines.
- Social media platforms are required to remove reported NCII within 48 hours and make "reasonable efforts" to eliminate all copies.
- The Federal Trade Commission (FTC) is responsible for enforcement, and companies have one year to comply with these requirements.
President Trump stated, "I'm going to use that bill for myself, too," referring to his own treatment online and prompting concern that the law could be invoked against political opponents and critics.
Supporters & Opponents
While the bill has received backing from many advocacy groups, one prominent organisation, the Cyber Civil Rights Initiative (CCRI), announced it could not endorse the legislation, warning it might give survivors "false hope" and potentially do more harm than good.
Mary Anne Franks, CCRI President, criticised the law, calling the takedown process a "poison pill" that could overwhelm platforms with false reports and undermine effective action for genuine victims. She argued that platforms might become "emboldened" to ignore reports, hampering efforts to combat abuse.
Concerns About Free Speech
Critics also fear the law could be exploited by political figures: its broad scope might be used, intentionally or otherwise, to stifle speech or target dissent. Social media platforms could also face lawsuits or legal challenges if users believe their lawful content was wrongly removed, or if courts overturn the law itself.
Enforcement Challenges
Implementation of the law could turn out to be complicated. Since enforcement will be carried out by the FTC, and given the law's ambiguous language, companies may delay compliance or challenge enforcement in the courts. Users whose lawful content is removed may eventually seek legal action, and some may challenge the law's constitutionality.
Critics also warn that the law could set a precedent for government overreach, especially if enforcement is misused for political control. As enforcement begins, legal battles are expected to unfold, shaping the future of content regulation in the digital age.
Conclusion
The Take It Down Act is a major legislative effort to combat the proliferation of deepfake images and non-consensual content online. While intended to protect victims, critics warn that its vague language and potential for misuse could threaten free speech and privacy rights.
As enforcement measures take shape, ongoing legal challenges and public debate are likely to influence how the law is applied and interpreted in the coming years.
The White House | BBC | The Verge | The Guardian
Image: Ideogram