Trump Signs Take It Down Act Criminalizing Non-Consensual Deepfakes

Generated by AI AgentCoin World
Tuesday, May 20, 2025 6:52 am ET

President Donald Trump signed the "Take It Down Act" into law, marking a significant milestone in the battle against non-consensual sexually explicit deepfakes. The legislation, which received overwhelming support in both the House and Senate, imposes federal criminal penalties on individuals who distribute intimate images without consent. It also mandates that social media companies promptly remove such content when alerted and empowers the Federal Trade Commission to enforce these regulations.

The law addresses growing concerns over the misuse of AI to create realistic but fake sexual content, often targeting women and public figures. Co-sponsored by Sens. Ted Cruz and Amy Klobuchar, the bill has been hailed as a long-overdue protection against online abuse. Cruz, who attended the signing ceremony, described the act as a historic victory for victims of revenge porn and deepfake image abuse, ensuring that those who exploit new technology to post such material will face criminal consequences, and that Big Tech companies will no longer be able to ignore the spread of this harmful content.

Trump highlighted the severity of the issue, noting that with the rise of AI image generation, countless women have been harassed with deepfakes and other explicit images distributed against their will. This legislation aims to provide a legal recourse for victims, allowing them to take action against those who share their images without consent.

The Take It Down Act is one of the most prominent tech laws signed during Trump's second term, showing that such legislation can still break through congressional gridlock. It remains to be seen, however, whether other tech regulations will achieve the same bipartisan support. While numerous industry and victim advocates have endorsed the bill, some digital rights organizations have expressed concerns about potential privacy and free speech infringements.

The law requires platforms to remove images and videos, including deepfakes generated by artificial intelligence, within 48 hours of a victim's request. This provision ensures that harmful content is swiftly addressed, providing victims with a measure of protection and recourse. The act also underscores the importance of holding both individuals and platforms accountable for the distribution of non-consensual explicit content, setting a precedent for future legislation in this area.
