The House has overwhelmingly passed the Take It Down Act, voting 409–2 in favor of the bill aimed at combating the growing threat of nonconsensual AI-generated sexual content, commonly known as deepfake porn.
The legislation makes it a federal crime to knowingly publish explicit images or videos of a person without their consent, including AI-generated deepfakes. It also requires online platforms to remove flagged content within 48 hours of a valid takedown request.
The Federal Trade Commission is charged with enforcing the takedown requirement against platforms that fail to comply. Lawmakers say the law is long overdue, given the rapid advances in AI image and video generation.
Backed by President Trump and a rare bipartisan coalition, the Take It Down Act is being hailed as a landmark step toward defending digital privacy and human dignity.
Advocates argue that deepfake porn has become an increasingly damaging weapon, disproportionately targeting women, public figures, and minors, often with devastating social and psychological consequences.
“This is about drawing a line,” said one of the bill’s House sponsors. “No one should wake up to find their face on a fake, explicit video spreading online without their consent.”
Opponents—just two representatives—voiced concerns about potential free speech implications and overreach. But supporters argue the bill carefully balances privacy rights and platform responsibility.
Having already cleared the Senate unanimously, the bill now heads to President Trump’s desk, and he has pledged to sign it, marking a significant shift in how the U.S. handles AI misuse and digital exploitation.