Take It Down Act Enacted to Deter Deepfakes Involving Sexual Content

On May 19, President Donald Trump signed the Take It Down Act into law. The Act criminalizes the sharing of non-consensual intimate imagery (NCII) and requires all covered platforms to implement a mandatory notice-and-takedown system. The purpose of the law is to protect victims of online sexual abuse.

Background

Deepfakes are highly realistic images and videos generated with AI technology that digitally replaces one person’s face with another’s, making it appear as though someone said or did something they never actually did. While deepfake technology has useful applications, such as producing commercials or generating medical imagery, its rising popularity and accessibility have led to troubling uses, including political misinformation, child exploitation, and revenge porn. As the use of deepfake technology to create sexually explicit material depicting individuals without their consent continues to grow, many states have raced to regulate its distribution. This year alone, 322 deepfake-related bills have been introduced across the states.

The federal government has also become increasingly concerned about the spread of pornographic deepfakes. Senator Ted Cruz drafted the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, now commonly referred to as the Take It Down Act, in response to a complaint by a 14-year-old high school student in Texas. She was the target of sexual harassment after a classmate shared deepfake nude images of her with her classmates through Snapchat, and Snapchat failed to remove the offending content.

The bill received strong bipartisan support, and President Trump signed it into law on May 19. The law adds a subsection to Section 223 of the Communications Act of 1934 (47 U.S.C. § 223).

Coverage and Implementation

This law makes it a crime to share NCII online. Unlawful content includes intimate images of an adult published without the adult’s consent or with intent to cause the adult harm, and images of a minor published to sexually arouse any person or to humiliate the minor. Any AI-generated intimate image shared without the subject’s consent is also prohibited.

The law carves out good-faith exceptions, such as disclosing these images to law enforcement agencies, as part of a legal proceeding, for medical purposes, to report unlawful content, or to seek support after receiving such an image.

The law also requires online platforms where NCII may be distributed to implement a notice-and-takedown system. Covered platforms include websites, online services, online applications, and mobile applications that serve the public and primarily host user-generated content. Providers of internet access, electronic mail, and services, websites, and applications that do not primarily consist of user-generated content are exempt.

These platforms must provide a clear process for their users to verify their identity, report NCII, and request the removal of NCII.

To notify a platform and request removal of NCII, a request must include:

  1. The signature of the identifiable individual;
  2. Identification of the NCII and information sufficient to locate it on the platform;
  3. A statement that the image of the identifiable individual was posted without their consent, along with any relevant information showing nonconsent; and
  4. Contact information for the identifiable individual.

The covered platforms must implement the takedown procedures by May 20, 2026.

Once a platform receives a valid removal request, it must remove the NCII within 48 hours and make reasonable efforts to identify and remove any copies of that NCII on its platform. The Act’s only protection for online platforms states that they are not held liable for removing an image reported as NCII in good faith, even if the image is lawful.

Enforcement and Penalties

The penalties for violation are steep: any person who posts the NCII of an adult can be fined, imprisoned for up to two years, or both, and any person who shares the NCII of a minor can be fined, imprisoned for up to three years, or both. The Federal Trade Commission (FTC) is responsible for enforcing the notice-and-takedown procedure, and failure to take down reported content is classified as an unfair business practice. However, questions remain about the FTC’s capacity to enforce the law effectively after President Trump’s attempt to remove two commissioners earlier this year.

Concerns

While this law was passed with good intentions and serves to remedy largely unchecked online behavior, there are concerns about how this new law will be implemented. The language of this act is ambiguous and could lead to privacy violations or online platforms overcorrecting to avoid penalties, potentially removing lawful content.

First Amendment Violations: The takedown provision is written broadly and could allow for the removal of any content that is reported. The 48 hours a platform has to act after receiving a request may not allow enough time to adequately review it, and the Act neither requires verification that reported content is actually NCII before removal nor clearly authorizes platforms to review requests before taking content down. Platforms may turn to automatic takedowns, potentially chilling protected speech.

Lack of Safe Harbors: While the Act holds online platforms accountable in a manner similar to the Digital Millennium Copyright Act (DMCA), which protects copyright holders from online infringement, there are no corresponding safe harbors for online service providers that implement the proper takedown procedures, nor any counternotice opportunity for the person who posted the allegedly infringing material to show that it is not infringing. Without clear safe harbors, platforms are exposed to liability if they do not comply with the Act’s notice-and-takedown requirements. This lack of protection may also incentivize platforms to remove all reported content, regardless of the merit of the request, to avoid legal consequences.

End-to-End Encryption: End-to-end encryption is a method of securing private messages so that only the sender and recipients can read them; the messaging service itself cannot access the messages because they are protected by encryption. Apps like WhatsApp, Signal, and Telegram are built for this kind of private messaging, and as online platforms that primarily host user-generated content, they are covered by the Take It Down Act. However, the Act does not clearly explain how platforms using end-to-end encryption are expected to comply with the notice-and-takedown requirement: a platform must either break the privacy protections these apps offer or decline to act and risk facing penalties.
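The compliance bind described above follows from how end-to-end encryption works: the platform relays only ciphertext and never holds the key. The toy Python sketch below illustrates the idea with a deliberately simplified one-time-pad-style XOR (not a real cipher, and not how any actual messaging app is implemented); it shows that a party without the shared key, such as the platform, cannot recover the message it is asked to review.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Toy XOR "cipher" for illustration only; not secure for real use.
    return bytes(b ^ k for b, k in zip(data, key))

# The sender and recipient share a key; the platform never sees it.
message = b"private message between two users"
shared_key = secrets.token_bytes(len(message))

# The platform stores and relays only this ciphertext.
ciphertext = xor_bytes(message, shared_key)

# Without shared_key, the platform cannot inspect the content it relays,
# so it cannot verify whether a takedown report actually describes NCII.
assert ciphertext != message
assert xor_bytes(ciphertext, shared_key) == message  # recipient decrypts
```

In other words, for an end-to-end encrypted service, reviewing reported content would require access to keys the service was designed never to possess.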

Conclusion

While it is encouraging to see the government empowering victims of online revenge porn and nonconsensual sexualized manipulation of their photos, the Take It Down Act still raises First Amendment and vagueness concerns. The success of the Act seemingly hinges on whether platforms can implement the new takedown provision without compromising the integrity of other content on their platforms.

The Take It Down Act is the most recent piece of legislation addressing issues created by generative AI and will likely be instructive for future bills addressing similar AI-related issues.

July 9, 2025