Insight
March 5, 2025
TAKE IT DOWN Act: Addressing Free Speech and Privacy Concerns
Executive Summary
- In his address to Congress, President Trump touted the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (TAKE IT DOWN) Act, a bill designed to prevent the unauthorized publication and sharing of non-consensual intimate images, including artificially generated ones. The bill has already passed the Senate unanimously.
- Specifically, TAKE IT DOWN would criminalize the sharing of nonconsensual intimate visual depictions and require interactive computer services such as social media companies or messaging apps to create a notice-and-takedown mechanism.
- As currently drafted, the bill’s scope is expansive: it places few guardrails against false claims, fails to specify what constitutes “reasonable” efforts to identify and remove such content, and extends to messaging services whose communications may be encrypted and thus impossible to identify and remove.
Introduction
In his March 4 address to Congress, President Trump touted the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (TAKE IT DOWN) Act, which specifically targets non-consensual intimate visual depictions (NCIVD), criminalizing their publication online and requiring covered platforms such as social media companies and private messaging apps to remove such content within 48 hours of notice from the affected individual. With the Senate passing the bill unanimously in February and the first lady adding her support for the legislation at the beginning of March, it appears likely the House of Representatives will soon take up the bill.
Despite strong bipartisan support for the underlying bill, the current language could result in the silencing of legitimate speech and threaten key protections for online users such as end-to-end encryption. First, the bill specifically targets the individuals posting NCIVD, but under current law interactive computer services could likewise be criminally liable, potentially leading to an over-removal of content and the silencing of lawful speech. Second, the bill provides few guardrails to prevent individuals from making false takedown requests. Third, the takedown provisions require “reasonable efforts” on the part of the platforms to comply, but this could lead to uncertainty regarding what steps are appropriate. Finally, many private messaging apps covered by the bill use end-to-end encryption; thus, their compliance with these takedown requirements would either be impossible to enforce or require messaging apps to break their encryption.
As the House debates the merits of TAKE IT DOWN, it could look to clarify language in the bill to minimize the negative impacts on free speech and privacy while still providing tools to victims of NCIVD.
TAKE IT DOWN Act
The TAKE IT DOWN Act includes two key components. First, the bill would criminalize the publication of NCIVD, including artificially generated images, on social media websites and messaging services, when the NCIVD is posted without the victim’s consent and is intended to cause harm. Further, the bill would create an additional offense for the publication of NCIVD of minors.
Second, the bill would require platforms to create a notice-and-takedown mechanism for victims of NCIVD, somewhat akin to the takedown mechanism for copyright infringement online. Specifically, the platform must establish a process that allows individuals to notify it about the NCIVD with written statements that the request was made in good faith. Once notified, the platform must remove the NCIVD within 48 hours and make reasonable efforts to identify and remove identical copies of the depiction. Violations by a platform would constitute an unfair or deceptive act or practice and would be enforceable only by the Federal Trade Commission (FTC), meaning individuals would have no private right of action against the platforms.
Considerations for Congress
Despite strong bipartisan support for the underlying bill and its unanimous Senate passage, lawmakers should be aware that the legislation may negatively affect free speech and privacy online. As the House debates the bill, members should consider changes to alleviate these concerns.
First, by criminalizing the use of a platform to knowingly publish NCIVD, the TAKE IT DOWN Act could make the platforms criminally liable when it is their users who post NCIVD. Under existing law, platforms are generally held to have knowledge of all the content on their service if they moderate user content. Further, Section 230, the law that protects platforms from liability if they do decide to moderate content, does not extend to federal criminal law. Therefore, under a broad reading of the TAKE IT DOWN Act, a court could find the platform in violation of the statute, despite the bill seemingly creating a separate regime for covered platforms. To rectify this, lawmakers could include clarifying language that the criminal provisions do not extend to the platforms themselves.
Second, the bill’s notice-and-takedown provisions provide few guardrails against false takedown requests, which in turn could result in the removal of legitimate speech. While individuals must provide an electronic signature and a statement explaining they are making the request in good faith, there is little to stop individuals from making false claims, impersonating the individual depicted, or claiming non-NCIVD is actually NCIVD. If a platform receives a takedown request targeting content that could even plausibly be seen as NCIVD, the platform would likely take it down to avoid the threat of violating the law. Moreover, there is almost no recourse for the individual who posted the content to keep that content online. Congress could consider imposing guardrails such as an appeals process allowing individuals to challenge a platform’s removal decision.
Third, the bill requires platforms to make reasonable efforts to identify and remove any known identical copies of such depictions. This provision presents two issues. First, platforms will lack clarity on what constitutes “reasonable efforts,” and malicious actors have techniques for bypassing content-filtering hashes, potentially exposing platforms to liability for failing to identify and remove content. Second, the bill’s provisions would extend to many communications apps that provide end-to-end encrypted services, meaning the service cannot see what is being shared and cannot remove such content without breaking its encryption. Congress could include language that provides a safe harbor for the implementation of certain hashing technologies or excludes encrypted services from the identification and removal provisions, adding clarity regarding what the bill would require of these services.
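The fragility of exact-match filtering mentioned above can be illustrated with a short sketch (hypothetical code, not drawn from the bill's text or any platform's actual system): a cryptographic hash can flag byte-for-byte identical re-uploads, but altering a single bit of the file produces a completely different hash, which is why filters built only on exact hashing are easy to bypass and why "reasonable efforts" remains ambiguous.

```python
# Hypothetical sketch: exact-hash matching for "identical copies" of a file.
# A platform might store hashes of known NCIVD files and compare uploads
# against them. Flipping one bit in the file defeats this comparison.
import hashlib

original = b"...binary image data..."              # stand-in for an image file
altered = original[:-1] + bytes([original[-1] ^ 1])  # flip one bit in the last byte

h_original = hashlib.sha256(original).hexdigest()
h_altered = hashlib.sha256(altered).hexdigest()

# The hashes no longer match, so an exact-match filter misses the altered copy.
print(h_original == h_altered)  # False
```

Perceptual (similarity-based) hashing tolerates small alterations, but the bill does not specify whether such techniques are required, which is part of the uncertainty around "reasonable efforts."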
Finally, the bill takes a prudent course of action by leaving enforcement solely to the FTC rather than creating a private right of action against platforms, as a private right of action would only exacerbate the concerns about free speech online. FTC officials, however, have begun to solicit comments about the content moderation practices of social media companies and could use enforcement as a tool to drive moderation practices in a way that would otherwise violate the First Amendment. Congress should scrutinize enforcement actions if it appears the FTC has begun to use the law as a pretext for punishing companies for unrelated practices.
Conclusion
The TAKE IT DOWN Act could soon become law, but concerns regarding the bill’s broad language persist. As the House considers the legislation, it should look to clarify language to ensure that while the bill can protect the victims of NCIVD, it does not incentivize platforms to over-remove legitimate content or abridge key privacy features such as end-to-end encryption out of a fear of potential liability.