Scope and Applicability

The TAKE IT DOWN Act (the "Act") applies to individuals and "covered platforms." A "covered platform" is defined broadly as a "website, online service, online application or mobile application" that "serves the public" and "primarily provides a forum for user-generated content, including messages, videos, images, games, and audio files." The exclusion does not apply to platforms that make NCII available in their ordinary course of business. NCII is intended to cover both authentic and digitally forged (e.g., AI-generated) content. Additionally, NCII is addressed in various state laws related to privacy, deepfake reporting, and content moderation, creating overlapping legal regimes with different requirements and definitions.

Criminal Prohibitions

Individuals who distribute authentic or digitally forged NCII involving adults are subject to fines, imprisonment for up to two years, or both. If the depicted individual is a minor, the maximum term of imprisonment increases to three years. Threats to distribute authentic NCII with the intent to intimidate, coerce, extort, or otherwise cause distress are subject to the same penalties. Threats to distribute digitally forged NCII may also face fines and imprisonment: up to 18 months for offenses involving adults and up to 30 months for offenses involving minors. Good-faith disclosures of NCII are permitted in specific contexts, such as disclosures to law enforcement, in legal proceedings or document production, in connection with medical education, diagnosis, or treatment, for the reporting of unlawful conduct, or when seeking support after receiving unsolicited intimate visual content.

Key Provisions: Takedown Procedures

Covered platforms are required to implement a notice-and-removal process for reporting and taking down NCII, and must provide clear and conspicuous notice of that process directly on their platform. A valid notice-and-removal request must include the following, in writing: a physical or electronic signature of the affected individual (or an authorized representative); identification of the specific intimate visual depiction; information sufficient to locate the depiction on the platform; a brief statement asserting a good-faith belief that the depiction was distributed without consent; and sufficient contact information for the individual. Upon receiving a valid removal request from an identifiable individual or an authorized representative, a covered platform must remove the reported NCII within 48 hours and make reasonable efforts to remove all known identical copies of the content.

Enforcement

The FTC is responsible for enforcing the Act's notice-and-removal requirements as they apply to covered platforms. A covered platform that fails to "reasonably comply with" these obligations will be deemed in violation of the FTC Act's prohibition on unfair or deceptive acts or practices (UDAP) under Section 18(a)(1)(B) of the FTC Act. Notably, the TAKE IT DOWN Act extends the FTC's enforcement authority to include nonprofit organizations, which are ordinarily outside the scope of the FTC Act.

Next Steps/Conclusion

Legal challenges to the Act's provisions are expected to implicate First Amendment questions as well as Section 230 of the Communications Act of 1934 (47 U.S.C. § 230). The Act regulates speech in the form of visual depictions on the basis of the content of those depictions. Although free-speech challenges to nonconsensual pornography laws have typically failed, the Act's broad notice-and-removal requirements present a different analysis for courts that could yield an alternate outcome. Moreover, challengers are likely to argue that Section 230 immunizes platforms from certain penalties, such as the prescribed UDAP penalties; it is unclear whether Section 230 immunity survives the Act. The Act also contains potentially problematic provisions and loopholes, mostly stemming from definitional issues related to NCII: the takedown provisions may apply to many kinds of content that are not "NCII" as otherwise defined by criminal law, opening the possibility of frivolous requests and/or removal of lawful content without sufficient protections in place. Though there is broad support for federal legislation addressing NCII among lawmakers, civil society organizations, NCII victims, and the public, disagreement remains around whether the TAKE IT DOWN Act effectively targets the problem. In general, covered platforms should take proactive steps to implement and test a conspicuous notice-and-removal process for users to report and remove any nonconsensual intimate visual depictions prior to the May 19, 2026, deadline.
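For platform engineers preparing an intake form for removal requests, the required elements of a valid notice can be modeled directly from the Act's list. The sketch below is purely illustrative: the class and field names are hypothetical, not drawn from the Act's text or any real platform API, and a facial-completeness check like this does not substitute for legal review of individual requests.

```python
from dataclasses import dataclass

@dataclass
class RemovalRequest:
    """Hypothetical model of the Act's required elements for a valid request."""
    signature: str             # physical or electronic signature of the affected
                               # individual or an authorized representative
    depiction_id: str          # identification of the specific intimate visual depiction
    location_info: str         # information sufficient to locate it on the platform
    good_faith_statement: str  # brief statement of good-faith belief that the
                               # depiction was distributed without consent
    contact_info: str          # sufficient contact information for the individual

REMOVAL_DEADLINE_HOURS = 48    # a valid request triggers removal within 48 hours

def is_facially_complete(req: RemovalRequest) -> bool:
    """Return True only if every required element is present (non-blank)."""
    return all(
        field.strip()
        for field in (
            req.signature,
            req.depiction_id,
            req.location_info,
            req.good_faith_statement,
            req.contact_info,
        )
    )
```

A request missing any element (for example, a blank good-faith statement) would fail the completeness check and should be routed back to the requester rather than queued for the 48-hour removal clock.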