In an educational legal update shared on Instagram, Georgia attorney Stephanie R. Lindsey is urging the public to take notice of a newly enacted federal law that makes it a crime to post or share non-consensual intimate content online, including images and videos generated by artificial intelligence.
The law, widely known as the Take It Down Act, marks a historic step in expanding legal protections for victims of revenge porn and digital harassment.
“This is a federal offense now,” Lindsey stated in her video post. “It is illegal to post intimate images or videos of someone online without their consent — even if those images are created by AI. And once the platform is notified, they have just 48 hours to take it down.”
What the Take It Down Act Means
The Take It Down Act, formally the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, was signed into law in May 2025. It criminalizes the knowing publication of non-consensual intimate images and videos online.
Crucially, it includes content created through AI and deepfake technology — a major concern in an age where artificial intelligence can convincingly replicate someone’s face or body in fabricated intimate scenarios.
The law empowers victims by providing a clear recourse: they can report the content directly to the platform hosting it, which is now required by law to remove such content within 48 hours of a valid request. Platforms that fail to comply face enforcement by the Federal Trade Commission, and individuals who publish or continue to distribute the material face federal criminal charges.
“This law helps address a devastating gap in our legal system,” said Lindsey. “Too many people — especially women and minors — have been targeted with these kinds of abusive tactics, and AI has only made the threat more widespread.”
AI Deepfakes Add New Legal Urgency
Legal experts say the inclusion of AI-generated content within the scope of this law is significant. Deepfake technology, often used to create explicit videos from a person's likeness without their knowledge or consent, has grown rapidly in sophistication and availability, making it easier for perpetrators to commit digital sexual abuse with little technical expertise.
By classifying AI-generated non-consensual intimate content as illegal, the law reinforces that the harm to victims is the same regardless of whether the image is real or fabricated.
“Just because something is fake doesn’t mean it’s harmless,” Lindsey added. “When your face is used in an explicit video — AI-generated or not — the trauma is real. The damage is real. And now, the consequences are real too.”
Victims’ Rights and Legal Recourse
Under the Take It Down Act, victims can file a removal request directly with the platform hosting the content, and companies are obligated to take the material down within the law's 48-hour deadline. Victims can also consult an attorney or report the offense to federal authorities, who can pursue criminal charges against the person responsible for the upload or distribution.
Stephanie R. Lindsey, a long-time advocate for digital safety and personal agency, emphasized that this law is a major win — especially for vulnerable communities.
“This law is about dignity, consent, and giving people the power to take control of their digital identities,” she said. “No one should have to suffer because someone else decided to violate their privacy.”
A Call for Awareness and Enforcement
While the law is a step in the right direction, Lindsey and other legal professionals stress that public awareness and consistent enforcement will be key.
“The more people who know their rights, the more power we take back,” she concluded. “And the more we hold people accountable for weaponizing technology in this way, the safer the internet will be for all of us.”
As the legal community continues to respond to the intersection of technology and privacy, the Take It Down Act signals a new era in which federal law keeps pace with digital abuse — and begins to deliver justice in the most personal of violations.