The Key To Stopping Revenge Porn Just Might Be Melania Trump
The first lady is advocating for a federal bill criminalizing the online posting of intimate imagery, both real and fake, emphasizing the ‘heartbreaking’ impact on teenagers, particularly girls.

The locker room was supposed to be safe. A shower, a towel, the comfort of routine — until a phone clicked: teenage bickering and jealousy taken to a dangerous extreme.
By the time the then-16-year-old school athletics star, whose family requested anonymity, stepped into the hallway, the nonconsensual shower image was already spreading. Parents, police, and lawyers were summoned. The college hopes of the perpetrators crumbled. The psychological torment lives on.
This is hardly an isolated story.
Revenge porn, also called nonconsensual porn or the nonconsensual distribution of intimate images (NDII), is the sharing of intimate images without the subject's consent. While not always driven by revenge, it is often used as retaliation after a failed relationship.
As sexting rises, so does this form of abuse, fueled by an estimated 2,000 dedicated websites and countless victims whose private images remain exposed. There is no federal law that outlaws this abuse.
The First Lady has made it her platform to change that.
“This toxic environment can be severely damaging. We must prioritize their well-being by equipping them with the support and tools necessary to navigate this hostile digital landscape,” Mrs. Trump said earlier this month during a roundtable while promoting the “Take It Down Act” at the United States Capitol.
The bipartisan legislation introduced last year by Senators Cruz and Klobuchar aims to combat nonconsensual intimate imagery, including artificial intelligence-generated deepfakes, by criminalizing its distribution and requiring platforms to remove such content within 48 hours.
The bill passed the Senate unanimously in December 2024 but stalled in the House, leading to its reintroduction in January 2025 by Congresswomen Maria Salazar and Madeleine Dean. As of March 2025, the Act remains under House consideration, reflecting ongoing bipartisan efforts to address online exploitation.
In her first solo public appearance since returning to the White House, the First Lady urged Congress to prioritize youth well-being.

The Take It Down Act would make it a federal crime to publish, or threaten to publish, intimate images, including AI-generated ones, without consent. It would also require social media platforms, upon a victim's request, to remove such content within 48 hours and prevent its reappearance.
“Revenge porn is a huge problem that is only getting worse,” a retired Supervising Investigator for San Diego County, Cathy Garcia, tells The New York Sun. “Smartphone technology is the main culprit…. With great information exchange and many positive aspects, porn is one of the negative aspects. Users of technology have gotten younger and younger, causing a wider spread of victimization.”
An Epidemic of Exploitation
One in 25 Americans has had nearly nude or explicit images of themselves shared without consent, or been threatened with such exposure, according to the Center for Innovative Public Health Research. As the rise of sexting intersects with digital abuse, revenge porn has become a widespread crisis, particularly for girls and young women.
The real numbers are likely higher.
“It’s very prevalent, but there’s no solid number on it because it’s underreported. A lot of victims comply with demands out of fear or embarrassment, or they beg their exes to take the content down rather than going to court,” Sexual Assault Investigative Training & Consulting specialist and retired San Diego Police Department Detective Carlton Hershman tells the Sun.
“Many avoid legal action because they fear the images will be shown in front of a jury.”
Research by the Cyber Rights Organization underscores that 90 percent of revenge porn victims are women, with the majority having shared intimate imagery with a trusted individual, only for it to be used against them.
The problem soared to epidemic proportions during the Covid pandemic. The isolation caused by lockdowns, along with rising unemployment and stress, created fertile ground for tech abuse, making people more vulnerable to nonconsensual pornography. As social distancing redefined relationships, the nude selfie became a substitute for sex, amplifying the risks.
Alarmingly, this issue is affecting younger generations: 15 percent of K-12 students report knowing of a nonconsensual intimate imagery incident involving a peer.
Ms. Garcia stressed that “the only way to keep nude images private is to not take them at all,” but our changing technological landscape adds another layer of complexity to the matter.
The ascent of “deepfake technology” has only worsened the problem. A report from Sensity AI found that 90 to 95 percent of all deepfake videos online were nonconsensual porn, almost exclusively targeting women. As digital exploitation surges, urgent legal and technological measures are needed to curb this growing crisis.
Texas teen Elliston Berry joined Mrs. Trump on the panel, explaining that in 2023 — at just 14 years old — she woke to messages from a friend informing her that AI-generated nude images of herself were circulating on social media, altered from a past Instagram photo. Shocked, scared, and filled with shame, Ms. Berry initially blamed herself and struggled to tell her parents.

Despite her family’s efforts to get the images removed from Snapchat, they received no help. It wasn’t until Senator Cruz’s office intervened that the photos were taken down. Ms. Berry and her family now advocate for holding Big Tech accountable for the harm caused by such exploitation. The Act seeks to address this by forcing social media companies to act swiftly once a complaint is filed.
Murky Legalities
While there is no federal regulation, a gap the Take It Down Act endeavors to close, every state except South Carolina has a law against revenge porn, though the statutes differ widely in how they define and punish it.
In some high-profile cases, victims have won large judgments, such as a Texas woman who was awarded $1.2 billion in 2024 after her intimate photos were posted without consent.
Most state laws, however, don't protect against deepfake images, especially those involving children, and there is currently no federal law covering AI-generated child abuse material.
While Virginia, Texas, and California have updated their laws to include deepfakes, most victims remain unprotected.
Helpline Practitioner at the Revenge Porn Helpline, Amanda Dashwood, tells the Sun that the preferred term is “intimate image abuse,” because the word “revenge” “implies the victim has done something to deserve this abuse.”
“Since the helpline’s launch ten years ago, we have received over 400,000 intimate image reports, covering more than 60,000 individual cases, with an average annual increase of 57 percent in reported incidents,” she continued. “Reports of synthetic image abuse have jumped by 400 percent since 2017, but it still makes up a small portion of overall intimate image abuse. Unfortunately, this abuse shows no sign of decreasing.”
As the issue of nonconsensual pornography gains more attention, concerns over the impunity of perpetrators continue to rise.
A stark example is a former Democratic candidate for the Kansas House of Representatives, Aaron Coleman. Five years ago, the then-19-year-old admitted to threatening to share explicit images of a 13-year-old girl and ultimately following through, disseminating the revenge porn. Despite this revelation, Mr. Coleman beat a seven-term incumbent and served as a member of the Kansas House of Representatives for the 37th district.
In November 2021, he was arrested on suspicion of driving under the influence, just a month after facing arrest and charges for domestic battery. Mr. Coleman’s term in the Kansas State House ended in January 2023 after he chose not to seek re-election.
His case underscores the broader issue of insufficient consequences for those who distribute revenge porn.
“Juries and society quickly victim blame, ‘she shouldn’t have been naked in their house. She shouldn’t have sent them body images or allowed them to take body images of her.’ It is important to show the negative impact on the victim when taking an image case before a jury,” Ms. Garcia noted. “There also can be challenges as many times the victim and perpetrator were intimately involved.”
Despite the profound harm caused by nonconsensual pornography, penalties for offenders vary significantly across America.
In California, first-time offenders may face fines and community service, with repeat offenses potentially leading to jail time. Texas classifies such offenses as state jail felonies, with fines contingent on the perpetrator’s intent. Washington, D.C., allows penalties of up to 180 days in jail for the distribution of intimate images without consent.
Nevada treats these offenses as felonies, while Pennsylvania’s penalties differ based on the perpetrator’s age.
This inconsistency highlights the urgent need for standardized, robust protections against nonconsensual pornography nationwide.
Meanwhile, victims of nonconsensual pornography often experience trauma similar to that of sexual assault survivors. Along with psychological harm, victims face severe consequences such as public exposure of their personal information, threats, harassment, job loss, and even forced name changes. These experiences highlight the deep and lasting impact of revenge pornography.
Concerns Over Data Privacy
Yet not everyone is for the bill.
Digital privacy experts have cautioned that, although progress is necessary to address real and artificially created nonconsensual intimate images, the bill’s absence of clear protections could jeopardize user privacy and freedom of speech.
“In its current form, the bill creates a notice and takedown (NTD) mechanism that would result in the removal of not just nonconsensual intimate imagery but also speech that is neither illegal nor actually NDII,” a group of leading digital privacy organizations wrote in a letter to the Senate last month. “This mechanism is likely unconstitutional and will undoubtedly have a censorious impact on users’ free expression.”
Mr. Hershman cautioned that there are pluses and minuses to making this a federal offense.
“The FBI doesn’t want to handle it because it would add to their already heavy workload, and these cases can be difficult to investigate,” he explained.
“State law enforcement would likely respond more quickly. However, the challenge at the state level is that very few officers are trained in handling these cases. Obtaining search warrants is another issue, especially when dealing with international companies. A potential federal law could help if it enabled collaboration between U.S. law enforcement and foreign agencies.”
In his view, “the best outcome for victims right now is fast removal before the content spreads.”
“However, that doesn’t prevent someone from reposting it on a different platform, potentially in another country. I’ve had cases where content was removed, only to resurface a year later from an entirely different source, making it nearly impossible to track and enforce,” he continued. “Until stronger legal frameworks are in place, the focus remains on damage control rather than long-term prevention.”
An Unknown Future
The Take It Down Act, if passed, would offer protection to victims of nonconsensual intimate image abuse, including both real and AI-generated “deepfake” pornography. Observers claim that the bill would likely have been law already if it had not been for a last-minute objection by Senator Booker of New Jersey last year.
Mr. Booker offered no explanation for his objection, leaving the Act’s champion — Mr. Cruz — seething on the Senate floor.
The bill’s proponents now hope that, with renewed attention from Mrs. Trump, “Take It Down” will finally make it onto the books.
“Current laws fail to protect victims long-term. Legal loopholes let criminals retain images even after conviction. Court-ordered deletion and the destruction of devices containing NCII are essential to protect survivors from further abuse,” Ms. Dashwood added. “The Revenge Porn Helpline has found over 30,000 intimate images that can’t be removed due to legal gaps, international boundaries, and non-compliant websites.”