Revenge Porn Victims, Experts Share Powerful Testimony During Hearing in Dallas

June 27, 2024

Victims advocate for Congress to pass Sen. Cruz’s TAKE IT DOWN Act to force websites to remove explicit images and make publishing such images a federal crime

WASHINGTON, D.C. – U.S. Senate Commerce Committee Ranking Member Ted Cruz (R-Texas) yesterday held a field hearing in Dallas, Texas, titled, “Take It Down: Ending Big Tech’s Complicity In Revenge Porn.” During the hearing, victims of revenge and deepfake pornography, as well as victim advocates and experts, shared powerful testimony about how these incidents forever changed their lives. The witnesses also advocated for Congress to pass Sen. Cruz’s recently introduced bipartisan Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (TAKE IT DOWN) Act.

The TAKE IT DOWN Act would require social media sites and similar websites to remove non-consensual intimate imagery within 48 hours of receiving notice from the victim and would make publishing such material a federal crime. Read more about the legislation HERE.

To watch the full field hearing, CLICK HERE.

Below are highlights from yesterday’s field hearing.

Ms. Elliston Berry, High School Student and Victim of AI-Generated Sexually Exploitative Imagery, Aledo, TX

Ms. Berry on her personal experience with AI-generated sexually exploitative imagery:

“I was fourteen years old when a fellow classmate created AI nudes from just an innocent photo on Instagram. I was fourteen years old when I was violated all over social media. I was just fourteen years old when I feared my future was ruined.”

“As I attended school, I was fearful of the reactions and opinions people had. We live in a society that is built on social media, so I had been convinced at least the whole school had seen these images. And to this day, the number of people that have these images or had seen them is still a mystery. As it took eight and a half months to get these images off Snapchat, that doesn’t wipe the photos off people’s devices. Every day, I will live in fear that these images will resurface, or someone could easily re-create [them].”

Ms. Berry on the importance of holding Big Tech accountable for the publication of non-consensual intimate imagery:

“As a victim of AI deepfakes, it took a toll on my mental health. AI crimes are just increasing – they’re getting more accessible and more normalized. So far, there’s not much that us as normal people can do. Holding tech companies accountable for allowing these photos to still be up is really important.”

Mrs. Anna McAdams, Mother of Ms. Elliston Berry, Aledo, TX

Mrs. McAdams on how the TAKE IT DOWN Act will protect girls like her daughter:

“This is why the TAKE IT DOWN Act is crucial. This bill would hold even minors accountable with jail time for this crime. And, it would require Snapchat and other social media apps to take images down within 48 hours. As of two weeks ago, Snapchat had not responded to the warrant issued by our sheriff’s department, nor to any of my requests online. When I met with Senator Cruz’s office two weeks ago, they were able to get ahold of Snapchat and get the accounts and images taken down. It took eight and a half months. If we had been Taylor Swift, they would have come down immediately. This bill gives us a voice we didn’t have before.”

Mrs. McAdams on the need to hold Big Tech accountable for non-consensual intimate imagery:

“As common people that live every day, we need to be able to get ahold of these companies, and get them to take our images down, and not have to know someone in order to protect our children.”

Ms. Francesca Mani, High School Student and Victim of AI-Generated Sexually Exploitative Imagery, Westfield, NJ

Ms. Mani on her personal experience with AI-generated sexually exploitative imagery and the need for federal laws to protect her and other victims:

“What happened on October 20th to me and the other girls is unacceptable. No child, teenager, or woman should ever experience what we have experienced. I initially felt shocked, then powerless and angered by the absence of laws and school policies to protect us. Now, I am determined to push for reforms.

“The obvious lack of laws speaks volumes. We girls are on our own, and considering that 96% of deepfake AI victims are women and children, we’re also seriously vulnerable and we need your help.”

Ms. Mani on the need for schools to update their policies on AI-generated images:

“My school didn’t have any updated AI school policies and I think that’s very important. Because if I had that AI school policy, I wouldn’t be here stating the obvious. AI school policies protect us and our schools, and I think every school in the U.S. should update their policies.”

Ms. Mani on her school’s failure to hold the perpetrators accountable:

“[The boys who did this] are still attending my classes, which is completely unfair. And I just want to say also, my principal is a woman, she’s a mother, and she should be sitting here right next to me fighting for laws protecting her students against what has happened. Just the whole school administration didn’t handle this correctly.”

Ms. Hollie Toups, Victim of Non-Consensual Intimate Imagery, Austin, TX

Ms. Toups on her personal experience with non-consensual intimate imagery:

“But it wasn’t just pictures. There were comments, threats, personal information about me. Immediately, I felt unsafe in my own home and uncomfortable in my own skin. I was terrified, helpless and I just wanted to climb in bed, pull the covers over my head and never come out. I spent the next few days bouncing back and forth between panic, anger, embarrassment, and being completely devastated.”

“For months after, I checked the internet every day to make sure my photos were not back up. Every day, it controlled me. I often think of threats and messages I received on that website and on social media and of others who have gone through the same thing. It is hard to put into a few minutes what that year was like. These actions can inflict long-term harmful psychological, personal, and social repercussions for victims. I have gone through a lot of therapy to get past it. I am immensely grateful for my support system during that time, and for those who fought for me, as I am not sure where I would have ended up without them.”

Ms. Toups on the need to pass the TAKE IT DOWN Act:

“I know there have been variations of laws across states to try to combat this, but I think [the TAKE IT DOWN Act] really encompasses all victims and it’s catching up with the times because when the laws were beginning to get passed, we didn’t have the AI problem. I think the take down provision is really vital because . . . the sooner you can get them down, the less likely they are to spread. And then also the mention of the threat of the posting. That is just as bad. Living under the threat that someone is going to post [these images] is just as harmful to victims as the posting. I think it’s a great bill and I hope it goes through.”

Ms. Andrea Powell, Advocate and Expert on Sexual Exploitation

Ms. Powell on the need for federal legislation to protect survivors of deepfake abuse:

“I have worked with and alongside many survivors of deepfake abuse here in the United States and globally. I know this to be true: no survivor should stand alone in the face of their own abuse and injustice. Today, I share my testimony with the determination that they won’t have to in the United States much longer. The United States needs federal legislation that creates protection and justice for every survivor. The time is now, survivors are waiting.”

Ms. Powell on the importance of holding Big Tech and websites accountable for taking down non-consensual intimate imagery:

“And I’ve been told directly by tech companies, without legislation, they will do nothing . . . so it’s really about bringing technology and law together to support and listen to survivors.”

Ms. Powell on how the TAKE IT DOWN Act will make an impact:

“Law enforcement can’t serve what’s not in the kitchen. If there’s not legislation on the books on a state and federal level, then they don’t have the capacity to engage. Oftentimes, they’d like to, and they can’t. This legislation creates a through line, and honestly, sends an important message to tech platforms—I don’t care if they’re big or small, to be honest with you—that we are taking this seriously. Up until now, they’ve been in charge of what gets taken down or not. And when good stakeholders get involved, good things happen. Otherwise, it doesn’t. Those 9,000 websites that are explicitly designed for deepfake abuse absolutely do so with the joy of knowing they’re causing harm. They have to be held accountable and so do the platforms that host them.”

Mr. Stefan Turkheimer, Vice President for Public Policy, Rape, Abuse, & Incest National Network (RAINN)

Mr. Turkheimer on the emotional and psychological distress that victims endure:

“The victims of non-consensual intimate image distribution often endure significant emotional and psychological distress. This includes feelings of shame, guilt, anxiety, and depression. In severe cases, the distress can lead to self-harm or even suicide. The emotional toll on victims underscores the need to address these problems and for [the TAKE IT DOWN Act] to become law.”

Mr. Turkheimer on the need for federal legislation to address non-consensual intimate images:

“Criminalizing the distribution of non-consensual intimate images serves as a deterrent against malicious behavior. It sends a clear message that such actions are unacceptable and punishable by law, potentially preventing future incidents and protecting individuals from similar harm.”

Mr. Turkheimer on Big Tech’s refusal to protect victims:

“Intimate images have been around forever. Deep fakes are new. It’s the easy creation and distribution of these images that has created the real problem. We are standing on the precipice of proliferation of these images. We know how harmful they can be. But right now, there is nothing requiring the tech companies to fix the problems they facilitate. Last week I told the story of a federal prosecution for identity theft. A woman had been in a brief relationship with a Navy captain and shared images of herself in the context of that relationship. After the relationship ended, he created a Facebook profile of her, and friended all of her friends and new coworkers, members of her softball team, people at her gym, and shared those photos and more with them. There was enough evidence for a Federal Prosecutor to bring the case and for the jury to convict this person of identity theft, but Facebook, who was asked 400 times to take down the photos, would not because they believed the fake profile was more real than the actual person. They simply aren’t going to fix the problem themselves.

“They need this bill. Survivors need this bill. We need this bill.

“Having legal recourse provides victims with a means to seek justice and hold perpetrators accountable. It also validates the experiences of victims, acknowledging the wrong done to them and offering a pathway to closure and recovery.”

Mr. Turkheimer on the importance of passing the TAKE IT DOWN Act:

“Big tech, little tech, middle sized tech, none of them are really prioritizing the protection of victims. And having a criminal recognition of this crime is a recognition of the harm that it caused. It’s a recognition of the victims that are produced. And to get tech to take these things seriously . . . in order to remove [the images], it requires that criminal element of the charge.”

###