Editor’s note: Laurie Segall is a longtime technology journalist and the founder of Mostly Human Media, an entertainment company that produces documentaries, films and digital content focused on the intersection of technology and humanity. She is the author of “Special Characters: My Adventures with Tech’s Titans and Misfits.” Previously, she served as CNN’s senior technology correspondent. The views expressed in this commentary are her own. Read more opinion on CNN.
CNN —
AI-generated sexually explicit photos of pop superstar Taylor Swift have flooded the internet. And no, we should not calm down.
Swift may be one of the most famous women in the world, but she represents all women and all girls when it comes to what’s at stake in the future of artificial intelligence and consent.
I’ve been covering the impact of technology for nearly 15 years, and I believe sexually explicit deepfakes are one of the most significant threats we face from advances in AI. The proliferation of AI generation tools and Silicon Valley’s race to innovate have put this technology within everyone’s reach, and the stakes have never been higher.
We are in an era where it is not just our data that is up for grabs, but our most intimate qualities. Our voices, faces and bodies can all be imitated by AI. Simply put, our humanity can be used against us with just a click.
And if it can happen to Swift, it can happen to you too. The biggest mistake we can make is believing that only public figures are vulnerable to this kind of harm. We are currently witnessing the democratization of the image-generation apps that enable this behavior. Did the person you love reject you? There’s an app for that. Now you can digitally undress her and create explicit deepfakes featuring her likeness.
The problem will only get worse as we move into an augmented, virtual world. Imagine an immersive environment where a scorned ex invites others to watch a sexually explicit deepfake of the girl who rejected him. Earlier this month, it was reported that British police were investigating the case of a 16-year-old whose avatar was allegedly raped by multiple attackers in a virtual world.
I recently spoke with Dr. Mary Anne Franks, a professor at George Washington University who specializes in civil rights, technology and free speech. She issued a chilling warning: these kinds of apps and AI tools could create a new generation of young people with a “your wish is AI’s command” mindset. If we are not careful, we will create not only a new generation of victims, but also a new generation of abusers.
“We just created all these tools, and they’re being used by young people who are confused, resentful and angry, instead of [them] trying to figure out what it means to deal with rejection in a healthy way,” Franks said.
Using advances in technology to shame women is nothing new. In 2015, I created a series for CNN called “Revenge Porn: The Cyber War on Women.” At the time, nonconsensual pornography was rampant, with scorned exes and bad actors publishing nude photos of women on websites designed to humiliate them. Just like now, the law had not yet caught up, and technology companies had not yet made changes to protect victims.
During that investigation, I’ll never forget seeing a website hosted on the dark web that featured nonconsensual pornography of teenage girls. A security researcher who specializes in online abuse (and in tracking abusers) showed me the depth of the problem through forums and images I will never be able to unsee. On one site, perpetrators compromised teenagers’ webcams and coerced them into performing sexual acts, threatening to send the recorded private images to all of their classmates if they did not comply.
Fast forward to 2024. Imagine your teenager receiving a sexually explicit video of themselves in their DMs. They never recorded the video, but advances in deepfake technology make it nearly impossible to tell whether it’s real or fake. In a world where AI has made fiction so believable, the line between the truth and our perception of it blurs. And the feelings of shame, loss of control and helplessness do not go away just because the images and videos are not technically “real.”
Swift’s deepfake nightmare is just the tip of the iceberg, highlighting the existential threat facing women and girls. X may have removed the viral post (after it was viewed tens of millions of times), but there are still many alternative sites that specialize in this type of exploitative content. One site in particular, which receives millions of views per month, has a page of sexually explicit deepfake videos featuring Swift and other celebrities who never consented to their likenesses being used for pornographic purposes.
Putting the genie back in the bottle is difficult, and the available remedies come at a cost. On Saturday, searches for Swift were blocked on X, and the company told CNN the move was temporary and meant to “prioritize safety.”
In order to protect one of the most famous women on the planet, X had to make her temporarily invisible. The move may be temporary, but the message has a lasting impact: if one of the most famous women in the world has to disappear online for her own safety, what does that mean for the rest of us?
I’ve been thinking a lot about what actually moves the needle.
From a policy perspective, a small number of states have enacted laws prohibiting the creation and sharing of these types of sexually explicit deepfakes. These laws vary in scope, so where the abuse occurs makes a difference in whether charges can be brought. For example, if Swift were to press charges as a New York resident, New York law requires proof of intent to harm the victim of this type of abuse, which is becoming increasingly difficult to establish with AI-generated sexually explicit images, Franks said.
“As with other forms of image-based sexual abuse, intent to harm is a very specific requirement, because there are many other motives, [including] sexual gratification, making money, gaining notoriety or gaining social status,” Franks said. “Laws that require intent to harm give all of those perpetrators a free pass.”
To file criminal charges, Swift would need to track down the culprit, which is expensive, difficult to accomplish, and risks further exposure. This illustrates the reality that even in states with existing laws, the path to prosecution is extremely complex.
“People like Taylor Swift have lawyers and people who can help them with this,” Franks said. “The average victim will not receive any assistance.”
Franks cited the bipartisan Preventing Deepfakes of Intimate Images Act, a federal bill that would criminalize the nonconsensual sharing of sexually explicit digital images and provide civil remedies for victims, as an ideal model. Because laws banning deepfakes are difficult to enforce, a Vermont lawmaker recently introduced a bill that would hold developers of generative AI products liable for the reasonably foreseeable harms they produce. These are first steps, but the technology is developing faster than the legislation.
It doesn’t seem fair to ask Swift to be our spokesperson on this issue, but she, and the powerful coalition of fans who share her cause, may be our best bet to build the momentum needed for meaningful change. Don’t get me wrong: there would be huge hurdles and personal costs for any woman, even one in Swift’s position. But considering she has already reinvented the music industry and created a micro-economy through her touring, I wouldn’t put anything past her.
In Swift’s world, injustice is a stepping stone, heartbreak becomes an anthem and every disappointment is an opportunity for growth. Hopefully, we can use this moment to collectively raise our voices online and sing the ultimate ballad of body consent.