
Nonconsensual deepfake porn puts AI in spotlight

By Donie O’Sullivan, CNN

In its annual “worldwide threat assessment,” top US intelligence officials have warned in recent years of the threat posed by so-called deepfakes — convincing fake videos made using artificial intelligence.

“Adversaries and strategic competitors,” they warned in 2019, might use this technology “to create convincing—but false—image, audio, and video files to augment influence campaigns directed against the United States and our allies and partners.”

The scenarios are not difficult to imagine: a faked video showing a politician in a compromising position; faked audio of a world leader discussing sensitive information.

The threat doesn’t seem too distant. The recent viral success of ChatGPT, an AI chatbot that can answer questions and write prose, is a reminder of how powerful this kind of technology can be.

But despite the warnings, there have been few notable instances, as far as we know, in which deepfakes have been successfully deployed in geopolitics.

But there is one group the technology has been weaponized against consistently and for several years: women.

Deepfakes have been used to put women’s faces, without their consent, into often aggressive pornographic videos. It’s a depraved AI spin on the humiliating practice of revenge porn, with deepfake videos appearing so real it can be hard for female victims to prove the footage isn’t really them.

The long-simmering issue exploded into public view last week when it emerged Atrioc, a high-profile male video game streamer on the hugely popular platform Twitch, had accessed deepfake videos of some of his female Twitch streaming colleagues. He later apologized.

Amid the fallout, the Twitch streamer “Sweet Anita” realized deepfake depictions of her in pornographic videos exist online.

“It’s very, very surreal to watch yourself do something you’ve never done,” she told CNN.

“It’s kind of like if you watched anything shocking happening to yourself. Like, if you watched a video of yourself being murdered, or a video of yourself jumping off a cliff,” she said.

But the deeply disturbing use of the technology in this way is not novel.

Indeed, the very term “deepfake” is derived from the username of an anonymous Reddit contributor who began posting manipulated videos of female celebrities in pornographic scenes in 2017.

“From the very beginning, the person who created deepfakes was using it to make pornography of women without their consent,” Samantha Cole, a reporter with Vice’s Motherboard, who has been tracking deepfakes since their inception, told CNN.

The online gaming community is a notoriously difficult place for women — the 2014 “Gamergate” harassment campaign being one of the most prominent examples.

But concern over the use of nonconsensual pornographic images isn’t exclusive to this community, and the practice threatens to become more commonplace as artificial intelligence technology develops at breakneck speed and the ease of creating deepfake videos continues to improve.

“I am baffled by how awful people are to each other on the Internet in a way that I don’t think they would be face to face,” Hany Farid, a professor at the University of California, Berkeley, and digital forensics expert, told CNN.

“I think we have to start sort of trying to understand, why is it that this technology, this medium, allows and brings out seemingly the worst in human nature? And if we’re going to have these technologies ingrained in our lives the way they seem to be, I think we’re going to have to start to think about how we can be better human beings with these types of devices,” he said.

It’s part of a much larger systemic problem.

“It’s all rape culture,” Cole said. “I don’t know what the actual solution is other than getting to that fundamental problem of disrespect and non-consent and being okay with violating women’s consent.”

There have been efforts from lawmakers to crack down on the creation of nonconsensual imagery, whether it is AI-generated or not. In California, laws have been enacted to counter the potential use of deepfakes in election campaigns and in nonconsensual pornography.

But there’s skepticism. “We haven’t even solved the problems of the technology sector from 10, 20 years ago,” Farid said, pointing out that the development of artificial intelligence “is moving much, much faster than the original technology revolution.”

“Move fast and break things,” was Facebook founder Mark Zuckerberg’s motto back in the company’s early days. As the power, and indeed the danger, of his platform came into focus he later changed the motto to, “Move fast with stable infrastructure.”

Whether it was willful negligence or ignorance, Silicon Valley was not prepared for the onslaught of hate and disinformation that has festered on its platforms. The same tools it had built to bring people together have also been weaponized to divide.

And while there has been a good deal of discussion about “ethical AI,” as Google and Microsoft look set for an AI arms race, there’s concern things could be moving too rapidly.

“The people who are developing these technologies — the academics, the people in the research labs at Google and Facebook — you have to start asking yourself, ‘why are you developing this technology?'” Farid suggested.

“If the harms outweigh the benefits, should you carpet bomb the Internet with your technology and put it out there and then sit back and say, ‘well, let’s see what happens next?'”

The-CNN-Wire
™ & © 2023 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.


