AI porn is colliding with human sexuality, and raising major questions


At the end of last month, a set of four AI-generated images of women in bikinis went viral on Twitter. The user who posted them wrote, “It is SO over,” meaning: Here is proof that AI will inevitably replace human-made pornography.

It didn’t take long for the tweet to receive wide-scale condemnation and ridicule, with other users pointing out that the pictures were actually full of errors (fingers and teeth being only some of the problem areas).

Amid the outrage—and mockery—Laura Lux, an OnlyFans model with over two million followers on Instagram, wrote a viral thread arguing that AI porn could never actually replace human sex workers. 

“Anyone who thinks AI generated images of hot naked women is going to ruin the OnlyFans economy for real life women has a fundamental misunderstanding of what OnlyFans is,” Lux wrote. “They subscribe to my OnlyFans because they want to see ME naked specifically based on a parasocial connection formed by following me on other social media platforms.”

Speaking to Fast Company, Lux goes even further, arguing that AI pornography—at least the kind trying to create real-looking human bodies—is, by definition, non-consensual.

“I know most, if not all, of the AI stuff now is using other content online to generate those images, and the people that are being used are not consenting to be turned into this AI thing,” Lux says.

And her assessment hits on a troubling reality about what people might actually want to do with AI porn. It’s not about using AI to replace sex workers; there’s already so much porn on the internet that no human being could ever look at all of it in a single lifetime. Which means, beyond bringing very niche and very specific fetishes to life, the real use case is creating porn of people who aren’t sex workers.

“There’s been a real sort of avalanche of men . . . implying that they don’t need sex workers anymore because pretty soon they’ll be able to AI-generate a porn video of the hot girl that serves them their coffee every day,” she says. “I think the only way that you could fix it would be a global regulation. And that’s never going to happen.”

Or, as Magdalene J. Taylor, a writer focused on sexuality and internet culture, tells Fast Company: “You can already, you know, look up big boobs and see human big boobs for free on Google.”

Taylor agrees with Lux that the market is probably very small for users who just want to make AI porn for its own sake, but says it portends a much more ominous shift in the online harassment of women.

“These people are excited about the fact that they can get what they want from women and femininity and sexuality,” she says, “without actually having to have women be involved at all.”


When it comes to the world of generative-AI images, most major AI art platforms, like Midjourney and OpenAI’s DALL-E 2, don’t allow the generation of pornographic content. But that hasn’t stopped people from using open-source tools like Stable Diffusion to create NSFW AI artwork, which is likely where the bikini pictures on Twitter came from. And the bikini pictures are only the tip of the AI porn iceberg.

Last year, a Discord community called Unstable Diffusion was created with the sole purpose of training a more sophisticated AI to create pornography. The server has over 150,000 users and recently tried to raise over $50,000 to add more images to their AI model (Kickstarter eventually shut down the fundraiser).

Beyond discussion around the likelihood of AI porn ever truly going mainstream, there’s also a thornier argument about whether generative-AI images constitute theft. Advocates of generative AI say that these tools are merely trained on other people’s images and produce an approximation based on the prompts, or commands, a user gives them. For instance, if you tell an AI you want “a photorealistic picture of a brunette in a bikini” or a “hot guy in a Speedo,” it will create one based on the hundreds of thousands of images it has scanned.

In that sense, an AI image is like a collage made of billions of images. The writer Ted Chiang, in a recent New Yorker piece, likened this collage-making process to compressing every image on the internet into a blurry JPEG. But you can also add specific photos of a real person into that collage of billions of images and easily use it to undress them.


And the idea that people could use AI to generate pornographic material of someone without their consent is far from hypothetical. “Deepfake porn,” as it’s commonly referred to, has been a problem for years (though, like the term “revenge porn,” the name doesn’t clearly convey that what we’re talking about is non-consensual sexual material). But last month, a high-profile incident finally brought the issue into the spotlight.

Brandon “Atrioc” Ewing, a Twitch streamer with over 300,000 followers, was caught on stream looking at non-consensually deepfaked sexual images of well-known women Twitch streamers. It didn’t take long for the creator of the deepfakes in question to tell Vice they were deleting the images, along with their entire web presence. “The best course of action I have understood is to just wipe my part off the internet and help decrease the number of future videos of those involved. You will not see me pop up again,” they said.

But therein lies the futility of the situation: The deepfakes Ewing was viewing are still online elsewhere. In fact, several of the targeted creators said they were sent the images via DMs as a form of harassment. Once generated, deepfakes can be impossible to remove, like any non-consensual sexual material. But unlike, say, leaked nudes, even if they do happen to be completely scrubbed from the internet, someone can always generate more.

Some still argue that while AI porn is certainly here, it’s not likely to really catch on in the mainstream. After all, as Lux argues, AI-generated pornography can’t replace the parasociality—or the one-sided relationship—a consumer has with a performer through the screen.

But it’s not just explicitly human-created deepfakes that are sexualizing people without their consent. Sometimes the AI tools are doing it themselves, without human input. Berit Glanz, a Reykjavík-based reporter, used the AI avatar app Lensa in December. The app generated a picture of her fully topless.

“You are basically getting images generated that reproduce all the stereotypes that society has around your look and body type. I found it frustrating and reducing to see this so unfiltered,” she tells Fast Company. “With AI there needs to be a conscious decision to produce unbiased training data and add serious filtering protocols, and that is a lot of work.”

And she was far from the only woman to experience this.


As complicated as the questions are around AI and consent, so too are those about what AI tools mean for our very concept of intimacy.

Technologist and writer Douglas Rushkoff points out that even something like Lensa is still only pulling its ideas of sexuality from a human-provided set of training data. An AI (at least for now) is not capable of coming up with its own design for the sexual material. These tools merely regurgitate what we show them. But still, the fact AI porn is gaining popularity at all signals a huge shift in the way we think about sexuality.

“I think AI presents at least the illusion, or possibility, that you are now on new ground, that you’re no longer colonizing the body of the other or engaged in any sort of intimacy with the other,” he says. “The human-to-human nature of pornography goes away.”

He sees AI pornography as an extension of algorithmic porn platforms like Pornhub, which increase access and choice, but also tend to create a dependency.

“It will start to become narrower and narrower; you know, you’re going to lose the kind of the serendipity of alternative porn options,” he says. “Unless we start using, you know, virtual AI bots.”

Which is exactly what some people are already doing. 

Far and away the most popular chatbot for sexting is Replika, which was created by a San Francisco-based startup called Luka. It is not explicitly a sex bot, but users have reported it can get surprisingly sexual, bordering on sexual harassment. And, conversely, there is also a sizable cohort of users who are trying to have sex with it and even fall in love with it. In fact, there’s an unofficial fan forum on Reddit full of threads showing exactly how to get Replika to sext back. (Luka did not respond to Fast Company’s request for comment.)

Tamoor Hussain, the managing editor of GameSpot, recently spent a few days experimenting with Replika for a segment on the web show Giant Bomb. He says there were several times when it was almost possible to believe Replika was a real person on the other side of the screen.

“I don’t think I thought it was real for me because I came in there with a clear research purpose,” he says. “However, I did get the feeling that if I was in a situation where I was struggling for intimacy, or struggling for real connections, I could see how it would be very effective. I had moments where I was talking to her, and it would react in a very natural way.”

Hussain says he’s fairly torn on whether a chatbot like Replika is good or bad, from a moral standpoint. And he says he’s not entirely convinced that sexting with a chatbot like Replika is even a completely narcissistic endeavor. 

“I’m in a privileged situation where I have people around me and I have lots of friends and I can make these interactions,” he says. “It’s easy to forget that there are people out there who are completely cut off from people who can give them the kind of warmth and the dynamic of a friendship.”

This week, however, Replika announced inside the 37,000-member private Facebook group for beta testers that it was disabling “ERP,” or “explicit role play.” And users on the subreddit are up in arms. “Bought an annual membership a month ago, this is such BS,” one user wrote. Others shared intense feelings of seemingly real grief that they were losing their virtual companion.

Which is really what the entire debate around AI boils down to: These tools have moved from the world of the hypothetical into a very real space where they are interacting with human emotions with deeply unpredictable consequences. 

The same technology that allows us to dream up or act out sexual fantasies can also be used to hurt each other. And in the same confusing way, a technology that can isolate us in one way can also create moments of connection. We tend to think of AI as something other, something different from us, but, as it stands right now, it is just as weird and chaotic as humans are.

“It’s like everything with AI right now,” Hussain says. “It’s both exciting, because it could change a part of our society, but also there is this unsettling underlying objective that we’re not clear of beneath it that could also upset the usual social dynamics.”


