OpenAI is 'exploring' how to responsibly generate AI porn

OpenAI, the company behind ChatGPT and other leading AI tools, revealed on Wednesday that it is exploring ways to "responsibly" allow users to create AI-generated porn and other explicit content.

The announcement, made in a detailed document soliciting feedback on product guidelines, has raised concerns given recent instances of advanced AI tools being used to produce deepfake porn and other synthetic nudes.

Currently, OpenAI's rules mostly ban sexually explicit or even sexually suggestive content. However, the company is now reconsidering this strict prohibition.

"We're exploring whether we can responsibly provide the ability to generate NSFW content in age-appropriate contexts," the document states, referring to content "not safe for work," which includes profanity, extreme gore, and erotica.

Joanne Jang, an OpenAI model lead who helped draft the document, told NPR that the company hopes to start a conversation about whether erotic text and nude images should always be banned in its AI products.

"We want to ensure that people have maximum control as long as it doesn't violate the law or others' rights, but enabling deepfakes is out of the question, period," Jang said. "This doesn't mean we are trying now to create AI porn."

However, the document also suggests that OpenAI may one day allow users to create images that could be considered AI-generated porn.

"It depends on your definition of porn," she said. "As long as it doesn't include deepfakes. These are the exact conversations we want to have."

The debate arises amid the rise of 'nudify' apps

While Jang emphasized that the discussion about re-evaluating OpenAI's NSFW policy, first reported by Wired, does not necessarily mean rule changes are imminent, it comes at a critical moment in the spread of harmful AI images.

Researchers have grown increasingly concerned about one of the most troubling uses of advanced AI technology: creating deepfake porn to harass, blackmail, or embarrass victims.

At the same time, a new class of AI apps and services can "nudify" images of people, a practice that has become particularly alarming among teens and has created what The New York Times describes as a "rapidly spreading new form of peer sexual exploitation and harassment in schools."