OpenAI's Sora can generate realistic-looking nude videos, and developers are rushing to apply a fix

OpenAI may be rushing toward a world of hurt, and could soon face lawsuits and investigations. The company isn't ruling out that its forthcoming Sora video generator might create nude videos, and that could be bad news for OpenAI.

In a sweeping interview with the Wall Street Journal about the forthcoming tool, OpenAI chief technology officer Mira Murati suggested that the company hasn't yet figured out the whole nudity issue.

“I’m not sure,” Murati told the WSJ’s reporters when asked about nudity. “You can imagine that there are creative settings in which artists might want more control over that. Right now, we are working with artists and creators from different fields to determine what’s useful and the level of flexibility the tool [should] provide.”

It’s a surprisingly candid answer that may have been overlooked during the interview’s now-infamous YouTube training data moment. Still, some experts are worried that if Sora does allow for “creative” nudity, it may open the proverbial porn floodgates.

“OpenAI has a challenging decision to make around this because, for better or worse, the reality is that probably 90 percent of the demand for AI-generated video will be for pornography,” Daniel Colson, the founder and executive director of the AI Policy Institute (AIPI), told Quartz. “That creates an unpleasant dynamic where, if centralized companies creating these models aren’t providing that service, that creates a powerful incentive for the grey market to provide that service.”

Given how easy it is to exploit AI models into producing outputs that violate their guardrails, people will almost certainly try to coax Sora into making porn anyway, which puts the world's foremost AI firm in a damned-if-it-does, damned-if-it-doesn't position.

As public polling indicates, people are not only concerned about the use of AI models to generate deepfake porn, as they were with the disgusting images of Taylor Swift earlier this year; 86 percent also believe the companies behind such easily exploited tools should be held accountable for their loose guardrails.

“That points to how the public takes this technology seriously,” Colson continued. “They think it’s powerful. They’ve seen how technology companies deploy these models, algorithms, and technologies, leading to completely society-transforming results.”

It’s a salient set of concerns that the general public seems to understand, so why doesn’t OpenAI?
