OpenAI’s Sora Is a Giant ‘F*ck You’ to Reality

AI companies really seem like they're racing to make our collective online disinformation problem terminal.

Everybody knows that online disinformation is a huge problem—one that has arguably torn communities apart, manipulated elections, and caused certain segments of the global population to lose their minds. Of course, nobody seems particularly concerned about fixing this problem. In fact, the institutions most responsible for online disinformation (and thus, the ones most well-placed to do something about it)—that is to say, tech companies—seem intent on doing everything they can to make the problem exponentially worse.

Case in point: OpenAI launched Sora, its new text-to-video generator, on Thursday. The model is designed to let web users generate high-quality AI videos with just a text prompt. The application is currently wowing the internet with its bizarre variety of visual imagery—whether that's a Chinese New Year parade, a guy running backward on a treadmill in the dark, a cat in a bed, or two pirate ships swirling around in a coffee cup.

At this point, despite its “world-changing” mission, it could be argued that OpenAI’s biggest contribution to the internet has been the instantaneous generation of countless terabytes of digital crap. All of the company’s tools are content generators, the likes of which, experts have warned, are primed to be used in fraud and disinformation campaigns.

In its blog post about Sora, OpenAI's team openly acknowledges that there could be some potential downsides to their new app. To address those downsides, the company says it's working on watermarking technologies that would flag content its generator has created. It's also interfacing with knowledgeable people to figure out how to make the inevitable deluge of AI-generated crap that Sora will unleash less toxic. Sora isn't open to the public yet and, in the meantime, OpenAI says it will create systems that reject requests to generate violent or sexual imagery. The statement notes:

We’ll be engaging policymakers, educators and artists around the world to understand their concerns and to identify positive use cases for this new technology. Despite extensive research and testing, we cannot predict all of the beneficial ways people will use our technology, nor all the ways people will abuse it.

This framing of the problem—as if OpenAI isn't quite sure how its app could possibly be misused—is sorta hilarious, since it's already totally obvious how it will happen. Once it goes live, Sora will generate fake content on a gargantuan scale—some of which will likely be used for the purposes of disinformation, some of which will, it seems undeniable, be used to aid in a variety of frauds and scams, and some of which will be used to generate toxic content of one sort or another. All of this content will flood social media channels, making it harder for everyday people to distinguish between what's real and what's fake, and making the internet, in general, a whole lot more annoying. I don't think it requires a global panel of experts to figure that out.

OpenAI has said that it wants to put meaningful limits on violent and sexual content produced by Sora, but web users have shown how savvy they can be at jailbreaking AI systems to generate the kinds of content that disobey companies’ use policies. It doesn’t seem out of the realm of possibility that the same will be true here.

There are a number of other obvious downsides to this app. For one thing, Sora—and others of its ilk—probably won't have the greatest environmental impact. Researchers have shown that text-to-image generators are significantly worse, environmentally speaking, than text generators, and creating a single AI image can take roughly as much energy as fully charging a smartphone. For another thing, new text-to-video generation technologies will likely hurt the video creator economy, because why should companies pay people to make visual content when all that's necessary to create a video is clicking a button?

As such, Sora feels like a big "fuck you" to reality, but it's also a fuck you to tons of other stuff too. As far as the corporate class in this country goes, nothing really matters except money. Fuck the environment, fuck artists, fuck a disinformation-free internet, fuck the health of political discourse, fuck anything that gets in the way of the profit motive. Anything that can be squeezed to make money should be squeezed, even if it's a software program whose only real utility is that it can generate a video of a cowboy hamster riding a dragon. As one X user put it: "This is what the morons sacrifice the environment for. Stupid. Shit. Like. This."