
AI tools like Midjourney could change the way movies are made

Images generated by the AI platform Midjourney.
Artwork: Johnny Weiss/Midjourney

There’s a new Knives Out movie on Netflix, and I still haven’t seen a few of this season’s awards contenders. But the movie I’d most like to see right now is Invasion of the Squid From the Depths. It’s a sci-fi thriller directed by John Carpenter about a team of scientists, led by Sigourney Weaver, who discover an alien cephalopod and are then picked off one by one by its tentacles. The production design was inspired by Alien and The Thing; there are handmade creature effects and lots of gore; Wilford Brimley has a cameo. Unfortunately, I can’t see this movie, and neither can you, because it doesn’t exist.

For now, Squid Invasion is just a concept-art portfolio conjured by a redditor using Midjourney, an artificial-intelligence tool that creates images from human-provided text prompts. Midjourney was released in public beta over the summer, and for months it spat out mostly visual gibberish. “I was trying to do a picture of Joe Rogan fighting a chimpanzee, and he just looked like nightmare fuel,” says the Reddit user, TooManlySnail, real name Johnny Weiss. Then, in November, the software was upgraded to version four. It began effortlessly translating complicated prompts (“DVD screen grab, John Carpenter 80s horror movie, alien squid attacking a horrified Sigourney Weaver, blood everywhere, extra wide shot, outstanding cinematography, 16 mm”) into imaginary stills that look good enough to be real. Some of them look better than anything in Hollywood’s current lineup: weirder, more vividly composed, seemingly less computer-generated, even though they’re entirely computer-generated.

Soon, Hollywood could be in direct competition with generative AI tools, which, unlike self-driving cars and other long-promised technologies that never quite arrive, are already here and improving rapidly. Meta and Google have announced software that converts text prompts into short videos; another tool, Phenaki, can do entire scenes. None of these video generators has been released to the public yet, but the company D-ID offers an AI app that can make the people in still photos blink and read from a script, and some users are employing it to animate characters created by Midjourney. “In the next few years,” says Matthew Kershaw, D-ID’s vice president of marketing and growth, “we could easily see a major movie made almost entirely with AI.” One day, instead of scouring our Rokus for something to watch, we could greenlight our own entertainment by feeding loglines to algorithms capable of creating feature films with sophisticated plots, blockbuster effects, and human actors from any era.

One obstacle to this future is that fancy user prompts are no substitute for good scripts. Someone (or something) has to tell the video generators what to generate for two hours. But progress is being made on that front, too: it turns out that ChatGPT, the new AI chatbot capable of writing code, college essays, and instructional rap songs about how to change your motor oil, is also an aspiring screenwriter.

With Weiss’s permission, I asked ChatGPT to develop a plot for Squid Invasion. I described the concept images and told it to create an outline for the film, which I’ll summarize: In a remote ocean research lab, scientists discover a species of alien squid that is hyperintelligent and can regenerate its body after injury. The squids escape from their containment tanks and kill several researchers. The humans fight back with guns and other weapons, but that only makes the squids angrier. The scientists destroy the lab with a reactor explosion that they hope will also kill the squids. The film ends with the survivors celebrating their narrow escape and mourning their colleagues.

It may not pack in many narrative surprises or subvert genre conventions, but it does suggest that ChatGPT grasps basic story logic in a way that eludes many humans. It even, at my request, came up with a decent ending: another alien race contacts the survivors and reveals that the squids were a peaceful, misunderstood species.

What ChatGPT can’t do yet is write an actual script. The software that powers most current AI language generators can only handle about 1,500 words of text at a time, which makes it difficult to produce coherent longer works. But after many failed attempts, I got ChatGPT to write part of Squid Invasion’s first scene.

Dr. Samantha Carter

These squids are amazing.

Dr. James Jones

Yeah, they’re definitely something. But we have to be careful. These deep sea creatures can be dangerous.

Dr. Mike Smith

I agree. We need to study them carefully and make sure they don’t pose a threat.

Dr. Carter

Oh no! The squids are attacking!

Dr. Jones

Grab the flamethrower.

These lines are bad. But they’re not so bad that I can’t imagine them being delivered in a perfectly enjoyable Gerard Butler movie. AI may never be Robert Towne, but with next-generation language bots due next year, the writers of Black Adam should be nervous.

Some have argued that AI tools aren’t as smart as they seem, that they’re incapable of original thought and can only remix their training material. That may hold them back in some fields. But in Hollywood, superficial riffs on preexisting intellectual property are a valuable and lucrative skill. Some of the most acclaimed films of 2022, including Top Gun: Maverick and Elvis, have the hermetically nostalgic tinge of AI creations.

A few filmmakers have already adopted the technology for certain applications. Director Scott Mann used machine learning in his 2022 thriller Fall to modify the actors’ mouths and scrub out the swearing, avoiding an R rating. The technology was also used in next year’s Indiana Jones and the Dial of Destiny to make Harrison Ford, 80, look 45. South Park creators Trey Parker and Matt Stone recently landed a $20 million investment for their new startup, Deep Voodoo, an entertainment studio that will provide low-cost deepfake visual effects. And for James Cameron’s Avatar: The Way of Water, FX studio Weta deployed AI to give the Na’vi characters realistic facial muscles that move in concert. “In previous systems, if we wanted to change a character’s smile, we had to go in and move all the pieces around, and it was a lot of work to keep it from looking rubbery,” says Joe Letteri, Weta’s senior visual-effects supervisor. “This got us to a natural place much earlier.” Letteri doesn’t expect AI to generate Avatar movies on its own, at least not anytime soon: “We had 1,600 VFX artists working on this movie and another 1,600 live-action people. We worked on it for five years. You’re not going to get that from a one-line prompt.”

But Hollywood agencies and law firms are preparing for a future in which clients like Weaver might be unwittingly thrown into a redditor’s fever dream. “These tools are exciting, but what’s most important to us is that the companies behind them respect talent and gain consent for names, images, and likenesses,” says Joanna Popper, CAA’s chief metaverse officer. “We want to protect creators so they have the ability to monetize their work and images, and others can’t exploit them.”

AI generators could ban the names of non-consenting artists as user prompts. But that wouldn’t change the fact that many tools have already been trained on those artists’ work. The reason Squid Invasion nails the sci-fi aesthetic of the late ’70s and early ’80s is that Midjourney’s training data likely includes stills from real movies of that era, among millions of other copyrighted images. “We’re talking about software that learns from content but doesn’t necessarily present the content it learned from,” says Jeffrey Neuburger, an intellectual-property attorney at Proskauer Rose LLP. “So who owns the copyright for the work it creates? It raises issues of fair use and also publicity rights. It’s one of those situations where the law is going to have to catch up with new technology.”

In other words, we need to study these tools carefully and make sure they don’t pose a threat. Grab the flamethrower.





