Avatar: The Way of Water – How gaming tech helped bring an Oscar nominee to life and could change filmmaking forever

From Lara Croft’s Indiana Jones-esque adventures to the increasingly Pixar-quality cartoon visuals of Super Mario, video games have long looked to Hollywood for inspiration.

But recent years have shown that the relationship is becoming increasingly transactional.

These days you don't have to look far to find a movie or TV series based on a popular video game (The Last Of Us and Sonic The Hedgehog are just two, and Mario himself will be in cinemas soon), but the relationship goes a lot deeper than you might think.

"These worlds have been converging for a decade," said Allan Poore, senior vice president at Unity, a video game development platform increasingly being used in filmmaking.

“In most cases, the core principles are actually the same.”

In fact, modern video games look so good that the technology behind them has completely changed the way blockbusters are made — including the biggest of them all.

Avatar: The Way Of Water was easily the highest-grossing film of 2022 – fittingly so, since it's the sequel to the highest-grossing film of all time.

James Cameron's latest blockbuster is up for the best picture Oscar on Sunday – and success in technical categories like visual effects seems all but certain.

Avatar: The Way Of Water. Image: Twentieth Century Studios

The Technology Behind Avatar

Many of the tools used to bring The Way Of Water to life come from Unity's Weta Digital division.

Unity acquired the technology assets of Weta, the New Zealand-based visual effects company founded by Lord Of The Rings director Peter Jackson, in 2021 for around $1.6bn (he still owns a separate – somewhat confusingly named – company called WetaFX, a more traditional visual effects firm that also worked on the Avatar sequel's production).

What Unity's deal did was bring a team of talented engineers under the umbrella of a company known for its accessible video game engines. Think of a game engine as a recipe kit – it contains everything you need to make a game. Some are designed to help build specific types of games, like shooters or sports games, while others are more general.

Unity has been used for everything from indie games to the Call Of Duty and Pokemon franchises.

Jackson said the fusion of expertise, dubbed Weta Digital, would be a "game changer" for creators.

Video games work because the worlds the player explores are rendered in real time. That's because games can have different outcomes based on player actions – nothing is set in stone like a film or TV show. Think of the scene in The Wrong Trousers where Gromit frantically lays the track just ahead of the train he's riding along, and you get the idea.

This is very different from how traditional movies approach visual effects, where the rendering all happens in post-production – which is why you see behind-the-scenes shots of actors standing in big green rooms or talking to tennis balls on sticks. All the computer magic is done after the fact.

James Cameron on the set of Avatar, which featured underwater motion capture. Image: 20th Century Studios

“How do you speed up filmmaking?”

While The Way Of Water still relied heavily on those techniques, part of the production was powered by new real-time technology, letting Cameron and his cast and crew see an approximation of the finished product as they worked on set.

"How do you speed up filmmaking? You do that by showing artists and directors what that frame looks like as quickly as possible," said Poore, who worked on the hit animated films Ratatouille, The Incredibles 2, and Coco during his time at Pixar.

“Directors will use a screen that actually shows the live components so they can see what the scene and surroundings will look like while shooting.

“Hopefully they will help make filmmaking smoother, easier and faster.”

With Avatar 3 due in less than two years, rather than after another 13-year gap like the one between the first two films, that assessment looks likely to be correct.

A galaxy far, far away…

Competitors to Unity are also looking to tap into filmmaking with lifelike real-time visuals, and in some cases go even further.

The Mandalorian, the popular Star Wars series which returns for a third season this month, uses an immersive soundstage called The Volume to place actors in any fantastical scenario a writer can dream up.


The Mandalorian. Image: Disney+
Obi-Wan Kenobi and The Mandalorian used gaming technology in their virtual sets. Image: Lucasfilm

Rather than relying entirely on green screens, with effects added during post-production, The Volume features a massive wall of screens that display digital environments in real time, built using Epic's Unreal game engine (which powers the popular shooter Fortnite).

This means actors know where their characters should be and can make changes on the fly.

Two recent comic book movies also used it – last year's The Batman and last month's third Ant-Man film.


Star Wars actor Ewan McGregor worked on The Volume when he returned to the franchise last year, and praised it as a transformative influence compared with the films he made 20 years ago.

"With so much blue screen and green screen, it's hard to believe in it when there's nothing there," he said. "Here [on Obi-Wan Kenobi] we were on this stunning set – if you're shooting in the desert, everywhere you look is desert, and if you're flying through space, the stars fly past you. It's so cool."


Ewan McGregor returned as the titular Obi-Wan Kenobi and praised the technology's impact on filming. Image: Lucasfilm

“It’s a huge change”

While Poore doesn't see traditional digital effects techniques disappearing anytime soon, he believes the idea of a "virtual production space" that can generate visual effects on the fly will only grow.

At the UK’s National Film and Television School, there’s an entire course dedicated to this.

Ian Murphy, head of the school's masters in visual effects, said: "The really exciting change is that it takes post-production, which used to sit firmly at the end of the process, and gets us involved from the start.

"VFX people are very tech-savvy, but it has pushed them to have conversations with the production designers and cinematographers on set – that's been a huge change.

Students at the National Film and Television School get hands-on experience of virtual production. Image: NFTS

"If you're shooting on green screen, you end up having these really strange, stilted conversations. The technology is changing pretty quickly. They might not be the finished images – there's still visual effects work to do – but what you capture is a kind of blueprint that takes you into full production.

"Now, the image you get from the game engine … the trajectory is definitely moving towards that ending up being the actual image that people see in the cinema."

We’ve definitely come a long way from Pong.

You can watch the Oscars in the UK on Sunday 12 March at 11pm, exclusively on Sky News and Sky Showcase. Plus, get all the intel from our Oscars special of the Backstage podcast from Monday morning, wherever you get your podcasts.
