When you consider the effect game engines will have on moviemaking, one term that might spring to mind is ‘flattening’.
Not having to wait for render farms to spit out shots to review (and change – or not, when you’ve finally run out of time or budget) means the production is ‘live’ from day one. It blurs the lines between pre, post and everything in between, flattening the whole workflow into one long creative input party.
Realtime VFX rendering in game engines actually seems like the perfect way to design and build a CGI world for cinema screens. Where a single frame in a movie has traditionally taken hours or days of data crunching time, games have to do it all in milliseconds as a player navigates and looks around an entire virtual world, doing it constantly with every frame refresh.
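To put that difference in concrete terms, here is a small illustrative calculation. The frame rates (24 fps for cinema, 60 fps for games) and the four-hour offline render are common industry ballpark figures, not numbers from this article:

```python
# Illustrative arithmetic only: the per-frame time budget of a real-time
# game engine versus an offline render farm. 24 fps (cinema) and 60 fps
# (games) are the commonly cited rates; the 4-hour offline frame is a
# hypothetical ballpark, not a figure from the article.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at the given rate."""
    return 1000.0 / fps

film_budget = frame_budget_ms(24)   # ~41.7 ms if cinema frames ran in real time
game_budget = frame_budget_ms(60)   # ~16.7 ms per refresh for a 60 fps game

# Expressed in the same units, a 4-hour offline frame dwarfs that budget.
offline_frame_ms = 4 * 60 * 60 * 1000   # 14,400,000 ms
ratio = offline_frame_ms / game_budget  # roughly 860,000x the real-time budget
```

The point of the comparison is simply that a game engine has to produce an acceptable image in a few milliseconds, every single refresh, where offline rendering has hours per frame to spend.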
Deploying a game engine-defined universe completely upends the VFX workflow where a character designer hands a shot off to a lighting artist, who then hands it off to a colour grader, etc. In fact it flattens the whole pipeline so much it’s not unusual for shots constructed for previz to find their way into final frames.
It’s actually a bit like the old way of doing things where you build a set, dress and design characters and then figure out the best way to light it and shoot the action in it. The only difference is it’s all virtual.
Game engines have the potential to do away with the notion of upstream/downstream VFX workflows completely. Think of working with far-flung colleagues in a Google Docs or Office 365 file, everyone’s work assimilated and saved every time a change is made.
Building the environment of your movie – maybe just the set of one scene – lets everyone dive in and do their part concurrently, all of it rendered out in real-time just like playing a video game. It solves what Tom Box, co-founder and general manager of London animation studio Blue Zoo, calls ‘a big conveyor belt of stages where any stoppage causes a bottleneck further down the line.’
“When you’re looking at a preview monitor or in some cases seeing things on set in real-time, you’re able to make all the creative judgments at the same time,” adds Rob Bredow, Executive Creative Director and Head of Industrial Light & Magic (ILM).
Systems with benefits
A good example of having the real environment available to everyone comes from the Millennium Falcon flight sequences ILM built for Solo: A Star Wars Story. A rear projection movie screen was set up in front of the cockpit set so the actors were actually looking at the full resolution render of the jump to hyperspace, affecting and heightening their performances (https://www.youtube.com/watch?v=vLluW2VqHgA).
The advantage was what Bredow calls ‘better representation earlier in the creative process.’ If the Solo actors had been looking at a green screen for ILM to paint out and insert the VFX sequence later it would have been fine (and it’s the way we’ve been doing VFX for 30 years) but having the final render right there changed and even influenced the shots themselves.
For example, instead of cutting from the actors to the iconic hyperspace starfield over their heads, as every other Star Wars film has done, the camera panned slowly from the view of space outside onto actor Alden Ehrenreich’s face. “You could actually see the reflection of hyperspace in [his] eye,” Bredow says, “you get a different response when you have more elements available.”
Extrapolate that further and you can surround the actors not with green screens but projections of the final render of the world around them as they work. As Kim Libreri, CTO of Unreal Engine publisher Epic Games, puts it: “Instead of animating shots in a car chase you can drive a virtual car around a scene, filming the action just like you would in the real world but iterating and reshooting it as many times as you need.”
As you can imagine, it stands to make the whole process quicker and cheaper, but to most creatives it’s about the potential for better storytelling. Box says he’s seen productions go from eight versions of a shot per day to as many as 100 using realtime technologies.
Libreri agrees, saying it will simply make moviemaking ‘better’ rather than just cheaper – especially with the capability to integrate technology like performance capture and VR exploration directly into the Unreal Engine.
But Isabelle Riva, Head of Made with Unity at game engine publisher Unity, has also seen quantifiable results from realtime VFX, saying short films made with Unity have seen time savings of up to 50 percent.
Then there’s the data portability. Make a hit superhero movie these days and a lot of sequels and licensed products from the video game tie-in to the Netflix series are a certainty.
You could spend a fortune on every project engaging a new crop of VFX vendors whose only reference material is the finished movie. Or you could simply export your game engine data – the landscapes, weather behaviour, populations of characters and every other detail – and share it with the relevant company, saving them (and your client) a bundle.
For a company like ILM, sitting on 40 years of materials and data about Star Wars worlds and characters, such a system is critical – and the company has its own framework to maintain and share it.
But even at a studio like Blue Zoo there’s time and effort to be saved. Box talks about several TV projects the company has worked on that have attendant game titles. The workflow calls for two character rigs suited to each output, the original animation simply exported and mapped to each one. In a multi-platform world, Box and his team can find ‘massive’ efficiencies.
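The rig-mapping workflow Box describes can be sketched in a few lines. Everything here is hypothetical for illustration – the joint names, the mapping dictionaries and the `retarget` helper are invented; real pipelines do this with retargeting tools inside the DCC application or engine:

```python
# A minimal sketch of authoring animation once and mapping it onto two
# per-platform character rigs (e.g. a film-resolution rig and a game
# rig). All names and data structures here are hypothetical.

# Animation authored against the master rig: joint name -> keyframes,
# each keyframe a (frame, rotation-in-degrees) pair.
master_anim = {
    "spine_01": [(0, 0.0), (12, 15.0), (24, 0.0)],
    "arm_l":    [(0, 0.0), (24, 90.0)],
}

# One joint-name map per output rig.
film_rig_map = {"spine_01": "Spine1_hi", "arm_l": "LeftArm_hi"}
game_rig_map = {"spine_01": "spine1_lo", "arm_l": "l_arm_lo"}

def retarget(anim: dict, joint_map: dict) -> dict:
    """Re-key the animation curves onto a target rig's joint names."""
    return {joint_map[j]: keys for j, keys in anim.items() if j in joint_map}

film_anim = retarget(master_anim, film_rig_map)  # curves keyed to the film rig
game_anim = retarget(master_anim, game_rig_map)  # same curves, game rig names
```

The efficiency Box points to comes from the fact that the animation data itself is authored once; only the lightweight mapping differs per platform.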
Beyond that, the benefits are nearly endless. Sketch out dangerous stunt sequences (‘stuntviz’) to make sure they match what the director wants before committing resources to setting them up – 2015’s Mad Max: Fury Road used the process extensively (https://www.unrealengine.com/en-US/blog/virtual-production-lights-cameras-action). Let a director walk through the fully rendered world in VR to find the shots they want, just like a gamer. And much more.
Good enough for…
The big question mark looming over all this is whether game engines produce high enough visual quality for cinema. Despite the jaw-dropping graphics in big name videogames, they’re still not quite as photoreal as cinema VFX.
Except that (depending what you read) they are. Epic Games claims Unreal Engine can not only deliver what it calls ‘the right visual fidelity to match the aesthetic style of your creative vision’, it lets you capture footage in formats right up to 8K.
And plenty of VFX companies have already dipped their toes into game engine waters on feature films. In Rogue One: A Star Wars Story, as many as 12 shots of the heroic reprogrammed droid K-2SO (played in motion capture by Alan Tudyk) were done using real-time VFX and composited into scenes afterwards.
Riva points to a bevy of award-winning short films made in the Unity engine, and a story on production technology website Redshark as long ago as 2014 said we could ‘nearly’ make movies with game engines (https://www.redsharknews.com/technology/item/1638-with-unreal-engine-4-0,-we-re-closing-in-on-making-films-with-video-games-engines).
Tom Box says we’re now at that crossroads. “The technology is matching what you traditionally take hours to render for the first time because of the increase in what the graphics cards can do,” he says.
There might be a few final frontiers left, like realistic humans and water effects, but it’s all coming to a very exciting head this year: the 2019 Unreal Engine release will include real-time ray tracing, something Box calls the ‘holy grail’ and which new-generation GPUs will make mincemeat of. Epic uploaded the amusing short film Reflections last March to showcase what would soon be possible (https://www.youtube.com/watch?time_continue=59&v=J3ue35ago3Y).
But it’s also true that as technology improves, we ask more of it. Bredow says the limitations today are one of the things he’s been concerned with for about five years. “Take simple surface shading, which we can do very well on the GPU today,” he says. “The inner reflections and complicated shadowing we expect in a photoreal modern feature film are still pretty challenging.”
Not if, when
So when might we see the world’s first global blockbuster made completely with real-time VFX? Bredow doesn’t think it’s years away anymore (but he admits he’s been saying that for 10 years), and Libreri says we’ll see ‘tremendous’ adoption over the next five years.
Inasmuch as the industry as a whole can have collective intent, at the moment it’s just to let the technology slowly catch up to and surpass existing and established workflows as people see the benefits.
But whenever it happens, game engines are set to enjoy a second life in a market where their influence might meet or eclipse what they’ve had so far in our game consoles. As Riva says, “From a publishing standpoint films and games are apples and oranges, but from an authoring and technical perspective they’ve been converging for a long time. In film, we want to revolutionise the creative process by making a complex production chain simple.”