Virtual Production (VP) is ramping up globally. What do you think VP involves?
Written by Dr Jodi Nelson-Tabor
Emerging technologies and digital disruption have impacted the film industry in ways that are pushing the boundaries of creativity and technical innovation. But is it the magic bullet the media makes it out to be? Or is it merely the next evolution of filmmaking? Against the backdrop of a global pandemic, which brought the film industry (like many industries) to a complete halt, many studios found ways to overcome the challenges that Covid-19 brought, such as social distancing and restrictions on travel. Studio productions such as Disney’s The Lion King and Disney+’s The Mandalorian have both revolutionised Virtual Production techniques and brought its emergence to the forefront of industry change. But what is Virtual Production exactly?
While there are many definitions of Virtual Production (VP), Epic Games, whose Unreal Engine plays a major role in the expansion and integration of game-engine software into these new filmmaking approaches, states that it is a “broad term referring to a spectrum of computer-aided production and visualisation filmmaking methods”. Weta Digital says it is “where the physical and digital worlds meet”, and MPC (Moving Picture Company) adds more technical detail, stating that “VP combines virtual and augmented reality with CGI and game-engine technologies to enable production crews to see their scenes unfold as they are composed and captured on set”.1
The recent Future of Film Report (2021) defined VP as “a spectrum of computer-aided production and visualisation filmmaking tools and methods”. Though there is currently no single standardised, agreed definition of VP, the report loosely defines it as “bridging the gap between virtual and physical filmmaking. The principle of this is, in some respects, an evolution of ‘rear screen’ projection which has been used in filmmaking since the 1940s. However, using Real Time Game Engine (RTGE) technology (such as Unreal Engine), these backdrops are rendered as 3D digital environments that can be manipulated by the filmmaking team in real time”.2
To explore this emerging technology, a research project entitled “How to Be Good: A Case Study for Virtual Production Workflows” was developed in Spring 2021 in the Stockwell Street Studios at the University of Greenwich School of Design. In collaboration with the funder Storyfutures Academy and commercial hardware partner Mo-Sys Labs, a micro-short VP film project was undertaken by myself and colleagues across the Film & TV Department (School of Design), the Games Department (School of Computing and Mathematical Sciences) and the Department of Drama, Theatre & Performance Research Group (School of Humanities) to develop and experience Virtual Production as a hybrid green-screen production. More importantly, the case study would allow us to understand the production workflows that enable a VP project to be constructed and executed, as there are many differences from traditional filmmaking that merit being explored.
While there are many different types of Virtual Production (Hybrid, LED Wall, Real-World/Live-Action Capture Hybrid and Full Animation), it is important to understand that each of these VP scenarios requires differing levels of expertise, knowledge, resources, access and understanding. Beyond the immense learning curve in the technology and its application, the bigger shift is perhaps in the filmmaking process itself, which has been ingrained in practice, pedagogy and practice-based training for many years as a waterfall process, or linear workflow. VP workflows, by contrast, are built on iteration: the act of refining a process over successive attempts to arrive at a desired result. In traditional production, you start with the script as a road map, or blueprint, which is then developed further by the other creative departments in sequence to prepare it for production. VP instead uses an iterative process in which all departments, even post-production, are involved in developing the project during the pre-production phase. The goal is to arrive at the film’s “final pixel” — the final output of the film — during pre-production so it can be executed and output during production, rather than during post-production using VFX techniques (though this still happens). Unreal Engine describes virtual production as being like parallel processing: it removes the barriers between live production and visual effects so that they can occur simultaneously instead of in sequence.
Beyond the shift to iterative workflows being a major change in filmmaking execution, there are many benefits as well as challenges to VP, as we discovered in our project, which are important to share here.
Probably the most important take-away is that when shooting a VP production, 80% of the film is made in pre-production using this iterative workflow. However, VP is not a magic bullet, and it suits certain genres and levels of production. Very few productions will have a Disney-level budget to scale things up to the LED walls or fully animated pipelines of The Mandalorian or The Lion King. But there are smaller and more conventional setups, such as the green-screen Hybrid model we used and smaller LED configurations, that can deliver the same level of quality on smaller stages.
A few key benefits4 of note are as follows:
- Natural Scene Lighting — when using LED screens, the ambient light they emit means the correct scene lighting is created automatically, whereas green screens require particular lighting in order to work. Note, though, that subject-to-screen distance in LED Wall shooting, as well as moiré and parallax issues, can still be problematic in both approaches.
- Complete Design Freedom — using an LED volume means there’s no need to compromise on design for elements like hairstyles, costumes or props. Working in the green-screen environment, we had to account for green spill and pulling the key.
- Full Environmental Control — photogrammetry and 3D scanning enable the virtualisation of real-world sets and environments to create whatever scene the filmmaker wants. Using Unreal Engine assets (via the Marketplace) as the backdrop of the film’s digital environment gave us full control of our locations, even though we shot inside a studio, and it was more cost-effective than shooting on location or recreating it through expensive production design.
- Minimising Costs and Avoiding Budget Blow-Outs — using VP techniques can help producers budget with greater clarity. In the past, unforeseen elements such as weather, unavailable locations or even a green-screen effect failing to work as intended could cause a project to exceed its planned budget. Even in our project there were unforeseen costs that, in retrospect, need further scrutiny in the planning phases. But the larger costs, such as reshoots, heavy post-production rendering and replicating outdoor landscapes, were avoided through the use of these new technologies and workflows.
- Seeing the Finished Project in Real Time — by combining LED screens, camera-tracking technology and real-time render engines, directors and actors are able to see what they’re creating play back in real time, adjusting as they go. This was certainly a benefit even when shooting on green screen, as the real-time playback was instrumental as a feedback loop for the entire crew.
These benefits emphasise the growing value and importance of Virtual Production, particularly in terms of market reach, money and time. Dominating the media, and with Covid-19 acting as an accelerant, it has pushed technology and creativity in the film industry into a disruptive new era of filmmaking. Tim Webber5, Chief Creative Officer at Framestore in London, says that VP is changing the creative process of storytelling and “re-engineering storytelling as we know it”. Other factors of critical importance6 to adopting VP include the growth of remote, collaborative workflows in post-production; planning in the digital arena; powerful gains in flexibility and efficiency (effectively you can do more, cheaper, quicker and indeed better) once you have committed to moving the production pipeline to a virtual or semi-virtual one; and a next generation of software that will make that process even more instinctive and intuitive. Other positives include a reduced environmental impact and the removal of geographical restrictions on talent.
There are challenges ahead, however, in both technical and creative fields to address the major skills gaps. With the proliferation of new content in the pipeline after the ‘Covid drought’, there is an increased demand for these new skills as teams begin to understand the benefits of virtual production. Many individual companies and academic institutions (like the University of Greenwich) are responding accordingly. Epic Games, ScreenSkills, Storyfutures Academy, the National Film and Television School and the BFI are just a few organisations leading the way in bringing new skills-based training programmes to the forefront. In a recent roundtable with DCMS (the Department for Digital, Culture, Media & Sport), several global organisations across the filmmaking spectrum identified four key areas in the virtual production pipeline that need to be addressed: pre-visualisation and virtual scouting; building virtual production studios, including the likes of LED screens, camera encoding and motion capture; running virtual production stages; and creating the digital content to put onto the on-set LED walls.
Over the next few weeks, a series of articles will discuss the iteration processes of our research project through its various creative departments. As each department engages with virtual production technology and creativity in different ways, it’s important to deconstruct our VP project workflows to better understand the iterative, practice-led approach within the green-screen hybrid environment, and how this new technology impacts creativity and skills-based learning for the next generation of filmmakers currently entering the market.
A team of filmmakers and academics at the University of Greenwich has created a micro-short film entitled How To Be Good, in collaboration with industry leaders at Storyfutures Academy and Mo-Sys Engineering, to explore and document workflows in virtual production. In this first article of the series, principal investigator Dr Jodi Nelson-Tabor discusses what virtual production means in different contexts and how producing How To Be Good sheds important light on how VP can be managed and harnessed to create films that would otherwise be cost-prohibitive and complex to shoot.
Follow the series/thread at #H2BG on Twitter @GREdesignSchool.
5. RTC 2021, Real Time Conference