What does the role of the Director in VP entail?

Dr. Jodi Nelson-Tabor
9 min read · Jul 5, 2021


Written by Dr Jodi Nelson-Tabor & Peter Bathurst

Writer/Producer/Co-Director Jodi Nelson-Tabor directs actor Grace Enever on the set of How to Be Good

A Director’s role is to set the artistic direction of the production and guide their Heads of Department (HoDs) on how to achieve it. Alongside deciding on shots and angles, directors oversee casting, set design, and the musical score. Viewed through the traditional lens, the director’s role is to:

  • Interpret scripts
  • Set the tone of the film
  • Work with department heads
  • Work with casting directors to find talent
  • Direct actors and the camera
  • Work with editors to assemble the film
  • Work with sound and music departments

In Virtual Production (VP) this is no different. However, because the pipeline (or workflow) is an iterative, agile progression rather than the linear waterfall process of traditional filmmaking, there are far more opportunities to collaborate and to build creative efficiencies. VP also removes obstacles that, before real-time technologies, prevented the director from having full control over visualisation and its outcome until post-production.

As VP blurs the line between pre-production, production, and post, this non-linear, collaborative, and efficient process can be used to overcome a number of practical, creative, and budgetary constraints. With VP, the workflow moves from a “fix it in post” to a “solve it in prep” approach. It empowers the director to both experiment and visualise shots with immediate feedback when using real-time technologies.

“I definitely think virtual production is the future of filmmaking. For VFX, the larger promise of virtual production for me is about returning to the discipline of planning and committing to a vision early, rather than deferring decisions to fix it in post. The more we can see live and share in a common artistic vision either on set, or in pre-production, the more effectively the post teams can execute on that.”

— David McDonnell, Co-founder, Last Pixel

A key component of the director’s role in VP (whether hybrid-green screen or Volume LED) is the ability to quickly sequence shots with the background already in view. Viewing playback on an on-set monitor, either as a composited shot against green screen or as the virtual environment in Volume LED, gives the director a much greater sense and feel for the shot as a whole. This is far more efficient than imagining what the shot is supposed to look like from a non-composited green screen take.
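To make the idea of that on-set composite concrete, the sketch below shows the simplest possible version of what such a preview does: a basic chroma key in Python with OpenCV. The file names and threshold values are illustrative assumptions only, not the camera-tracked, real-time pipeline used on H2BG.

```python
# Minimal, illustrative green-screen composite (assumes OpenCV and NumPy).
# A real on-set system keys and composites live inside the engine, with far
# better spill suppression and edge handling than this sketch.
import cv2
import numpy as np

def composite(foreground_bgr, background_bgr):
    """Replace green-screen pixels in the foreground plate with the virtual background."""
    hsv = cv2.cvtColor(foreground_bgr, cv2.COLOR_BGR2HSV)
    # Pixels in this hue/saturation/value range are treated as green screen.
    green = cv2.inRange(hsv, np.array([40, 60, 60]), np.array([80, 255, 255]))
    mask = cv2.merge([green, green, green]).astype(np.float32) / 255.0  # 1 = background
    h, w = foreground_bgr.shape[:2]
    bg = cv2.resize(background_bgr, (w, h)).astype(np.float32)
    out = foreground_bgr.astype(np.float32) * (1.0 - mask) + bg * mask
    return out.astype(np.uint8)

# Hypothetical file names, purely for illustration:
frame = cv2.imread("greenscreen_frame.png")
virtual_bg = cv2.imread("unreal_background.png")
cv2.imwrite("composite_preview.png", composite(frame, virtual_bg))
```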

Working with the editor in previs (pre-production) also affords the director the opportunity to envision the shot in the edit, as a sequence and close to final pixel. Additionally, seeing the subjects in the virtual environment allows the director and DoP to plan and envision the lighting on the subject within the environment; this is so adjustments can be made in the moment. Using game engines like Unreal and Unity, combined with high-power graphics cards, camera tracking, as well as VR and AR, directors are now able to create scenes across the physical and digital worlds.
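As a rough sketch of the mechanism (illustrative Python only, not the actual Mo-Sys/Unreal integration used on set), tracking data from the physical camera is applied to a matching virtual camera every frame, so the background always moves in correct perspective with the real lens:

```python
# Illustrative only: each frame, the tracked pose of the physical studio camera
# drives a matching virtual camera, so the CG background moves in correct
# perspective. Real systems (e.g. Mo-Sys tracking into Unreal Engine) do this
# inside the engine with calibrated lens data.
from dataclasses import dataclass

@dataclass
class CameraPose:
    x: float; y: float; z: float            # position in metres
    pan: float; tilt: float; roll: float    # rotation in degrees
    focal_length_mm: float

class VirtualCamera:
    def __init__(self, stage_origin=(0.0, 0.0, 0.0)):
        # Where the physical studio floor sits inside the virtual world.
        self.stage_origin = stage_origin
        self.pose = None

    def update_from_tracking(self, tracked: CameraPose) -> None:
        ox, oy, oz = self.stage_origin
        self.pose = CameraPose(tracked.x + ox, tracked.y + oy, tracked.z + oz,
                               tracked.pan, tracked.tilt, tracked.roll,
                               tracked.focal_length_mm)

# Per-frame update (values are hypothetical; tracking data would arrive live):
virtual_cam = VirtualCamera(stage_origin=(120.0, 0.0, -45.0))
virtual_cam.update_from_tracking(CameraPose(1.2, 1.6, 0.3, 15.0, -2.0, 0.0, 35.0))
```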

Previs: The Director’s POV
On our micro-short film production ‘How to Be Good’ (H2BG) we worked in a hybrid-green screen environment using Unreal Engine (UE) assets from the marketplace. And even though these virtual assets depicted different ‘virtual locations’ from those written in the original script, the HoDs were able to adapt the environment and production design elements to fit the narrative rather than design a new UE asset, which would have been cost prohibitive.

Snapshot of our storyboard demonstrating the detail of each shot, with the visuals created in Unreal Engine
Editor & VFX specialist Walter Stabb and Co-Director/DoP Peter Bathurst review shots 3 & 4 as the composited shot in on-set playback

Story & Visual Development — The Storyboards
Like a traditional art department, the virtual art department (VAD) is focused on shot design, layout, visual development, and creating production-ready digital assets to be seen on screen. The VAD works in close collaboration with the director, cinematographer, production designers, and traditional art departments in order to ensure the digital worlds integrate seamlessly with the vision of all the creative stakeholders.

This description, taken from The Virtual Production Field Guide compiled by Epic Games, helps to further explain the role of a VAD: “Think of the VAD as a transitional team bridging the work of a more traditional art department and a regular previs department via the advent of real-time animation.”

According to DNeg, today’s Virtual Production processes leverage advances in computing power, VFX expertise and game-engine technology to:

  • create environments that now interact with the live-action, allowing filmmakers to make real-time decisions and changes
  • create backgrounds which move in full perspective with respect to the camera’s position, updating in real-time as the camera moves
  • allow the production to move from one convincingly detailed and photo-real environment to another without leaving the stage
  • allow final shots to be captured in camera, without the need to replace the backgrounds in post-production
The Storyboards for How to Be Good production with shot details and visuals created in Unreal Engine

In effect, ‘virtual’ locations from all over the world can be brought onto the VP stage with compelling photo-realism through the use of real-time technology. Taking advantage of these technologies, on H2BG we used the virtual locations and visualisation tools in UE to build our storyboard, mapping the story elements and breakdown from the script.

We used this visual board on-set to help the entire crew stay on track, as well as to have a reference point from which to pivot and make changes where necessary. All of the necessary information was included for each shot: a screenshot of the UE shot, camera and lens selection, shot description, scene description, characters in the scene, and, where necessary, lighting and other visual instructions. This was absolutely key to our success working as a creative and technical unit during production, and it allowed everyone to access the information that informed their specific tasks and roles on-set.
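For illustration, one entry on that board could be expressed as a simple data structure like the sketch below. The field names mirror the information listed above, but the example values are hypothetical; our actual board was a visual document, not code.

```python
# Hypothetical representation of one storyboard entry; not a real production schema.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class StoryboardShot:
    shot_number: int
    ue_screenshot: str                 # path to the Unreal Engine frame grab
    camera: str                        # e.g. "Camera A"
    lens: str                          # e.g. "35mm"
    shot_description: str
    scene_description: str
    characters: List[str] = field(default_factory=list)
    lighting_notes: Optional[str] = None

# Example values are invented for illustration:
shot_3 = StoryboardShot(
    shot_number=3,
    ue_screenshot="boards/shot_03.png",
    camera="Camera A",
    lens="35mm",
    shot_description="Lily walks toward camera through the ruined street",
    scene_description="Post-apocalyptic exterior, late afternoon",
    characters=["Lily"],
    lighting_notes="Key from screen left to match UE sun position",
)
```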

Principal Photography — the Director’s Vision
In principal photography, the VP process shares many similarities with conventional studio-based filmmaking. An easy way of understanding the differences is to consider the VP element (whether hybrid-green screen or Volume LED) as you would a real-life location, and to prepare to interact with it in the same way. How will it (the environment/location) look and how will it behave? How can the actors interact with it (camera/environment) to make our story compelling? If it is a real-life castle, for instance, what props can we add and how will we light it to look real and best tell our story?

For example, watch director HaZ Dulull’s short VP film ‘Percival’ to see how bringing in virtual locations and adding real-life props and set design in the foreground can achieve this visualisation.

Behind the scenes of Percival

https://www.youtube.com/watch?v=D42t3DUQcGg

In VP, a lot of the lighting and set dressing has already taken place during the design and build of the virtual world within Unreal Engine. As principal photography gets underway, the same attention is now placed on the foreground and mid-ground as has been given to the background. The better you can anticipate how actors will (or won’t) interact with the virtual world, the better you can judge what you will physically need to ‘sell’ the idea of the space to the audience.

Main character, Lily, in the hybrid-green screen environment with real fencing
In the composite with Unreal Engine background asset

Real objects (i.e. props and sets) help carry a sense of genuine visual depth, and understanding how to stage this for both the real camera and its virtual counterpart is key to the success of the capture.

Main character Tom, in his home in the hybrid-green screen environment
In the composite with Unreal Engine background asset (which will still need VFX in post to adjust the key)

Much of this isn’t simply about the pursuit of ‘final pixel’; rather, as with all VFX, it’s about understanding how the best results will be achieved in terms of quality and budget. As has always been the case, the better prepared you are in previs (where 80% of the film is made), the better the results will be during principal photography.

There is opportunity to adapt and adjust the virtual world while shooting, but of course these changes take time, and the conversation is still about whether that time is best spent in principal photography or in post. What is new is the unparalleled opportunity for directors to experience and explore the CG environments in previs, rather than waiting until post.

Spatial Issues in VP
For H2BG, the main issue was our studio space. One of the key elements of the story is Lily’s physical journey. As she walks through her post-apocalyptic environment, which was designed with scope and scale in mind, she can only really move as far as our studio allows.

Stockwell Street Studio at University of Greenwich

It is a relatively tight space, and as we were shooting in a hybrid-green screen capture format, the lighting only allows the subject to work within a couple of metres of the green screen itself. The action needs to take place under our (12-foot) light box, so the ability to move the camera with a walking subject was ultimately limited to a few seconds or a few paces.

The solution to this problem was to employ a travelator and move the virtual background with the static pacing of the actor. This was yet another example of how the foreground needs to marry with the background and how new solutions need to be found to overcome the virtual nature of what the camera is seeing.
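In principle the maths behind this is simple (a sketch under our assumptions, not the production rig): the actor stays in place on the travelator while the virtual world is translated past the camera at the matching walking speed, in the opposite direction.

```python
# Illustrative sketch: with the actor walking in place on a travelator, the
# virtual background is moved past the camera at the same speed, in the
# opposite direction, so the composite reads as a continuous walk.

def background_offset(walking_speed_mps: float, elapsed_s: float) -> float:
    """Distance in metres to translate the virtual world against the walk direction."""
    return -walking_speed_mps * elapsed_s

# Hypothetical example: a 1.4 m/s walking pace over a 10-second take.
for t in (0.0, 2.5, 5.0, 10.0):
    print(f"t = {t:4.1f}s  background moved {background_offset(1.4, t):+5.1f} m")
```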

Actors Connor and Grace shoot the final scene in the film, which is captured in two separate takes and will be ‘stitched’ together in post

It’s interesting to note that even in a Volume LED setup, where lighting a cove or green screen is not a consideration, the functional distance from the background is still only around 2 metres before pixels become visible; seeing the floor is still a challenge, and so, by extension, is the horizon line.
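As a rough guide (an industry rule of thumb, not a measurement from this production), the minimum distance at which an LED wall stops resolving into individual pixels is often taken to be roughly its pixel pitch in millimetres read as metres: a wall with a pitch of around 2 mm, for instance, wants roughly 2 metres of separation, which squares with the working limit described above.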

Clip of the shot with main character Lily walking on the travelator, with notes to the editor/VFX team for a further render of the UE background.

https://youtu.be/wgwhs_hp-ug

Ultimately, harmonising the real world with the virtual is key to the success of VP on set. Despite inevitable technical challenges, this is still a huge step forward from the VFX and capture approaches that have preceded it. As McDonnell observes above, this new technological leap heralds a potential return to the best practice of making creative decisions in advance of, and during, the shooting process.

A team of filmmakers and academics at the University of Greenwich have created a micro-short film entitled How To Be Good, in collaboration with industry leaders at StoryFutures Academy and Mo-Sys Engineering, to explore and document workflows in virtual production. In this first article of a series, principal investigator Dr Jodi Nelson-Tabor discusses what virtual production means in different contexts and how producing How To Be Good sheds important light on how VP can be managed and harnessed to create films that would otherwise be cost prohibitive and complex to shoot.

To follow the series, click on the following: 1, 2, 3, 4, 5, 6, 7, 8, 9

Follow the series/thread at #H2BG on Twitter @GREdesignSchool.


Dr. Jodi Nelson-Tabor

Dr Jodi Nelson-Tabor is the Business Development and Training Manager for Final Pixel.