What does the role of the Producer working in Virtual Production entail? Part II

Dr. Jodi Nelson-Tabor
12 min read · Jun 14, 2021


Written by Dr Lindsay Keith

On location at the Stockwell Street Studios at the University of Greenwich for principal photography on How to Be Good #virtualproduction

Earlier, in Part 1 of this strand, we looked at the role of the Producer on a virtual production. The role of a Line Producer in film (similar to that of a Production Manager in broadcast) is to ensure that the logistics and operations at the heart of a production run smoothly: to oil the wheels of a production and keep the plates spinning (often many, many plates!) so that the production team can work together to create the vision for the film.

Part of this job is working creatively and supportively with writers, directors and producers to ensure that the production framework is optimal for the best output. When a project comes together there are many players with skin in the game: the writer has a script with a particular focus, and the director works with this to bring it to life visually. If all projects were multi-million dollar productions where, for example, kit choices and locations were no object, this might be straightforward. But the simple fact is that all productions (yes, even those multi-million dollar ones) will have the budget pushed and pulled in different directions across different departments. Part of the role of the line producer is managing the elasticity of the budget so that different visions can align to make one creative whole.

I like to think of budgets like bungee cords. When you measure a bungee cord under no strain it has a certain length to it, but it's slack. As the financial load from different departments pulls on the cord, it lengthens and accommodates them (a bungee cord under tension is the safest way to secure a valuable package), but if you put too much strain on it, eventually it snaps, and that's no good to anybody.

In traditional filmmaking there is a maxim that whatever you remove from preparation (including previz) and pre-production, you will only have to add back when trying to "fix it in post-production". The same is largely true for VP (Virtual Production). "Final pixel", the idea that if you're shooting VP you don't need any post-production at all, is still only an ideal, and may only ever be an ideal. For sure, you can create your environment and VFX in previz, but for now there is still much to be done in post-production to smooth things out.

One of the most important discussions for a line producer to have early in production is defining your parameters. What is desirable for the production? What is optimal? And what is necessary? Once you have those parameters, you have room to play and can stretch and bend towards and away from different demands in the way that best serves the production. A quote from Erik Winquist, VFX supervisor at Weta Digital, comes to mind and is definitely worth remembering for virtual productions: "Don't get too hung up on any one particular technique." Just because you CAN produce something virtually doesn't mean you SHOULD if it's not the best for the film (or the budget).[1]

There is no point in hiring a massive LED studio to film a scene that you can film more cheaply and easily (and potentially with more authenticity) in real life using traditional methods.

It harks back to Jeff Goldblum's most famous line in Jurassic Park: "Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should." Goldblum's character Ian Malcolm is talking about ethical considerations, but for your virtual production the consideration has to be about the budget: your literal bottom line. If you run out of budget, you run out of film, and the responsibility for that shouldn't be underestimated.

Script breakdown for How to Be Good

The first process we went through, once we had a script and a budget, was to break down the script line by line. For us this meant tagging a script, saved in Google Drive and shared amongst our remote teams, with a comment whenever it mentioned an asset that would be required. This is the same as for a traditional production, except that you're counting digital assets as well. During this process you also figure out which items will be practical effects or props on set and which will be inserted as digital assets. Aside from fencing panels, our biggest prop was a massive door that formed the entrance to the virtual railway wagon. The real prop, a painted studio flat, had to match the virtual railway wagon and door exactly, as our two characters would meet inside the wagon and both had to pass through the opening.
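To make the idea concrete, here is a toy sketch of turning a tagged script into a categorised asset shopping list. The inline tag syntax ([PROP: …] and so on) is a hypothetical stand-in, as our tags actually lived in Google Drive comments, but the sorting principle is the same.

```python
# Toy sketch: collect tagged assets from a script into a shopping list.
# The [KIND: item] tag syntax is an invented stand-in for our
# Google Drive comments.
import re
from collections import defaultdict

TAG = re.compile(r"\[(PROP|DIGITAL|SFX)\s*:\s*([^\]]+)\]")

script = """
INT. RAILWAY WAGON - DAY
Lily scratches her name on the wagon door. [PROP: wagon door flat]
Beyond the fence [DIGITAL: construction fencing] the yard is empty.
She drags a panel aside. [PROP: fence panels]
"""

shopping_list = defaultdict(set)
for kind, item in TAG.findall(script):
    shopping_list[kind].add(item.strip())

# Each category feeds a line in the budget spreadsheet.
for kind, items in sorted(shopping_list.items()):
    print(f"{kind}: {sorted(items)}")
```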

It was important to the script that our protagonist, Lily, scratched her name on the door, but this would have been time-consuming and difficult to achieve virtually. All credit is due to the Head of the Art Department, Alison Cross, whose wagon door, painted on a studio flat, matched the virtual wagon seamlessly.

It became clear early on that our budget would not stretch to hiring a digital artist to create the original environment we desired, so we purchased an off-the-shelf, ready-made one from the Unreal Engine Marketplace. The tagged script was then translated into a script-breakdown stripboard, and then, for me, into another spreadsheet that served as my shopping list for production and fed directly into the budget.

Extract from project budget

The script called for a fence as an important feature; in this case, for authenticity, we found a "digital fence" that accurately represented the real-life fencing common on the perimeter of construction sites. The blend between the real fencing and the digital version was pretty seamless in the virtual environment. Going through this process, together with a kit wish list from the director, allowed me to balance the budget back and forth. Securing real fence panels to use as props, on loan from a nearby building site, helped the budget considerably!

A more formal version of the bungee cord analogy is the producer's triangle, which is the same for traditional filmmaking as it is for virtual production. Your parameters are the schedule, the budget and the scope, where the "scope" is what you will shoot, how much you will shoot, and what you need to enable that shoot. As with traditional filmmaking, kit and crew will be a large part of your budget.

Producer’s Triangle

For the kind of VP we were doing on #H2BG, this was a hybrid green screen shoot, which meant shooting actors in a green screen studio while being able to see on a screen, in real time, how they looked in the virtual environment.

Hybrid Green Screen Studio
Playback on a monitor allowed for real-time review after each shot.

Differences between VP and Traditional Productions

Where the differences come in is the additional kit required to support filming for a virtual environment. First, you need to be able to track where the camera is in relation to the actors and the environment, so that when the footage is placed into the virtual environment, that environment moves in relation to the camera in exactly the same way as it would if you were filming in a real location. Take, for example, this shot from Citizen Kane. At the start of the shot the camera is further away and you can see a lot of the background; by the end it has moved closer in. In VP the games engine has to do this work, and it can only simulate this background movement accurately with lots of information about how the camera moves in real life.

In order to do this, the exact position of the camera needs to be tracked and fed into the games engine. There are several proprietary systems that enable this; for our production it was the StarTracker system by Mo-Sys Engineering.

I recall visiting Mo-Sys about ten years ago, when this hardware was in its infancy. Their prototype consisted of a webcam gaffer-taped to the top of the camera, pointing at the ceiling, on which they had stuck an array of hundreds of silver stickers in a random pattern. Of course, the hardware today is much more sophisticated, but the principle is the same.

Putting up the stickers (for the camera tracking)

The camera-tracking camera records the sticker pattern on the ceiling (which has been mapped into the games engine) as the main camera moves. It can then triangulate, from the way it moves in relation to the stickers, exactly where the camera is in the studio, and feed that information into the games engine, which makes the virtual environment move correctly in response. Of course, it also needs detailed information about what camera is being used, what lens is being used, its focal length and so on. It's also not a perfect workflow, as we discovered not long before shooting…

Mo-Sys StarTracker system
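The Mo-Sys implementation is proprietary, but the underlying geometry can be illustrated with a minimal sketch using OpenCV's solvePnP, which recovers a camera pose from known 3D points and their observed 2D projections. The marker map, pixel detections and camera intrinsics below are all invented stand-ins, not StarTracker data.

```python
# Minimal sketch of pose recovery from known ceiling markers.
# All numbers are invented stand-ins, not Mo-Sys StarTracker data.
import numpy as np
import cv2

# Surveyed 3D positions of ceiling "stars" in studio space (metres);
# in a real system this map is measured once and stored.
marker_map = np.array([
    [0.0, 0.0, 3.2], [0.5, 0.1, 3.2], [0.2, 0.8, 3.2],
    [0.9, 0.6, 3.2], [1.3, 0.2, 3.2], [1.1, 0.9, 3.2],
], dtype=np.float64)

# Hypothetical pixel coordinates where the upward-facing tracking
# camera sees those same markers in the current frame.
detections = np.array([
    [640.0, 360.0], [820.0, 395.0], [700.0, 610.0],
    [930.0, 540.0], [1080.0, 420.0], [1010.0, 660.0],
], dtype=np.float64)

# Intrinsics of the tracking camera from its own one-off calibration;
# assume lens distortion has already been corrected.
K = np.array([[900.0, 0.0, 640.0],
              [0.0, 900.0, 360.0],
              [0.0, 0.0, 1.0]])

ok, rvec, tvec = cv2.solvePnP(marker_map, detections, K, np.zeros(5))
if ok:
    R, _ = cv2.Rodrigues(rvec)       # rotation, world -> camera
    cam_pos = (-R.T @ tvec).ravel()  # camera position in studio space
    print("Camera position (m):", cam_pos)
```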

Mounting the Mo-Sys sticker array of "stars" in the studio took several hours: close to a day's worth of time that needs to be protected and accounted for in the production schedule.

Another key element of VP that differs from a traditional shoot is scheduling. A traditional schedule is usually devised around locations; films are rarely shot sequentially, because not capturing everything you need from one location in one go means paying the unnecessary extra cost of returning to the same place. In VP, the equivalent of a "location change" is a lens change. Because the camera-tracking system, while highly sophisticated, is not perfect, each individual lens, once fitted to the camera, must be calibrated by moving markers around the studio and "teaching" the system how this spatial information is mapped. Lenses, even expensive prime lenses, are not optically perfect, and without a highly detailed calibration the games engine would receive only limited information and the virtual environment would begin to "drift" in relation to the actors.

What we had been unaware of until shortly before the scheduled shoot (of only two days' duration) was that the initial calibration of a lens can take three to five hours (this was done with our two zoom lenses prior to the start of production). Even once calibration is complete, it isn't possible to simply swap lenses on set: each swap requires a recalibration or correction pass to realign the tracking system, which can take up to three hours depending on the lens. The other key point is that we used zoom lenses, not primes, for our entire shoot, and zooms can take longer to calibrate than primes.
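For a flavour of what a calibration pass involves, here is a minimal generic sketch using OpenCV's chessboard routine. This is emphatically not the Mo-Sys/StarTracker lens-calibration workflow; it simply shows the same underlying idea of observing a known pattern from many views to fit a mathematical model of the lens.

```python
# Generic lens-calibration sketch with OpenCV's chessboard routine;
# not the StarTracker workflow, just the same underlying principle.
import glob
import numpy as np
import cv2

pattern = (9, 6)  # inner corners of the calibration chessboard

# 3D corner coordinates in the board's own plane (z = 0).
board = np.zeros((pattern[0] * pattern[1], 3), np.float32)
board[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("calib_frames/*.png"):  # hypothetical frame grabs
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(board)
        img_points.append(corners)

assert obj_points, "no usable calibration frames found"

# Fit the camera matrix and distortion coefficients that best explain
# every view; more (and more varied) views give a better model.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print(f"RMS reprojection error: {rms:.3f} px")
```

A zoom lens makes this far more laborious because the lens model changes with focal length, so the process has to be repeated across the zoom range; that is part of why our calibrations ran to hours rather than minutes.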

So, all of a sudden, we found ourselves with a two-day shoot that needed to absorb up to 15 hours of potential downtime for calibration (i.e. two zoom-lens calibrations in previz, plus correction calibration during principal photography). It also meant we couldn't simply shoot part of a scene on one lens and then swap over to another for a different shot or angle. This was quite a fly in the ointment for a short shoot, as you can imagine!

At this point, therefore, we had to do some quick thinking and spin yet more plates. We considered an additional camera (which we had) so that we could have a calibrated lens on each, but this would have required a second camera-tracking system (which we did not have). So we rejigged our schedule and shooting order to account for the correction passes when swapping lenses, reduced the number of lenses we wanted to use from three to two, arranged early access to the studio before the shoot to calibrate the first lens, and planned some late-night working after Day 1 of previz to calibrate the second. For the director this was not ideal, but it was the best compromise available to us to deal with this unexpected intrusion into the shooting schedule, and definitely a big lesson learned in shooting VP.

Camera-tracking system aside, as the line producer I needed to understand exactly what other hardware we needed and how it would fit together, to make sure that on the day we'd have everything required. Broadly speaking, this diagram shows the elements required to shoot hybrid green screen VP the way we did.

In this case, the camera was a RED Weapon with Arri prime lenses (however, as noted, we used two different zoom lenses), mounted on a Ronin rig with an Easyrig Vario 5 and the Mo-Sys StarTracker system. The keyer was the Blackmagic Ultimatte, linked to three Blackmagic recorders: one recording the actors, one the key channel, and one the virtual environment (the off-the-shelf environment purchased from a digital asset store), which played into the Blackmagic system from Unreal Engine.
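As a rough way to reason about the routing (a simplified sketch, not a definitive wiring diagram), the chain can be written down as data. The device names follow the kit listed above; the connections are my simplified assumptions.

```python
# Simplified sketch of our hybrid green screen signal chain.
# Device names follow the kit listed above; routing is assumed.
signal_chain = {
    "RED Weapon": ["Ultimatte"],        # live actor feed
    "StarTracker": ["Unreal Engine"],   # camera pose data
    "Unreal Engine": ["Ultimatte", "Recorder 3 (environment)"],
    "Ultimatte": ["Recorder 1 (actors)",
                  "Recorder 2 (key channel)",
                  "On-set monitor (live composite)"],
}

def downstream(node, chain, seen=None):
    """Every device a given source ultimately feeds."""
    seen = set() if seen is None else seen
    for nxt in chain.get(node, []):
        if nxt not in seen:
            seen.add(nxt)
            downstream(nxt, chain, seen)
    return seen

# Sanity-check what the main camera's signal ultimately reaches.
print(sorted(downstream("RED Weapon", signal_chain)))
```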

Data Wrangling & Management

One of the final aspects not to be underestimated is storage for recording and data wrangling. As we were recording three different footage tracks, we generated a lot of footage. For this micro short film, we procured a RAID drive as a master back-up storage system, two portable rugged hard drives for editing, and six SATA drives for the Blackmagic recording system. So there was a significant expenditure on storage alone. One final note of caution: file-naming discipline. It doesn't matter HOW you name the files you will use and generate, but it DOES matter that they are all saved according to a strict file-naming convention. VP generates many, many digital files, and if you're not disciplined about how they are named and about the file and folder structure, you can come horribly unstuck. Don't risk it, and if in doubt, ask an editor!
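As a simple illustration of that discipline, here is a minimal sketch that audits an ingest folder against a naming convention. The pattern shown (project, scene, shot, take, channel) is a hypothetical example, not the convention we actually used; the point is that some strict convention is checked before files hit the master storage.

```python
# Minimal sketch: audit an ingest folder against a naming convention.
# The convention here (project_scene_shot_take_channel) is hypothetical.
import re
from pathlib import Path

PATTERN = re.compile(
    r"^H2BG_sc\d{2}_sh\d{2}_tk\d{2}_(actors|key|env)\.(mov|braw)$")

def audit(folder: str) -> list[str]:
    """Return the names of files that break the convention."""
    return sorted(p.name for p in Path(folder).iterdir()
                  if p.is_file() and not PATTERN.match(p.name))

offenders = audit("ingest/day01")  # hypothetical ingest folder
if offenders:
    print("Rename before backing up to the RAID:", offenders)
```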

Once the equipment was in place, the shoot proceeded much like any other traditional shoot, albeit with several new teams in place for the integration of the real-time engines. Being able to see the actors in the virtual environment on a screen in real time was amazing, and definitely a bonus for the creative process in VP. Even though we knew there would be some VFX work to do in post, we had a very clear idea of what the requirements were, as we aimed for as close to final pixel as possible on set.

Dr Lindsay Keith, BSc, PhD, is a Research Fellow at the University of Greenwich. She works in public engagement, with a focus on widening participation and increasing diversity in Film/TV & STEAM. @smashfestUK

Next week, we will discuss the role of the DoP/Cinematographer, covering the wide array of tasks and skills needed, as well as the experience on set.

A team of filmmakers and academics at the University of Greenwich has created a micro-short film entitled 'How To Be Good', in collaboration with industry leaders at StoryFutures Academy and Mo-Sys Engineering, to explore and document workflows in virtual production. In the first article of the series, principal investigator Dr Jodi Nelson-Tabor discusses what virtual production means in different contexts and how producing 'How To Be Good' sheds important light on how VP can be managed and harnessed to create films that would otherwise be cost-prohibitive and complex to shoot.


Follow the series/thread at #H2BG on Twitter @GREdesignSchool.

[1] https://cdn2.unrealengine.com/vp-field-guide-v1-3-01-f0bce45b6319.pdf


Dr. Jodi Nelson-Tabor

Dr Jodi Nelson-Tabor is the Business Development and Training Manager for Final Pixel.