Written by Walter Stabb
The micro-short How To Be Good was conceived as an exploration of Virtual Production (VP) that could inform and develop an understanding of how academics, practitioners and students can engage with and harness this new filmmaking technology in their projects. With finite resources, successful student filmmakers aim for a ‘feature’ aesthetic, filling gaps with enthusiasm, ingenuity and creativity. Within this context the project was devised as an experiment with the ambition of final-pixel capture on-set. This is based on an acknowledgement that the vast post-production resource deployed to smooth the edges on high-profile VP productions is not realistic for student projects, yet experience and understanding of VP seems an ever more crucial educational step for students.
Logistical, budgetary and pedagogical concerns meant that the decision was made to shoot How To Be Good as a hybrid virtual production, a process that uses “camera tracking to composite green screen cinematography with CG elements” (Kadner, p.16). The Virtual Production Field Guide distinguishes between two common methods in this approach: Real-time hybrid virtual production and Post-produced hybrid virtual production. The Real-time approach is primarily associated with live-to-air composites and studio broadcasts seen in news and entertainment formats rather than drama production. Post-produced hybrid virtual production typically treats the live composite as a proxy-resolution reference to be used on-set and in the offline edit as a way of understanding how the final film will look at the end of the post-production process. The challenge was to see how much of our micro-short could be considered captured final-pixel, and what we would learn about the process by approaching the project this way. It also raises a question: if the majority of shots demand compositing again, what value does the hybrid virtual production approach bring beyond a conventional green-screen shooting process?
My introduction to hybrid VP began with a virtual scouting expedition to our filming location alongside other HoDs. As editor, this provided the rare opportunity to input into the Director and DoP’s discussions on storyboarding from within the location, and to help develop a storyboard resource that would be used on-set by the various heads of department to navigate the production phase.
Mo-Sys supported the shoot of How To Be Good with on-set tracking and rendering managed by two VP technicians, one of whom took primary responsibility for the real-time keying process using Blackmagic Design’s Ultimatte hardware. Working on-set as editor, in partnership with VFX artist Richard Oldfield, I had scope to analyse and input into the real-time keying process through discussion with the Mo-Sys team. This dialogue, and the ability to review and manipulate the composite on-set in step with adjustments to lighting and production design, helped to identify and resolve issues that in a conventional green-screen process might not have revealed themselves until later, demanding the traditional ‘fix it in post’ response.
Using the Ultimatte allowed for some colour correction prior to creating the real-time composite. This preparatory work, manipulating the luma and RGB levels through adjustment layers to balance the elements of the composite (the foreground camera feed and the Unreal background), was intended to facilitate correction and grading in post rather than to replace it. The result was composites that appeared well balanced and coherent, ready for further colour work in post.
The relatively small studio size, lighting challenges and, critically, imperfections in the green-screen backdrop meant that, whilst best efforts were made to achieve a clean key, the majority of the wide shots remained imperfect, requiring new composites to be created in post-production. The hybrid VP process had identified these issues effectively on-set, but production and schedule constraints meant they were still considered best resolved with post-production clean-up.
The Mo-Sys VP technicians created three media streams for each take: a foreground camera feed, a matte and a real-time composite. Each stream was captured at 1080p in ProRes 422 HQ and recorded to its own dedicated 500 GB SATA SSD. Whilst recording at 4K resolution was an option, under guidance from Mo-Sys we chose to remain at Full HD as a compromise between image quality and recording stability.
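To give a sense of why one dedicated drive per stream is a comfortable choice, a back-of-envelope calculation can be sketched, assuming Apple's published target of roughly 184 Mbit/s for ProRes 422 HQ at 1920×1080, 25 fps (the exact rate on this production may have differed with frame rate and content):

```python
# Back-of-envelope recording capacity for one 500 GB SSD per stream.
# Assumes ~184 Mbit/s for ProRes 422 HQ at 1080p25 (Apple's published
# target rate); higher frame rates raise the data rate proportionally.

PRORES_HQ_1080P25_MBPS = 184   # megabits per second (approximate)
DRIVE_CAPACITY_GB = 500        # marketed capacity, decimal gigabytes

bytes_per_second = PRORES_HQ_1080P25_MBPS * 1_000_000 / 8
drive_bytes = DRIVE_CAPACITY_GB * 1_000_000_000

recordable_seconds = drive_bytes / bytes_per_second
recordable_hours = recordable_seconds / 3600

print(f"~{recordable_hours:.1f} hours per stream per drive")  # ~6.0 hours
```

At roughly six hours of material per drive, a full shooting day per stream fits with headroom, which makes the three-drive arrangement a practical safeguard against dropped frames rather than a capacity constraint.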
Other articles reflecting on this micro-short identify the huge benefit that on-set monitoring and review of the real-time composite offer to different departments in the hybrid VP process. For the editor, the real benefits of hybrid VP become apparent after offload and ingest of the media via SSD dock. The real-time composites created during shooting are immediately available as high-fidelity representations of the final shot, so Jean Ciuntu (Assistant Editor) and I could begin to review the shots and edit scenes almost immediately, with vastly more understanding of the material than would be possible working with the foreground green-screen shot alone. This allowed for quick feedback to the director and camera department, with the opportunity to capture pick-up shots and consider upcoming set-ups on a day-by-day basis, confident in having reviewed real-time composite shots within edited sequences.
At this stage, given the complications achieving a clean key on-set as revealed in the real-time composites, it was apparent that we were now working in a Post-produced hybrid VP model in which a considerable number of the shots would need post-production work to improve the quality of the green-screen composite. Within the offline edit, ‘Dailies’ sequences were created featuring the ‘location’ sound recording and three video tracks (the camera foreground, matte and real-time composite) synchronised and stacked, allowing quick review of action across the different recordings and preparing for the re-compositing and clean-up of shots once the offline edit was picture locked. As an aside, it is worth noting that these elements, whilst sharing a synchronised timecode, were not exactly aligned: the camera foreground recording was six frames behind the composite, which required some manual adjustment, lining up to the clapperboard.
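The fixed six-frame slip is the kind of offset that can be compensated mechanically once measured. As a minimal sketch (the function names and the 25 fps base are assumptions; the six-frame figure is the delay observed on this production), the correction amounts to converting timecode to frames, subtracting the offset, and converting back:

```python
# Compensating a fixed frame offset between synchronised streams.
# Assumes 25 fps non-drop timecode; the six-frame slip is the delay
# observed between the camera foreground and the real-time composite.

FPS = 25
FOREGROUND_DELAY_FRAMES = 6

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert HH:MM:SS:FF timecode to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_tc(frames: int, fps: int = FPS) -> str:
    """Convert an absolute frame count back to HH:MM:SS:FF."""
    ss, ff = divmod(frames, fps)
    mm, ss = divmod(ss, 60)
    hh, mm = divmod(mm, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

def aligned_foreground_start(composite_start_tc: str) -> str:
    """Slip the foreground earlier by the measured delay so action lines up."""
    return frames_to_tc(tc_to_frames(composite_start_tc) - FOREGROUND_DELAY_FRAMES)

print(aligned_foreground_start("01:00:00:05"))  # 00:59:59:24
```

In practice we performed this slip by eye against the clapperboard in the NLE, but the arithmetic is the same: one stream is moved by a constant number of frames relative to the others.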
The offline edit progressed using the real-time composites, which were of a high enough standard to allow Jodi Nelson-Tabor (Writer/Producer/Co-Director) and me to make creative decisions on performance and narrative without the conceptual struggle of working with green-screen-only foreground footage in the selects or sequence. This was a huge advantage and supports the argument that when working with VP “knowledge is power, and for an editor, the closer a frame looks to its final form, the better.” (Kadner, p.55)
How To Be Good’s opening sequence was storyboarded as a VFX aerial shot providing a bird’s-eye view descent into the location. The initial plan was to build this with a variety of still and moving image assets using traditional composite techniques in Adobe After Effects. However, in collaboration with Drew MacQuarrie (Unreal Engine Artist), we experimented with creating these opening shots as fully Unreal Engine-generated aerial and crane shots that circled our asset from above and introduced our lead character in extreme wide. Working in dialogue with Drew to create these and the film’s other fully Unreal Engine-generated shots was fascinating and demonstrated to me the huge potential in the synergy between game engine technology and filmmaking practice. The flexibility and speed with which Drew was able to create and adjust shots, responding to ideas and edits shared with him using the language and tools of filmmaking within his virtual environment, was exciting in its creative potential for the edit.
Once picture lock was achieved, handover from Adobe Premiere Pro into Adobe After Effects meant pre-compositions could be made that replicated the synchronised stack of media streams, and work could be done addressing green spill and problems with the green-screen key. Conventional post-production techniques such as chroma keying, rotoscoping, masking and spill suppression were used to address issues that had not been adequately resolved on-set and undermined the final-pixel ambition. These shots were then supplied back to the offline edit, either with alpha channels or as full composites depending on the fix required, before the sequence was round-tripped from Adobe Premiere Pro to DaVinci Resolve for the final grade.
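The value of the recorded matte in this clean-up work comes down to the classic “over” operation: the matte supplies a per-pixel alpha that blends the corrected foreground over a background. A minimal pure-Python sketch of the underlying arithmetic, using single normalised pixels for clarity (real tools apply this across full frames, often with premultiplied alpha):

```python
# The "over" compositing operation driven by a matte value.
# Pixels are (R, G, B) tuples normalised to 0.0-1.0; matte is the
# per-pixel alpha recorded alongside the foreground and composite.

def composite_over(fg, bg, matte):
    """Blend a foreground pixel over a background pixel using the matte as alpha."""
    return tuple(f * matte + b * (1.0 - matte) for f, b in zip(fg, bg))

fg = (0.8, 0.2, 0.1)  # foreground pixel (e.g. keyed actor)
bg = (0.0, 0.5, 0.9)  # background pixel (e.g. Unreal render)

# A fully opaque matte keeps the foreground pixel unchanged...
print(composite_over(fg, bg, 1.0))  # (0.8, 0.2, 0.1)
# ...while a half-transparent matte mixes the two sources.
print(composite_over(fg, bg, 0.5))
```

Re-compositing in post then becomes a matter of improving either the matte (rotoscoping, spill suppression) or the sources it blends, rather than redoing the key from scratch.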
How To Be Good offered a multitude of learning experiences and brought into focus the massive potential of Virtual Production for drama shorts and micro-shorts set in film worlds that rely on VFX for their creation. Our initial aim of final-pixel capture through the hybrid virtual production model was ambitious and, on reflection, unachievable. Sixty-one of our 71 shots were re-composited in some form during post-production to address issues with the image that were considered too problematic to remain in the final film. The provision of mattes recorded during the real-time compositing process made much of this work simpler, with perhaps the greater challenge coming in re-generating and syncing Unreal Engine backgrounds using the camera data captured on-set.
The number of reworked shots clearly conveys that Post-produced hybrid virtual production is the method that filmmakers and students will need to plan for in short drama projects using these new technologies. There is a complexity and challenge to finishing in the virtual production method that perhaps undermines the notion that, with these new technologies, the vast majority of the work has moved earlier in the film production workflow. We relied on expert knowledge of conventional green-screen and post-production clean-up techniques to complete the film, which also raises questions as to how this technology can be effectively incorporated into student-led projects. However, for the editor and director searching for creative freedom and flexibility in the offline edit stage, working immediately with high-fidelity reference composites means hybrid virtual production technology offers capability well beyond the conventional green-screen process. This is a major benefit, offering huge potential for filmmakers and storytellers embracing these new tools.
Kadner, Noah. “The Virtual Production Field Guide (v1.3).” www.unrealengine.com, 2019, https://cdn2.unrealengine.com/vp-field-guide-v1-3-01-f0bce45b6319.pdf. Accessed 2021.
Walter Stabb is a Senior Lecturer in Post-Production Editing at the University of Greenwich. He is an editor, filmmaker and academic who joined the University of Greenwich in 2016, teaching on the Film and TV Production and Digital Filmmaking courses at undergraduate and postgraduate level.
He has worked on projects screened by the BBC, HBO America, global advertising agencies, cultural institutions and at festivals internationally.
In addition to his professional work as a freelance editor and filmmaker with Sweet Take Studio, he holds a first-class degree from the University of Nottingham and wrote an MRes thesis for the London Consortium on documentary film, trauma and animation. He is an ongoing Research Associate and former Head of Post-Production at The Derek Jarman Lab, School of Arts, Birkbeck, where he helped to create The Seasons in Quincy (2016) and Europe Endless (2019).
A team of filmmakers and academics at the University of Greenwich have created a micro-short film entitled How To Be Good, in collaboration with industry leaders at StoryFutures Academy and Mo-Sys Engineering, to explore and document workflows in virtual production. In this article in the series, principal investigator Dr Jodi Nelson-Tabor discusses what virtual production means in different contexts and how producing How To Be Good sheds an important light on how VP can be managed and harnessed to create films that would otherwise be cost-prohibitive and complex to shoot.
Follow the series/thread at #H2BG on Twitter @GREdesignSchool.