January 31, 2023

In-Camera VFX | How Industry Experts are Using ICVFX

In-camera VFX (ICVFX) and LED wall virtual production methods — sometimes also referred to as on-set virtual production — are being used to create some of the most cutting-edge media today. 

Global creative studio Final Pixel has been using in-camera VFX and LED walls since 2020. Read on to learn more about their execution of these virtual production methods from their CEO and Co-Founder, Michael McKenna. 

First, let’s cover some basics: 


What Is In-Camera VFX? 

In-camera VFX (ICVFX) refers to effects that are captured while filming, rather than being added later in post-production.  

In modern filmmaking, in-camera VFX usually refers to the process of rendering 3D environments on large LED walls and filming actors and props on a physical set in front of them. This method results in a photorealistic scene that can be viewed and filmed directly in-camera, without the need for green screen replacement later. 

In-camera effects have a long history, achieved with mirrors, painted glass, forced perspective, and other practical techniques. Now, LED walls are revolutionizing virtual production.

How Does In-Camera VFX with an LED Wall Work?

At a high level, virtual worlds are created by artists and designers prior to a shoot. These scenes can be based on real locations or be completely fictional. Then, they're rendered in real time on a large LED display that forms the backdrop of the set. The set itself can contain any number of physical and live elements.

Some well-known shows and films that have used this type of production include Westworld, Game of Thrones, 1899, Joker, The Batman, Bullet Train, and many more.

Final Pixel crew in front of LED wall

Why Use In-Camera VFX with an LED Wall in Virtual Production? 

The primary advantage of using in-camera VFX and LED walls for virtual production is that the scene you capture on camera is very close to final. The traditional approach involves filming actors in front of a green screen and adding the background and visual effects in post-production.

With LED walls, the background is displayed behind the actors and physical set. This allows the director and set crew to see the final scene in real time, rather than having to rely on their imagination. The real-time capabilities this method offers make it popular with content producers for the following reasons.

More Creative Flexibility 

With game engines and in-camera VFX, your art team can edit the scene displayed on the LED wall in real-time. Backgrounds and locations can be tweaked to fit the director’s exact requirements. Don’t like that building where it is? Move it in a matter of moments. The weather isn’t quite right for the mood? Change it! Virtual worlds allow directors to have complete control. 

Location Options are Limitless 

No longer will set locations be limited by time, budget, permits, or reality. Shooting with a virtual world backdrop means location options are only limited by the creator’s imagination.  

From a single stage, you can produce scenes in any number of locations. This helps you maximize your on-screen talent’s time and ensure you stay on budget. It also helps reduce your travel budget! Plus, every hour is “golden hour” — you can get the perfect lighting from any angle, any time of day, all day. 

Enhanced Realism 

Using real-time rendering on-set combines the best of green screen and practical filming. You can be in any environment — even fictional ones — and still cast realistic lighting and reflections on the foreground. If the lighting on the LED wall changes, it shows up on the actors’ faces. This is essential when filming reflective surfaces like cars, suits of armor, or any scene with glass in it. Creating those reflections in post-production can be incredibly difficult, time-consuming, and expensive. 

You Get More from Your On-Screen Talent 

Visual effects and virtual worlds have been commonplace in film and television for decades. However, the need to capture the live performance first and then add in virtual aspects later led to one major problem: actors couldn’t see what they were interacting with. 

Working with LED walls has all but eliminated this problem. They allow actors to interact with the environment more organically, resulting in a more realistic and higher-quality scene. That means actors can nail the scene in fewer takes, saving teams money. 

Want to Learn More About In-Camera VFX? 

Become a pro at virtual production with free, expert-led training. Sign up for Perforce U College of Virtual Production.  

✨ START LEARNING ✨ 

Interview with Michael McKenna, CEO and Co-Founder of Final Pixel

After more than 15 years in the television industry, and in the midst of the pandemic, former BBC Studios executive Michael McKenna founded global creative studio Final Pixel. His team works with producers to create revolutionary work using virtual production technology like in-camera VFX and LED walls. Let’s dive into his thoughts on the current and future state of this tech, the possibilities it opens for content creators, and how to start taking advantage of it. 

Q: When Did You First Learn About Virtual Production? 

“The Mandalorian” included a number of behind-the-scenes videos on Disney+. I watched them in awe of the technology Jon Favreau and the team from Industrial Light & Magic put together. They had created a method of shooting that could place actors in any environment imaginable, lit by the environment itself: giant LED screens projecting a photorealistic world built with 3D modeling in Unreal Engine. As they moved the camera, the world moved in sync. You could look around corners into virtual worlds. It blew my mind.

It was also blowing the mind of my brother, Chris McKenna (Creative Director of Final Pixel), in Los Angeles. It came up on one of our weekly family Zoom calls. We were both hooked.

Q: What Was Your First Experience with Using In-Camera VFX (ICVFX)? 

Chris and I teamed up with Monica Hinden (Executive Producer, Final Pixel) and funded two demos in autumn 2020, one in the UK and one in Los Angeles. We brought together a team from all over and partnered with some local companies for our first foray into virtual production.

We tried to create realistic-looking sets that could be used as a replacement for location shoots. The results were staggering. We could immediately see the huge potential for this method of filmmaking. So much so that we formally launched Final Pixel shortly afterwards.

We learned loads on these shoots, in particular about the need for tight file management and version control. It was after this that we began using Perforce Helix Core, which has dramatically improved our pipeline and efficiency when working on Unreal projects.

Q: Where Are Your Studios? 

Typically, we do our shoots on private stages which are set up specifically for our clients and not available to the public. These can be anywhere in the world. 

We can establish these wherever our clients need to be, and they’re often determined by the talent involved. Our core bases are in New York, Los Angeles, and London. We also occasionally partner with other emerging virtual production stages where the project requires it. 

Q: What Tools Do You Use For In-Camera VFX Production? 

Virtual production works by tracking the "real-world" camera in a studio. This tracking information is combined with the "virtual" camera tool within Unreal Engine, live and in real time.

This virtual camera can be programmed to move in sync with the real-world camera with little noticeable lag. The result is that we can then project what the virtual camera is seeing onto a massive LED wall, basically a huge television.  

Real-time rendering of 3D models makes this possible. The typical VFX pipeline includes a large portion of time devoted to rendering images. With Unreal Engine, this happens right before your eyes. 

This doesn’t mean the end of physical set builds. On Final Pixel shoots, the art department is a critical component of the crew. To create a believable in-camera VFX effect, the foreground props need to blend seamlessly with the virtual world. 

In-camera VFX: Camera tracking with LED wall

Q: What’s the Coolest Project You Have Used In-Camera VFX In? 

Using in-camera VFX and virtual production has been a very cool and exciting process. Recently, we created a promo for ABC, where we got to recreate a physical set for “Dancing With the Stars” in Unreal Engine with controllable DMX lighting so that we could seamlessly combine stage lighting with the virtual set. 

Q: What Advice Would You Give to Content Creators Looking to Try In-Camera VFX? 

Don’t be afraid to ask for help, whether that is through toolchain vendors like Perforce or those in the business doing virtual production. We often consult with creators at an early stage and can help guide them as to whether virtual production is the right route to go down. There are limitations, and this technology is still in its infancy, so there are bugs to work through. However, it is improving at a rapid pace. 

With the proliferation of online streaming and publishing services, there is certainly a future with more choice and relevance for audiences of all backgrounds. Perhaps this will be one of the more wholesome, positive impacts this awful pandemic will have unleashed on the world.

In-Camera VFX with LED Walls: Tools You Need

To get started with in-camera VFX and LED walls, virtual production teams need the following essential hardware and software.

LED Panels 

Large LED walls, or LED panels, are what you display your virtual set on. When choosing your LED wall, you need to consider the following factors: 

  • Latency, refresh rate, and color accuracy. 

  • Pixel pitch (the distance between adjacent LEDs on the panel), which affects how close the camera can be to the screen; see the quick calculation after this list. 

  • Power supplies and heat dissipation. 

  • Form factor: a flat wall, two walls at 90 degrees, a curved wall, a ceiling panel, or portable panels. 
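On that pixel pitch point, a widely cited rule of thumb (an approximation, not a vendor specification) is that the camera should stay roughly one meter back per millimeter of pitch, usually with extra margin, to avoid moiré. Here is a minimal Python sketch; the 1.5x safety factor is our own illustrative assumption, so always test on the actual wall with the actual lens:

    # Rough rule of thumb: minimum camera distance (m) is about the pixel
    # pitch (mm), padded by a safety factor. Illustrative only.
    def min_camera_distance_m(pixel_pitch_mm: float, safety_factor: float = 1.5) -> float:
        return pixel_pitch_mm * safety_factor

    for pitch_mm in (1.5, 2.6, 3.9):  # pitches commonly seen in LED volumes
        distance = min_camera_distance_m(pitch_mm)
        print(f"{pitch_mm} mm pitch -> keep the camera ~{distance:.1f} m back")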

Camera Tracking 

You will need a camera tracking system to coordinate the movement of the physical cameras with the virtual environment. This is necessary to ensure the scene displayed on the LED screen moves realistically as the perspective on the camera changes.  

You will need a system that provides extremely low latency and highly accurate position and rotation data for the camera(s) being used. There are inside-out trackers and outside-in trackers, each with their own pros and cons. In addition to position, you also need a system to transmit camera and lens data like focal length, focus distance, zoom, and exposure.
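To make that data flow concrete, here is a minimal Python sketch of the kind of per-frame sample a tracking system hands to the render nodes. The field names are illustrative assumptions rather than any specific vendor's protocol, but the payload is representative: a pose, the lens state, and a timestamp for latency checks.

    import time
    from dataclasses import dataclass

    @dataclass
    class TrackingSample:
        # Camera pose in studio space: position in meters, rotation in degrees.
        x: float
        y: float
        z: float
        pan: float
        tilt: float
        roll: float
        # Lens state, so the virtual camera can match the physical one.
        focal_length_mm: float
        focus_distance_m: float
        timestamp_s: float  # when the tracker captured this pose

    def latency_ms(sample: TrackingSample, now_s: float) -> float:
        # How stale the pose is by the time the render node consumes it.
        return (now_s - sample.timestamp_s) * 1000.0

    sample = TrackingSample(1.2, 0.4, 1.8, 15.0, -3.0, 0.0, 35.0, 2.5, time.time())
    print(f"tracker-to-engine latency: {latency_ms(sample, time.time()):.2f} ms")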

Genlock System 

A genlock (generator locking) system syncs the camera, rendering machines, and LED panels to the exact same frequency. The purpose is to avoid scan lines showing up in the background.  
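The numbers make the problem vivid. In this small Python sketch (the 50 ppm clock error is a made-up figure for illustration), two devices running at nominally identical frame rates on independent clocks slowly drift out of phase until the wall's refresh lands mid-exposure:

    # Without genlock, two "24 fps" clocks that disagree by 50 parts per
    # million slide out of phase. Figures are illustrative only.
    camera_fps = 24.0
    wall_fps = 24.0 * (1 + 50e-6)  # hypothetical 50 ppm clock error

    # Phase drift accumulated per second of shooting, in milliseconds.
    drift_ms_per_s = abs(1 / camera_fps - 1 / wall_fps) * camera_fps * 1000
    frame_period_ms = 1000 / camera_fps
    minutes_to_full_frame = (frame_period_ms / drift_ms_per_s) / 60

    print(f"drift: {drift_ms_per_s:.3f} ms per second of shooting")
    print(f"a full frame of offset after ~{minutes_to_full_frame:.1f} minutes")

A shared genlock pulse removes this drift entirely by deriving every device's timing from a single generator.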

Rendering Machines 

In order to render the millions of pixels on an LED volume in real time, the rendering must be split across several computers that are perfectly synchronized with each other. Epic's nDisplay plugin and Switchboard utility enable this clustered workflow, and Switchboard can use Perforce Helix Core to ensure that each machine in the cluster has the exact same version of the project, updating all of the nodes to the latest version simultaneously with the click of a button.
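The plumbing behind that button can be sketched with plain Perforce commands: pin every render node's workspace to one agreed-upon changelist so that no node draws a different version of the scene. In this hedged Python sketch, the server address, workspace names, and depot path are all hypothetical placeholders:

    import subprocess

    CHANGELIST = "12345"  # hypothetical changelist the whole cluster agrees on
    NODE_WORKSPACES = {   # hypothetical render nodes and their p4 client names
        "node-01": "wall_node01_ws",
        "node-02": "wall_node02_ws",
        "node-03": "wall_node03_ws",
    }

    for host, client in NODE_WORKSPACES.items():
        # Sync this node's workspace to exactly the agreed changelist.
        subprocess.run(
            ["p4", "-p", "perforce.example.com:1666", "-c", client,
             "sync", f"//depot/icvfx_project/...@{CHANGELIST}"],
            check=True,
        )
        print(f"{host}: synced to @{CHANGELIST}")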

Game Engine 

You need a game engine like Unreal Engine to render virtual backdrops onto an LED wall in real time. Real-time rendering is important because as the tracked camera moves, the backdrop moves with it seamlessly, keeping the perspective correct.

Version Control 

Studios that adopt game engines for virtual production find themselves with many massive digital files to manage, increasingly complex projects, and large, often dispersed teams. A version control system helps manage these projects and files.

Version control as a process refers to tracking and managing changes to files over time. Version control systems automate this process. The best systems allow you to not only store every iteration of your digital files, but to control the flow of changes to those files and manage the team’s contributions. 

A powerful version control tool that can support large art assets is essential in virtual production projects, where multiple people create work that is combined into a final product. It is especially important that work is done correctly from the start, when the cost of iteration is relatively low, as compared to making changes during filming, when every minute costs much more.
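One concrete example of controlling the flow of changes in a Perforce-based pipeline: Unreal's binary assets can't be merged, so a common practice is a typemap that stores them as binary with exclusive locking (+l), meaning only one artist can check out a given file at a time. A sketch of the relevant p4 typemap entries, with a placeholder depot path:

    TypeMap:
        binary+l //depot/....uasset
        binary+l //depot/....umap

With entries like these, a second artist who tries to check out an already-locked level or asset is stopped up front rather than discovering a collision at submit time.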

📘 Related Resource: Learn more about version control in virtual production and get branching best practices.

Helix Core: Version Control for In-Camera VFX

Helix Core is known as the game development and media standard for version control. It is the only version control tool with the performance required to manage the numerous large files associated with Unreal Engine and in-camera VFX. If you are looking to create assets for an LED wall, you need Helix Core to capitalize on the benefits of real-time rendering.

Perforce Federated Architecture is the secret behind lightning-fast delivery around the globe. It allows studios to create any number of on-premises and cloud replicas, keeping essential data close by so you have those large binary files where you need them without the WAN wait. Plus, Helix Core offers integrations with the tools digital creators are already using, like 3ds Max, Maya, and more.

See for yourself why so many studios choose Helix Core for LED wall & in-camera VFX. You can get free tools and deploy them your way. 

➡️ FREE VIRTUAL PRODUCTION TOOLS