Q&A with Remedy Entertainment: Adopting USD into the Game Development Pipeline

Figure 1. Screenshot of a USD user interface editing a bridge with statues

Mika Vehkala has worked in the game development industry for nearly three decades and contributed to projects such as Horizon Zero Dawn and The Walking Dead: No Man’s Land. Now, he’s director of technology at Remedy Entertainment, the studio that created Alan Wake and Control. Vehkala spoke with NVIDIA about bringing USD into its game development pipeline.

What is your professional background and current role?

I’ve been working in the gaming industry for some 28 years, creating for various platforms in numerous roles. Currently, I’m working as director of technology at Remedy Entertainment focusing on our Northlight engine development.

Why did Remedy Entertainment decide to adopt Universal Scene Description (USD) into its game development pipeline?

We adopted USD to streamline our content pipelines and to have something that performs well with large amounts of data and can be easily extended. We saw that all our various asset concepts, such as levels, prefabs, templates, and presets, could be unified, as they are all hierarchies of property containers. Finally, we wanted bi-directional interoperability between our tools and third-party digital content creation (DCC) tools.

We had previously started seeing issues with ever-growing data sets, bottlenecks in version control, workflows for larger content teams, and limited capabilities for building more complex content building blocks to share between games. After looking into various options, USD quickly became the obvious choice, as it ticked all the boxes and more.

Why USD over other file formats, and what schemas or libraries did you integrate?

We consider USD to be much more than a file format. It offers composition, hierarchy, comprehensive referencing, good performance, extensibility through codeless schemas, and various plug-ins.

We used vanilla USD and extended it with our own codeless schemas and a few plug-ins. Our schemas are generated from a proprietary data definition language that we use to specify all our content data classes.
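
As a rough sketch of what working with a codeless schema can look like (the schema name TriggerVolumeAPI and the attribute below are hypothetical, not Remedy’s actual data definitions), a registered codeless API schema can be applied to a prim and its attributes authored from Python:

```python
from pxr import Usd, Sdf

# Hypothetical example: apply a codeless API schema and author one of its attributes.
stage = Usd.Stage.CreateInMemory()
prim = stage.DefinePrim("/World/Gameplay/Trigger_01", "Xform")

# Add the (hypothetical) codeless API schema name to the prim's apiSchemas metadata.
# The schema itself would be registered through the plugInfo.json and
# generatedSchema.usda produced by usdGenSchema; no compiled code is needed.
prim.AddAppliedSchema("TriggerVolumeAPI")

# Author an attribute the schema would define; name and type are illustrative.
attr = prim.CreateAttribute("trigger:radius", Sdf.ValueTypeNames.Float)
attr.Set(5.0)

print(stage.GetRootLayer().ExportToString())
```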

What challenges did you face during the process of integrating USD?

As with any large tech stack involving complex data pipelines, there were plenty of surprises.

On the workflow side, adopting new content pipelines took more effort than initially expected. Our tools are written in C#, the engine runtime in C++, and third-party DCC plug-ins in Python. It took effort to set up USD in such a way that we can build against multiple versions with all these content pipelines in mind.

We wanted the same functionality we had previously, plus real-time communication between different processes that have access to the same stage. We expect there to be more challenges along the way as we learn how to author content efficiently and make use of all the powerful features.

Figure 2. Communication flow for layer edits: diffs are sent from the editor through the Scene API, which uses USD to modify the layer. Layer edits (and commands) describe how we send diffs to synchronize in either direction
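
The figure describes Remedy’s own Scene API. As a generic illustration of the underlying idea only, the following Python sketch captures edits in a separate anonymous layer, serializes that layer as text, and applies it on top of another process’s copy of the same stage; the file and prim paths are hypothetical.

```python
from pxr import Usd, Sdf

# --- Editor process: capture edits in a separate, anonymous "diff" layer ---
stage = Usd.Stage.Open("level.usda")                 # assumed shared level file
diff = Sdf.Layer.CreateAnonymous(".usda")
stage.GetSessionLayer().subLayerPaths.insert(0, diff.identifier)

with Usd.EditContext(stage, Usd.EditTarget(diff)):
    over = stage.OverridePrim("/World/Bridge/Statue_01")   # hypothetical prim path
    over.CreateAttribute("editor:selected", Sdf.ValueTypeNames.Bool).Set(True)

payload = diff.ExportToString()                      # serialize the diff for IPC

# --- Runtime process: apply the received diff on top of its own stage ---
runtime_stage = Usd.Stage.Open("level.usda")
received = Sdf.Layer.CreateAnonymous(".usda")
received.ImportFromString(payload)
runtime_stage.GetSessionLayer().subLayerPaths.insert(0, received.identifier)
```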

Has the change to USD affected how you create assets?

We are still in the early stages of adopting USD, but we are seeing that it will change how we author assets in the long run. Level data, geometry, materials, and so on will have to be more modular to leverage all the powerful composition mechanisms.
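
For example (the file names and prim paths here are hypothetical), composition lets a level reference a reusable prefab asset instead of embedding a copy of its data, so edits to the prefab flow into every level that uses it:

```python
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("level.usda")
UsdGeom.Xform.Define(stage, "/World")

# Two instances of the same modular asset; changes to bridge_prefab.usda
# propagate to every level that references it.
for name in ("Bridge_01", "Bridge_02"):
    prim = stage.DefinePrim(f"/World/{name}", "Xform")
    prim.GetReferences().AddReference("assets/bridge_prefab.usda")

stage.GetRootLayer().Save()
```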

Can you share any tips or lessons learned for other developers looking to adopt USD? Any recommended small steps to follow for starters?

Before you start, you should read up on USD. Familiarize yourself with the terminology and concepts, then play with it using existing DCCs that have USD integrations, or with one of the available USD composers, such as NVIDIA Omniverse.

For a more in-depth development perspective, join and participate in the Academy Software Foundation’s USD working group Slack team.

What do you think should happen for wider adoption of USD in the gaming industry?

Collaboration within the games industry would definitely help. For example, actively participating in the Academy Software Foundation USD working group for games is a start.

We’ve recently started participating and have already given a high-level tech presentation on our approach to integrating USD into our pipelines, and we published the Book of USD. I believe that in the next few years, USD will become the basis for making and distributing content for games.

What was the reaction of artists, tech artists, art directors, and studio executives to the move to USD?

We would not have made the jump without full support from the content teams and executives. Expectations are high, and we’ve made good progress adopting it. However, we will know the final verdict only after shipping the first game with USD-based content pipelines.

Do you plan to continue using USD in future game development?

We are fully committed to using USD as the basis for our content pipelines for all our future games. We plan to move as much of our content data as possible to being fully defined in USD: geometry, animations, level data, gameplay data, and so on.

For more information about Remedy, see Remedy Entertainment. Dive into NVIDIA resources for Universal Scene Description and check out our resources for game developers.

Register for NVIDIA GTC 2023 for free and join us March 20–23 to check out our game development sessions. In particular, you may want to attend our USD sessions, How USD Can Become the “HTML” of the Metaverse and An Overview of USD for Building Virtual Worlds.

For those attending GDC, see our sessions for game developers.
