May 21, 2018
Technology sneak peek: real-time ray tracing with Unreal Engine
In offline renderers, ray tracing has long been used to achieve photoreal results, particularly in scenes with a lot of reflections and with complex translucent materials such as thick glass. Now, thanks to the efforts of several technology companies, support for Microsoft's DXR API and ray-traced area light shadows will be available in the main branch of Unreal Engine 4 by the end of 2018. Epic expects to release binary support for these features in Unreal Engine 4.22, with more features to follow as they become production-ready.
What’s the big deal?
Renderers tend to use one of two methods for determining pixel color: rasterization or ray tracing. Rasterization starts with a particular pixel and asks, “What color should this pixel be?” Ray tracing starts from the camera and the light sources and asks, “What is the light doing?”
Ray tracing works by tracing the path of a light ray as it bounces around a scene. Each time a ray bounces, it mimics real-life light by depositing color from the objects it has struck, and it also loses intensity. This accumulation of color produces the sharp reflections and subtle, realistic color variations that, for certain types of materials and effects, can only be achieved with ray tracing.
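As a rough illustration of that bounce loop, a toy version might look like the following C++ sketch. This is not Epic's or DXR's implementation; Vec3, Ray, Hit, Scene, and intersect() are hypothetical stand-ins for a real renderer's types.

```cpp
#include <optional>

struct Vec3 { float x, y, z; };

Vec3 operator*(const Vec3& a, const Vec3& b) { return {a.x * b.x, a.y * b.y, a.z * b.z}; }
Vec3 operator*(const Vec3& a, float s)       { return {a.x * s, a.y * s, a.z * s}; }

struct Ray { Vec3 origin, direction; };

// What a ray hit: the color of the surface it struck, plus the bounced ray.
struct Hit { Vec3 surfaceColor; Ray bounced; };

struct Scene {
    Vec3 background = {0.5f, 0.7f, 1.0f};  // light arriving from the environment

    // A real scene would test the ray against its geometry; this placeholder
    // scene is empty, so every ray escapes straight to the background.
    std::optional<Hit> intersect(const Ray&) const { return std::nullopt; }
};

// Follow one ray as it bounces around the scene. Each bounce "deposits" the
// color of the object it struck and loses some intensity, as described above.
Vec3 trace(const Scene& scene, Ray ray, int maxBounces) {
    Vec3 throughput = {1.0f, 1.0f, 1.0f};            // light surviving so far
    for (int bounce = 0; bounce < maxBounces; ++bounce) {
        std::optional<Hit> hit = scene.intersect(ray);
        if (!hit)
            return throughput * scene.background;    // escaped: pick up sky light
        throughput = throughput * hit->surfaceColor; // deposit the surface color
        throughput = throughput * 0.8f;              // each bounce loses intensity
        ray = hit->bounced;                          // continue along the new ray
    }
    return {0.0f, 0.0f, 0.0f};                       // bounce budget exhausted
}
```

A real renderer fires one or more such rays through every pixel and branches at each hit to sample lights and materials, which is where the computational cost explodes.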
Conversely, rasterization is ray tracing’s faster, cheaper cousin, calculating lighting with a number of approximations. Unreal Engine’s rasterization delivers 4K frames in a matter of milliseconds on a fast GPU. Rasterization can closely approximate photorealism without being physically precise, and that tradeoff in favor of speed has been well worth it for most Unreal Engine customers.
However, for live-action film and architectural visualization projects where photorealism is valued over performance, ray tracing gives an extra level of fidelity that is difficult, if not impossible, to achieve with rasterization. Until now, ray tracing was implemented only in offline rendering because of its intense computational demands: a single ray-traced frame can take minutes or even hours to compute, and twenty-four frames are needed to fill just one second of film animation.
“Traditionally, ray tracing has been used in a number of ways in production pipelines,” says Juan Canada, Lead Ray Tracing Programmer at Epic Games. “Even if a production plans to use raster rendering to achieve real time at high resolutions, they might render stills and animations with ray tracing to use as reference, to give artists a goal or comparison for raster, or even for final pixels in some cases.
“If ray tracing can be done in real time, these ‘ground-truth’ reference renderings can be made much faster. This means there can be more of them, or several iterations to get things just right, instead of a handful of images that took hours or days to render.”
Canada adds that having access to ray tracing “makes the whole production pipeline easier to control, because you can get the final result more easily.”
The technology
Real-time ray tracing, previously an oxymoron, has been achieved by assembling massive computational capability from the latest GPUs, applying machine learning techniques, and falling back to rasterization where ray tracing isn’t needed. This combination of hardware and software simply hasn’t been available until now, and it is possible only because of the joint efforts of several committed companies.
Windows developers are familiar with Microsoft’s DirectX API, which has long been used by media-heavy Windows applications, including game engines, to optimize the display of graphics. Microsoft recently introduced DXR (DirectX Raytracing), a new ray tracing framework that works through the current DirectX 12 API.
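As a small, concrete example of what working through DirectX 12 means in practice: before taking the ray-traced path, an application can ask the D3D12 device whether DXR is supported at all. Below is a minimal C++ sketch using the DXR support query as it shipped in the DirectX 12 API (the API was still experimental at the time of the GDC announcement); this is not Unreal Engine's integration code.

```cpp
#include <windows.h>
#include <d3d12.h>

// Ask DirectX 12 whether this device (GPU plus driver) exposes DXR ray
// tracing. If it does, ray tracing work is recorded through the same
// command-list interfaces as the rest of D3D12 rendering, ending in
// ID3D12GraphicsCommandList4::DispatchRays().
bool SupportsRayTracing(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options, sizeof(options))))
        return false;  // runtime too old to know about the query
    return options.RaytracingTier != D3D12_RAYTRACING_TIER_NOT_SUPPORTED;
}
```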
Concurrently, NVIDIA announced NVIDIA RTX, a ray-tracing technology that runs on NVIDIA Volta architecture GPUs. NVIDIA partnered with Microsoft to enable full RTX support via the Microsoft DXR API.
Working with both Microsoft DXR and NVIDIA RTX, Epic used this pipeline to create the Reflections real-time cinematic.
The demo
For the announcement at GDC 2018, Epic set out to create a demo that would showcase the full glory of real-time ray tracing. Epic collaborated with NVIDIA’s ray tracing experts along with ILMxLAB, Lucasfilm's immersive entertainment division.
To make the demo, the team started with high-quality reference footage from ILMxLAB, then combined forces to create the necessary models. Artists carefully recreated every detail, from tiny scratches on the stormtrooper armor to the weave pattern of Phasma’s cloak. Materials were built so that they respond naturally to light.
Set construction and shot work were assembled directly in Unreal Engine. Epic used its own virtual production techniques to shoot the performance while the actor/director performed in virtual reality.
Most of this technology already exists within Unreal Engine. But now with ray tracing, new effects can really breathe life into the scene. The dynamic and accurate reflections on Captain Phasma’s armor, a curved, mirror-like surface, were created with ray tracing.
The Reflections demo runs at 1080p resolution and 24 frames per second in Unreal Engine. The calculations for reflections, ambient occlusion, and the shadows from the textured area lights are ray traced in real time. With ray tracing, challenging effects such as soft shadows and glossy reflections come naturally, greatly improving the realism of the scenes.
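To see why soft shadows come naturally, consider how an area light is sampled: instead of a single shadow ray toward a point light, the renderer fires several shadow rays toward random points spread across the light’s surface and averages the results. Here is a minimal, self-contained sketch of that idea (hypothetical types and a stub occluded() test, not Unreal Engine code).

```cpp
#include <random>

struct Vec3 { float x, y, z; };

// Point on a rectangle defined by a corner and two edge vectors, at
// parameters (a, b) in [0, 1] x [0, 1].
Vec3 pointOnRect(const Vec3& corner, const Vec3& edgeU, const Vec3& edgeV,
                 float a, float b) {
    return {corner.x + edgeU.x * a + edgeV.x * b,
            corner.y + edgeU.y * a + edgeV.y * b,
            corner.z + edgeU.z * a + edgeV.z * b};
}

// Hypothetical occlusion test: a real renderer traces a shadow ray against
// the scene here; this stub pretends nothing is ever in the way.
bool occluded(const Vec3& /*from*/, const Vec3& /*to*/) { return false; }

// Fraction of the area light visible from `point`:
// 0 = fully in shadow, 1 = fully lit, anything between = soft penumbra.
float softShadow(const Vec3& point, const Vec3& lightCorner,
                 const Vec3& lightEdgeU, const Vec3& lightEdgeV, int samples) {
    static std::mt19937 rng{42};
    std::uniform_real_distribution<float> uni(0.0f, 1.0f);

    int unblocked = 0;
    for (int i = 0; i < samples; ++i) {
        Vec3 target = pointOnRect(lightCorner, lightEdgeU, lightEdgeV,
                                  uni(rng), uni(rng));
        if (!occluded(point, target))
            ++unblocked;
    }
    return float(unblocked) / float(samples);
}
```

A point near the edge of an occluder sees only part of the light, so the average lands between 0 and 1, producing the penumbra; with a point light, every sample would hit the same spot and the shadow edge would be hard.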
The team at Epic collaborated with NVIDIA’s engineers on a number of techniques to achieve realism with ray tracing, including denoising and reducing the level of detail in reflections. A deep-dive talk on these techniques, challenges, and trade-offs was given at GDC 2018.
The future
For the initial release, Unreal Engine’s renderer will address processes that are the most challenging to simulate with raster rendering but are straightforward with ray tracing.
“We’re going to start integrating some of the features shown at GDC, like using ray tracing for accurate area shadows and a new cinematic post-process for depth of field,” says Canada, “and we’ll add new components that allow us to render other effects such as refraction, and go forward from there.”
What does this mean for the future of filmmaking? Ray tracing has been used in movies and architectural visualization for many years and has become the standard for generating photoreal images. Those images can take hours to render with an offline renderer; with ray tracing implemented in Unreal Engine, they can now be rendered in real time.
It means that directors, artists, and designers will be able to iterate faster, and try new lighting scenarios and camera angles. Ultimately, it will bring down the cost of production and make it possible to produce more interactive content.
The Reflections GDC demo ran on a DGX Station equipped with four Volta GPUs using NVIDIA RTX technology. For now, you will need similar horsepower to leverage real-time ray tracing; in the future, we can rely on Moore’s law to bring down the cost and complexity of how this technology is packaged and delivered. DXR performance is resolution-dependent, just like offline rendering. However, because Unreal Engine is still performing its own rendering, performance will also be very scene-dependent.
While the Reflections demo runs on pretty heavy-duty hardware, Epic sees the demo as a first step rather than the end game. Currently, NVIDIA offers the DXR-enabled hardware used for the real-time ray tracing demo, but AMD has announced that they will make DXR drivers available at some point, too.
“We’re going to keep working on it to make it more lightweight and accessible,” says Sebastien Miglio, Director, Enterprise Engineering at Epic. “It’s early days, but within a short period of time we expect that ray tracing will run on more common types of computers found in any studio.”
“The Reflections effort demonstrates Epic’s commitment to leading the future, not only of games, but wherever high-fidelity immersive experiences are required,” says Miglio. “Unreal Engine set the bar long ago for fidelity and performance. Now, with DXR, we’re raising that bar even higher. Customers working with Unreal Engine have the assurance that the road they travel with us leads to the future along the most direct path possible.”
Ready to try out real-time ray tracing in Unreal Engine? Download it for free today.