How Nvidia’s RTX Real-Time Ray Tracing Works

Since it was invented decades ago, ray tracing has been the holy grail for rendering computer graphics realistically. By tracing individual light rays as they bounce off multiple surfaces, it can faithfully recreate reflections, sub-surface scattering (when light penetrates slightly into a surface like human skin and scatters back out as it goes), translucency, and other nuances that help make a scene compelling. It’s commonly used in feature films and advertising, but the large amount of processing time needed has kept it away from real-time applications like gaming.

Nvidia is aiming to change all that with RTX, its high-performance ray tracing library that can churn out detailed scenes at game-worthy frame rates. We’ll take a deeper look at what makes ray tracing hard, and at how Nvidia’s RTX tackles it.

How Ray Casting Turns Into Ray Tracing

In theory, ray tracing starts at each light source in a scene, generating light rays from it (usually at random) and following them as they hit and are reflected off surfaces. At each surface, the properties of the light are combined with the properties of the material it’s striking, and of course the angle at which it intersects. The light, which may have picked up a different color from reflecting off the object, is then traced further, using multiple rays that simulate the reflected light, hence the term ray tracing. The tracing process continues until the rays leave the scene.

While that process is great in theory, it’s incredibly time consuming, as most rays don’t hit anything we’re interested in, and other rays can bounce around nearly indefinitely. So real-world solutions make a clever optimization. They use a principle of light called reciprocity, which states that a light beam traversed in reverse behaves the same way as the original, to cast rays from the virtual camera out into the scene instead. That means only rays which will contribute to the final scene are cast, greatly increasing efficiency. Those rays are then followed (traced) as they bounce around until they either hit a light source or exit the scene. Even a ray that exits the scene may do so at a point that adds light (like the sky), so either way the illumination it gathers is added to each surface it hit. The software may also limit how many reflections it will follow for a ray if the light contribution is likely to be small.
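To make that concrete, here is a minimal sketch of the camera-first approach in Python. The one-sphere scene, the uniform sky light, and all the constants are illustrative assumptions used to show the general technique; nothing here comes from Nvidia’s implementation.

```python
# A minimal sketch of backward (camera-first) ray tracing: rays start at
# the virtual camera, bounce around the scene, and pick up light when they
# escape to the sky. One diffuse sphere under a uniform sky is assumed.
import math
import random

MAX_BOUNCES = 4                       # stop following a ray after this many reflections
SPHERE_C, SPHERE_R = (0.0, 0.0, -3.0), 1.0
SKY = 1.0                             # light contributed by rays that escape the scene

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def hit_sphere(origin, d):
    """Distance along a unit-length ray to the sphere, or None on a miss."""
    oc = sub(origin, SPHERE_C)
    b = 2.0 * dot(oc, d)
    c = dot(oc, oc) - SPHERE_R * SPHERE_R
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None    # small epsilon avoids self-intersection

def random_dir():
    """Uniform random unit vector, used as a crude diffuse bounce."""
    while True:
        v = tuple(random.uniform(-1.0, 1.0) for _ in range(3))
        n = math.sqrt(dot(v, v))
        if 0.0 < n <= 1.0:
            return (v[0] / n, v[1] / n, v[2] / n)

def trace(origin, d):
    """Follow one camera ray and return the light gathered along its path."""
    throughput = 1.0                  # fraction of light surviving the bounces so far
    for _ in range(MAX_BOUNCES):
        t = hit_sphere(origin, d)
        if t is None:
            return throughput * SKY   # ray exited the scene: pick up sky light
        origin = tuple(origin[i] + t * d[i] for i in range(3))
        normal = tuple((origin[i] - SPHERE_C[i]) / SPHERE_R for i in range(3))
        throughput *= 0.5             # the surface absorbs half the energy
        d = random_dir()              # bounce in a new diffuse direction...
        if dot(d, normal) < 0.0:
            d = (-d[0], -d[1], -d[2]) # ...kept above the surface
    return 0.0  # contribution now too small to matter, as noted above
```

Casting a few dozen such rays per pixel and averaging the trace() results gives that pixel’s brightness, which is exactly why image quality scales with ray count.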

Nvidia CEO Jen-Hsun Huang explains how light baking has been used as a more efficient alternative to some of the capabilities available with ray tracing.

Ray Casting Is Massively Processor Intensive

The total number of rays of light falling on a scene is beyond what even the powerful computers used for ray tracing can model. Practically, that means ray tracers have to pick a level of quality that determines how many rays are cast from each pixel of the scene in various directions. Then, after calculating where each ray intersects an object in the scene, the tracer needs to follow some number of rays out from each intersection to model reflected light. Calculating those intersections is relatively expensive. Until recently it was also limited to CPUs. Moving it to the GPU, as some modern ray tracers do, has provided a major improvement in speed, but Hollywood-quality results still require nearly 10 hours per frame on a high-end multi-GPU mini-supercomputer.
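Some rough arithmetic shows the scale involved. The sample and bounce counts below are illustrative assumptions, not Nvidia’s figures, but the shape of the math holds regardless:

```python
# Back-of-envelope ray counts for one 4K frame. All quality settings here
# are assumptions chosen only to illustrate how quickly the totals grow.
width, height = 3840, 2160       # 4K frame
samples_per_pixel = 64           # rays cast from each pixel
rays_per_bounce = 2              # extra rays spawned at each intersection
bounces = 3                      # reflections followed per camera ray

primary = width * height * samples_per_pixel
# Each bounce can multiply the ray count, so a geometric series bounds the total.
total = primary * sum(rays_per_bounce ** b for b in range(bounces + 1))
print(f"{total:,} ray-scene intersection tests for a single frame")
# Roughly 8 billion tests per frame; at 60 fps, nearly half a trillion per second.
```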

Because ray tracing is so processor intensive, interactive applications like games have been unable to use it except to generate compelling scenes in advance. For real-time rendering they rely on rasterization, where each object’s surfaces are shaded based on their material properties and which lights fall on them. Clever optimizations such as light baking help make it possible for games to render at high frame rates while still looking great. But they fall short when it comes to subtle interactions like sub-surface scattering. So for the ultimate in recreating realistic scenes, the goal has always been real-time ray tracing.
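For contrast, here is a sketch of the kind of per-pixel math rasterization does instead: classic Lambertian shading, where brightness depends only on the material and the lights, with no knowledge of the rest of the scene. The function and its inputs are illustrative, not taken from any particular engine.

```python
def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def shade(normal, albedo, lights):
    """Lambertian direct lighting for one rasterized surface point."""
    brightness = 0.0
    for light_dir, intensity in lights:
        # Only the angle between the surface and each light matters;
        # nothing bounces, so reflections and scattering are impossible here.
        brightness += intensity * max(0.0, dot(normal, light_dir))
    return tuple(channel * brightness for channel in albedo)

# A surface facing straight up, lit by one overhead light at half strength:
print(shade((0.0, 1.0, 0.0), (0.8, 0.2, 0.2), [((0.0, 1.0, 0.0), 0.5)]))
```

Light baking goes a step further, precomputing results like these offline and storing them in textures so the game doesn’t even have to evaluate the lights at runtime.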

How Nvidia Has Done It

Among other things, Nvidia touts a fancy new denoising module. Denoising is very important in ray tracing, because you can only cast a limited number of rays from each pixel in the virtual camera. So unless you leave your ray tracer running long enough to fill in the scene, you get a lot of unpleasant-looking “bald spots,” or noise. There are some great research projects that help optimize which rays are cast, but you still wind up with plenty of noise. If that noise can be reduced separately, you can produce quality output much faster than if you have to address the issue by sending out many more rays. Nvidia uses this technique to help it create frames more quickly. It is likely this capability builds on the AI-powered work Nvidia presented at SIGGRAPH last year.
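As a toy illustration of why this pays off, here is a simple box filter standing in for a real denoiser. Nvidia’s actual module is far more sophisticated (and, per the SIGGRAPH work, likely AI-assisted); this only demonstrates the underlying trade-off.

```python
import numpy as np

def box_denoise(image, radius=1):
    """Replace each pixel with the mean of its (2r+1) x (2r+1) neighborhood."""
    h, w = image.shape
    padded = np.pad(image, radius, mode="edge")
    out = np.zeros_like(image)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += padded[radius + dy : radius + dy + h,
                          radius + dx : radius + dx + w]
    return out / (2 * radius + 1) ** 2

# Averaging n independent samples cuts noise by about sqrt(n), so this 3x3
# filter buys roughly the variance reduction of casting 9x more rays per pixel.
noisy = np.random.rand(64, 64)   # stand-in for a 1-sample-per-pixel render
smooth = box_denoise(noisy)
```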

RTX also features Acceleration Objects, so there may be other clever tricks in play. For example, some ray tracing accelerators cache the current intersections and apply relatively simple transforms to predict where they will occur in the next frame. That’s quick and often invisible to the human eye, but not technically precise. The on-stage demo, while fancy, also featured only synthetic objects, so we don’t know whether RTX is currently up to the challenge of real-time rendering of human faces, for example. So while I’m sure we’ll see some really cool gaming applications, those looking to use RTX for scientific research will need to make sure it will generate the accurate output they need.
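As a sketch of that caching trick, the snippet below reprojects last frame’s hit points through an object’s frame-to-frame motion instead of re-intersecting every ray. To be clear, this is the generic technique the paragraph describes; there’s no confirmation RTX works this way, and the 4x4-matrix scheme is an assumption.

```python
import numpy as np

def predict_hits(cached_hits, object_motion):
    """Reproject last frame's ray hit points using a 4x4 rigid transform."""
    n = cached_hits.shape[0]
    # Promote (x, y, z) points to homogeneous coordinates (x, y, z, 1).
    homogeneous = np.hstack([cached_hits, np.ones((n, 1))])
    moved = homogeneous @ object_motion.T
    return moved[:, :3]   # fast and usually convincing, but only approximate

# Example: hit points on an object sliding 0.1 units along x each frame.
hits = np.array([[0.0, 0.0, -3.0], [0.5, 0.2, -2.8]])
motion = np.eye(4)
motion[0, 3] = 0.1                # translation column of the 4x4 matrix
next_frame_hits = predict_hits(hits, motion)
```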

A dynamic scene from a prototype using Microsoft DXR, Unreal Engine 4, and Nvidia RTX

No, You Can’t Have RTX, At Least Not Yet

As impressive as the gaming demos are, you can’t take advantage of them yet. RTX requires a Volta-family GPU, and none of those have been introduced for the consumer market yet. Nvidia and gaming companies claim they will have RTX-enabled games by the end of this year, but there’s no word on when you’ll be able to run them. I could see them being hosted on a high-powered cloud gaming service, but the rental rates for the Voltas needed to run them are probably beyond the budget of most gamers. Initially, it is also likely ray-traced elements will appear as specific features mixed into games, as a treat for those who can see them, rather than being used to create entire games, since the vast majority of gamers won’t be able to run RTX for quite a while.

Continue reading

Minecraft With Ray Tracing Now Available for All Windows 10 Players

You don't usually think of Minecraft as a realistic game, but the developers have been hard at work adding RTX ray tracing to the game for the last eight months. It's finally out of beta today, and it really works with the blocky look of Minecraft.

Quake II RTX Now Runs on AMD GPUs Thanks to Vulkan Ray Tracing

Nvidia's Quake II RTX now runs on AMD GPUs using Vulkan, if you've got the right driver (and an RX 6000).

Someone Hacked Ray Tracing Into the SNES

Surely, a game console from the 90s couldn't support ray tracing, right? Wrong. Game developer and engineer Ben Carter hacked ray tracing into the Super NES with a little help from an FPGA dev board.

New Joint AMD-Samsung Radeon GPU: Ray Tracing, Variable Rate Shading

AMD has confirmed that its partnership with Samsung is still on track, with Radeon-infused mobile GPUs expected in 2022 flagship devices.