Nvidia May Be Building New Turing GPUs Without Ray Tracing

Over the past few months, there’s been a lot of reporting on the idea that Nvidia is preparing to launch a set of GPUs without ray tracing support, including new rumors that broke before Christmas making the same claims. It’s a good time to round them up and look at the possibilities.

As always, rumors and speculation should be taken with a grain of salt. Speculation about rumors and speculation should be taken with several more.

The primary new rumor comes from Videocardz, which claims that Nvidia is preparing to launch either a GTX 1160 or possibly a 1660 Ti (Expreview reports 1660 Ti behind a password-locked article, while Videocardz has heard 1160 itself). The rumor is straightforward enough: Nvidia will split the Turing lineup into GTX and RTX flavors, with GTX cards lacking ray tracing support. The new GPUs would still be based on the Turing architecture but would carry new model numbers. It is not clear whether they would be Turing-class GPUs with the RTX capabilities fused off or whether they’d use an entirely different die. It also isn’t clear whether Nvidia would maintain a GTX alternative all the way up the stack. Which strategy makes the most sense depends on your starting assumptions about why Nvidia would do such a thing.

One thing we will say is that it probably makes good sense for Nvidia to split the cards by model number and abbreviation rather than by abbreviation alone. Either “GTX 1160” or “GTX 1660 Ti” is much less likely to be confused with a hypothetical RTX 2060 than a GTX 2060 / RTX 2060 co-branding effort would be.

Supposed marketing material. From Expreview, via Videocardz.

Past that, things are unclear. If Nvidia is trying to provide a short-term alternative for gamers who don’t want to step up to more expensive price points, or who aren’t yet certain ray tracing has a future, it might make sense to recycle flawed RTX chips with non-functional ray tracing components into GTX GPUs. Alternatively, it might build a single new GPU without RTX functions and use that card to replace the GTX 1060 and 1050 Ti. Videocardz thinks this is one possibility, and we’d agree, since RTX functions likely aren’t useful below a certain performance level and there’s no sense in building capabilities into a chip if people aren’t going to be able to make effective use of them.

But there’s also another possible outcome here. I want to stress that this is entirely speculation on my part, but it’s possible that Nvidia intends to market RTX as an up-market feature in its own right, similar to how Nvidia has G-Sync displays and has previously sold features like PhysX. In this scenario, the GTX brand becomes the ‘lower-end’ branding on a longer-term basis with a variable crossover point depending on where the sweet spot is. In other words, we might have a GTX 1160 and RTX 2060 that offered equivalent rasterization performance, but with the 1160 occupying a lower price point and the RTX 2060 offering what Nvidia believes are forward-looking features.

Even if Nvidia goes this route, it might not carry the project up to the highest end of the stack, if only to avoid sabotaging its own product lines. Few people would buy an RTX 2080 at $700 if an RTX 2080-equivalent sans ray tracing was available for $500. But we’ve also seen Nvidia experimenting with ways to shove GPU prices higher by launching its Founders Edition, and it’s not crazy to imagine that the company could choose to explore this type of differentiation as well. If the difference between the RTX and GTX families was in the $50 – $100 range, it would give NV an opportunity to explore what people were willing to pay for ray tracing as a premium feature without sacrificing all of the additional revenue it wants to earn from RTX cards.

There’s a big downside to this argument that I’ll go ahead and acknowledge: It would make it even harder to push RTX into the mainstream of GPU technology. Right now, the only known games planning to include RTX are Nvidia launch partners. Microsoft’s DXR standard for ray tracing is now part of DirectX, but if you consider the wider market including consoles — and game developers absolutely do when they decide which features to build into games — it’s mostly a market without ray tracing. None of the rumors around Navi suggest it supports ray tracing with specialized hardware, and the compute costs of the feature are high enough that it’s unlikely to be a major focus without additional dedicated resources.

The largest single reason we didn’t recommend buying into RTX at launch, and continue to caution against it today, is that new features don’t have a great track record of quick market adoption. That’s not a dig at NV; it’s just a fact. Software lags hardware, sometimes by four to six years. And most of Nvidia’s feature introductions, if you think about it, aren’t aimed at driving market adoption of an entire new feature or capability; they’re aimed at driving adoption of a specific Nvidia premium technology. PhysX. G-Sync. Even Nvidia’s largest marketing efforts around tessellation, which was a common technology, were partnerships with game developers willing to pour so much tessellation into their games that it became a systemic disadvantage for AMD GPUs, thereby demonstrating the supposed superiority of the NV card.

In other words, if you start from the perspective of a company attempting to drive mass market adoption of a feature, it makes no sense that NV would maintain any kind of RTX / GTX crossover. If you start from the perspective of a company trying to create premium experiences that its most well-heeled customers will pay premiums for, the idea of a differentiated RTX/GTX stack long-term makes more sense, though we make no predictions about how high up the stack NV would practically push this.

I’d argue that the second approach — RTX as a premium feature for those willing to pay — actually aligns better with the way Nvidia has historically rolled out features. Even the argument that this exposes the company to weakness from a competitor with a cheaper part is only as strong as the likelihood that a competitor launches such a GPU in the first place. With AMD currently fielding nothing to compete against Turing and Intel’s GPU efforts likely a year away, Nvidia may have seized on its current position at the top of the stack to introduce premium features as a short-term profit driver, in a manner it would never have attempted had AMD been offering better competition in the first place.

Which scenario, if any, is true? We don’t know yet. But with the RTX 2060 reportedly arriving in January, we’ll know more soon.
