Yesterday, I argued that Nvidia fans and customers should hold off on buying a new RTX GPU until Nvidia had actually demonstrated how much of a performance improvement these GPUs would deliver in current shipping titles. The company, at that point, simply had not demonstrated its new cards would offer enough additional performance to make pre-ordering one a wise decision.
Nvidia just released a new set of 4K results that claim to show a 50 percent gap between the Nvidia GTX 1080 and the Turing-based RTX 2080. This is theoretically the kind of data we need to draw some conclusions about relative performance uplift between the RTX and GTX families. Unfortunately, the only thing this data does is strengthen the argument that Nvidia isn’t bringing much to the table for anyone but well-heeled enthusiasts. Let’s look at why.
The Importance of Price
Your results depend on which starting prices you choose, since these cards sell at a range of price points. The current best case for the GTX 1080 – RTX 2080 comparison is that the RTX 2080 costs 1.55x as much and delivers ~1.5x the performance. If you have to buy the Founders Edition or an equivalent OEM card to see those performance numbers, then you'd be paying at least 1.77x more money for 1.5x the performance. Yes, the final product is faster, but you're also paying considerably more for it. And "More expensive hardware delivers better performance at an increasingly bad price/performance ratio" isn't the type of headline that gets people excited. It's also absolutely no help to people who might be able to swing spending $450 – $550 on a new card but just can't afford to step up to the $700 – $800 mark.
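The arithmetic above can be sketched in a few lines. This is a minimal illustration, not a benchmark: the prices are assumptions drawn from the ranges in this article (~$450 street for a GTX 1080, $700 MSRP and $800 Founders Edition for the RTX 2080), and 1.5x is Nvidia's claimed uplift.

```python
def value_ratio(perf_ratio: float, new_price: float, old_price: float) -> float:
    """Performance gained per dollar spent, relative to the old card (1.0 = parity)."""
    return perf_ratio / (new_price / old_price)

# Assumed prices, per the ranges discussed above (not official figures).
GTX_1080_STREET = 450
RTX_2080_MSRP = 700
RTX_2080_FE = 800
CLAIMED_UPLIFT = 1.5  # Nvidia's claimed 2080-vs-1080 performance ratio

print(round(RTX_2080_MSRP / GTX_1080_STREET, 2))  # price ratio: 1.56
print(round(RTX_2080_FE / GTX_1080_STREET, 2))    # FE price ratio: 1.78
# Value below 1.0 means you pay proportionally more than the performance you gain.
print(round(value_ratio(CLAIMED_UPLIFT, RTX_2080_MSRP, GTX_1080_STREET), 2))  # 0.96
print(round(value_ratio(CLAIMED_UPLIFT, RTX_2080_FE, GTX_1080_STREET), 2))    # 0.84
```

Under those assumptions, even the best-case RTX 2080 delivers slightly less performance per dollar than the card it replaces, and the Founders Edition noticeably less.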
After Nvidia’s price hikes, the RTX 2080 is no longer the appropriate point of comparison for the GTX 1080. So what happens when we compare the RTX 2080 with its actual competitor, the $700 GTX 1080 Ti? It just so happens that [H]ardOCP recently published an article comparing the GTX 1080 against the 1080 Ti. So let’s look at that data set and estimate how much the 1080 Ti would slice into Nvidia’s results.
[H]ardOCP tested Crysis 3, Tomb Raider, GTA V, Witcher 3, Fallout 4, Rise of the Tomb Raider, Doom, Deus Ex Mankind Divided, Battlefield 1, Sniper Elite 4, Mass Effect Andromeda, Kingdom Come Deliverance, and Far Cry 5. The GTX 1080 Ti is, on average, 1.27x faster than the GTX 1080. If we assume that those averages hold across the ecosystem, we can expect the RTX 2080 to be roughly 1.18x faster than the GTX 1080 Ti (1.5 / 1.27) at the same price point. There could be some shifts depending on resolution and detail level differences, but we'd expect those to favor the 1080 Ti if anything. Any bottleneck that specifically slowed the GTX 1080 would give the 1080 Ti's additional horsepower more room to shine.
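The chained-ratio estimate works because both uplifts are measured against the same GTX 1080 baseline. A sketch, using placeholder per-game speedups (not [H]ardOCP's measured numbers) that average near the 1.27x figure, with a geometric mean, the usual way to average performance ratios:

```python
from math import prod

def geomean(ratios):
    """Geometric mean of a list of per-game performance ratios."""
    return prod(ratios) ** (1 / len(ratios))

# Hypothetical 1080 Ti vs GTX 1080 per-game ratios (illustrative only).
ti_vs_1080 = [1.25, 1.30, 1.22, 1.31, 1.27]

avg_ti = geomean(ti_vs_1080)        # roughly 1.27
claimed_2080_vs_1080 = 1.5          # Nvidia's claimed uplift over the GTX 1080

# Both ratios share the GTX 1080 as the baseline, so dividing them
# yields the estimated 2080-vs-1080 Ti ratio.
est_2080_vs_ti = claimed_2080_vs_1080 / avg_ti
print(round(est_2080_vs_ti, 2))     # roughly 1.18
```

If the claimed 1.5x figure holds up in independent testing, this is the kind of gap the RTX 2080 should show over the similarly priced 1080 Ti.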
There’s nothing wrong with a 1.2x performance boost, but Nvidia knows it’s not the kind of thing that’ll get gamers talking. It’s certainly not the kind of improvement that gets someone to rush out and replace a 1080 Ti they bought within the past 18 months. So instead of acknowledging this point, the company elided it by comparing the GTX 1080 with a much more expensive GPU it wouldn’t normally compete against. When you set the competition appropriately, up to half of Nvidia’s claimed performance improvement may vanish.
Another point to keep in mind is that Nvidia says this data represents the GTX 1080, the RTX 2080, and the RTX 2080 with DLSS enabled. This implies that the first two configurations are using some kind of performance-impacting AA solution to start with. That’s the only way the 2080’s performance would improve by using AI for antialiasing, as opposed to whatever method it had previously been using. There’s nothing wrong with using AA when analyzing GPUs; virtually all of our GPU testing is conducted with antialiasing enabled. But readers should be aware that using AA can change relative performance between two solutions, and since we have no idea what AA methods were used in the games Nvidia lists, we don’t know how having it enabled would impact performance between Turing and Pascal. The size of the increase from baseline 2080 to 2080 + DLSS suggests that the initially deployed solution was quite demanding.
Final point: None of these tests appear to touch the ray tracing side of Nvidia’s new capabilities. The only feature being tested and showcased here is Deep Learning Super Sampling (DLSS). We don’t yet know what kind of performance hit Turing takes in games with RTX enabled versus disabled.
Thus far, the debut of the RTX 2080 mostly feels like Nvidia attempting to justify the price increases it’s slapping on its video cards. We’ve graphed the launch price of Nvidia’s flagship models going all the way back to Fermi. It makes for some interesting reading:
For four generations (GTX 295 – GTX 680), Nvidia kept the same $500 price for its flagship card. The GTX 780 surged to $649 at launch but fell within six months thanks to AMD’s Hawaii-based R9 290 and R9 290X. Maxwell tacked a modest $50 increase onto the top-end price, but nothing crazy. Beginning in 2016, however, Nvidia began aggressively charging more, especially if you bought a Founders Edition card. If you want to buy an RTX 2080 card in 2018 the way you bought a GTX 980 card in 2014, Nvidia wants an extra $150 – $250 for the privilege. It’s a far cry from what we used to see just a few years ago, when Nvidia brought dramatically improved performance to the same price points year-on-year, even at the top of the market. AMD’s difficulty competing at the top of the GPU market is reflected in Nvidia’s pricing. At the rate costs are increasing, Intel’s 2020 GPUs can’t come soon enough.
Based on the results we’ve seen to date, Nvidia’s RTX 2080 looks to deliver 1.15x – 1.3x the performance of the similarly priced GTX 1080 Ti in mainstream titles that don’t take advantage of its new features. The claims of 50 percent-plus improvements do not withstand scrutiny given the difference in price between the two solutions. As always, this analysis should be considered preliminary and speculative, based on publicly available information rather than final hardware. It’s possible that other, as-yet-undisclosed enhancements to the GPU core could change the final analysis.