Next week, Nvidia is holding a major event at Gamescom to tease the launch of a new high-end GPU, the GTX (or, apparently, RTX) 2080, based on its just-announced Turing GPU architecture. As part of last night's Turing architectural unveil, Nvidia published a teaser video for the upcoming rollout. The video contains a number of clues to the naming convention, including a user named "Not_11" saying "Eating, give me 20." Add up the clues in the Discord screenshots, and the brand name, RTX 2080, comes into focus, even if Nvidia hasn't formally declared it yet.
So far, Nvidia has focused on talking up its RTX cores as a major force to be reckoned with and a sea change in where gaming and professional graphics could be headed next. As we alluded to last night, this isn't the first time we've seen a company place a major bet on a new rendering technology and on hardware to accelerate it. Nvidia's strength and market share in graphics make it possible that the company will push a new wave of capabilities into the market, one as fundamentally transformative as the advent of the G80 back in 2006 (Nvidia itself has explicitly made this comparison). But if I seem dubious, it's because the history of graphics, and of computing in general, doesn't favor the fast sea change.
The debut of Vulkan and DirectX 12 was supposed to revolutionize gaming, yet both APIs are used today by only a bare handful of titles, and performance gains over DX11 have not generally materialized as hoped. There are a variety of reasons for this, but chief among them: hardware refresh cycles are slow these days, it takes time to update both hardware and software to support new APIs, and taking advantage of those APIs' capabilities can be more difficult in some cases than sticking with the previous, less-optimized solution. Even so, if you date the appearance of low-level APIs in PC gaming to Mantle back in 2013 (and we should), it's now been nearly five years since AMD first introduced a major new type of API, and games using those APIs remain few and far between.
This is far from the only example of this trend. VR remains bottlenecked and available to only a scant handful of consumers, even after Nvidia put a huge push behind it in 2015 – 2016. Nvidia and AMD put similarly large pushes behind 3D gaming and multi-monitor gaming, respectively, back in 2012 – 2013. In both cases, only a handful of gamers bought into these capabilities and features.
It'll be damn interesting to see what Nvidia has in store and why it thinks ray tracing's time has come. But that future will take a few years to arrive, no matter what the RTX 2080 offers in terms of specialized ray tracing hardware.