Nvidia GeForce RTX 4090 Founders Edition: Mind-Blowing 4K Performance

Our fellow nerds at PCMag took the RTX 4090 Founders Edition for a spin on the test bench. The results paint a clear picture of domination across the board, with only a few small caveats. The big surprise was just how powerful the GPU is even without DLSS. We expected a capable card, but the results are still eye-watering. And if you’re going with a partner board, you might even need a new case just to accommodate it, as many of those cards absolutely dwarf the Founders Edition.

Starting with synthetic tests, the best word to describe the results is ridiculous. In Time Spy Extreme, the RTX 4090 is more than twice as fast as any existing GPU, including the RTX 3090. This is exactly what previous leaks indicated, by the way; it seemed fanciful at the time, but Nvidia has delivered. It’s also twice as fast as any other GPU in the ray-traced Port Royal benchmark, and incredibly, it’s a hair shy of being three times as fast as the RTX 3080 in that test. It’s also seven times faster than the RTX 3080 in Furmark, which, again, seems impossible. The margin varies from test to test, but only in how far ahead the RTX 4090 sits from the rest of the pack.
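
To put multiples like these in concrete terms, here’s a minimal Python sketch of how a “times faster” figure is derived from raw benchmark scores. The scores below are hypothetical placeholders chosen for illustration, not PCMag’s actual results.

# Hypothetical synthetic-benchmark scores; not PCMag's data.
scores = {
    "RTX 4090": 19000,  # placeholder Time Spy Extreme graphics score
    "RTX 3090": 9200,   # placeholder
    "RTX 3080": 8300,   # placeholder
}

baseline = "RTX 3080"
for gpu, score in scores.items():
    # A speedup multiple is simply the ratio of the two scores.
    print(f"{gpu}: {score / scores[baseline]:.2f}x the {baseline}")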

Here’s where the rubber meets the road: AAA gaming with FSR/DLSS. Once again, in F1 2022, the RTX 4090 puts all other GPUs to shame. Even with DLSS disabled, it was able to hit twice the frame rate at 4K as the RTX 3090. That’s mostly raw muscle, too: there’s no AI assistance involved, and the 4090 has the same memory bandwidth as the RTX 3090 Ti. Compared with AMD’s top cards, well, there is no comparison, and that’s the case at any resolution. With DLSS and/or FSR enabled there are further gains, but they’re not as large as you might expect, which could be down to early drivers. Even without a huge upscaling boost, the RTX 4090 still towers over its competition. It also shows signs of being CPU-limited in Guardians of the Galaxy with FSR/DLSS enabled, and that’s with a Core i9-12900K, if you can believe it.

The beatdown continues with non-AI-enhanced AAA titles. In this scenario, the RTX 4090 is faster at 4K than some cards are at 1080p. In Red Dead Redemption 2, for example, AMD’s Radeon RX 6800 XT ran at 129fps at 1080p, while the 4090 hit 131fps at 4K. That’s not some midrange GPU either, but a card that sat one rung below the top of AMD’s stack at launch. The 4090 is CPU-limited in this game as well, with its 1080p and 1440p scores nearly identical. You really need to be gaming at 4K to realize this GPU’s full potential, it seems. This batch of tests also revealed the first anomaly: in Far Cry 5, the 4090’s performance was not much better than the competition’s at lower resolutions. It’s heavily CPU-bottlenecked in this game, which is also an AMD-sponsored title. Its 4K performance is still head and shoulders above the rest, but at 1440p it’s merely equal.
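
As a rough illustration of the CPU-bottleneck reasoning above, here’s a small Python sketch that flags a likely CPU limit when frame rates barely move between resolutions. The frame rates and the 5 percent tolerance are assumptions for illustration, not PCMag’s methodology or data.

# Hypothetical average frame rates at three resolutions; not PCMag's data.
results = {"1080p": 131, "1440p": 130, "4K": 112}

def likely_cpu_limited(fps_lower_res: float, fps_higher_res: float,
                       tolerance: float = 0.05) -> bool:
    """Treat near-identical frame rates across resolutions as a hint of a CPU limit."""
    return abs(fps_lower_res - fps_higher_res) / fps_lower_res <= tolerance

print(likely_cpu_limited(results["1080p"], results["1440p"]))  # True: the CPU is holding things back
print(likely_cpu_limited(results["1440p"], results["4K"]))     # False: the GPU becomes the limiter at 4K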

Up above we mentioned there were some small caveats, and legacy gaming is one of them. We’re not sure why someone would spend this much money on a GPU to play older games, but the card is clearly CPU-limited in them. PCMag tested three older titles, and the 4090’s performance was left wanting in all of them. Though it was still able to beat the AMD GPUs at 4K, anything lower than that seemed to hit a 159fps ceiling, and it ended up simply matching other GPUs, including the RTX 3090, even at 4K.

For power and thermals, there’s good news and bad news. The good news is Nvidia’s cooler is excellent: it kept the card running cool and quiet the entire time, with a maximum temperature of just 63C. That’s an impressive feat given the card’s horsepower. The card does draw a lot of power, which is no surprise given its 450W rating. Even so, total system power under a gaming load was almost 100W lower than with the RTX 3090, and lower than with the RTX 3080 as well, which is surprising and seems to be the result of Ada Lovelace’s increased efficiency. It is not the fire-breathing nuclear reactor it was rumored to be months ago.
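
For readers who want to gauge efficiency claims like this themselves, here’s a back-of-the-envelope fps-per-watt comparison in Python. The frame rates and power figures are hypothetical placeholders, not PCMag’s measurements.

# Hypothetical average fps and total system power draw; not PCMag's numbers.
cards = {
    "RTX 4090": {"avg_fps": 120, "system_watts": 430},
    "RTX 3090": {"avg_fps": 60, "system_watts": 520},
}

for name, data in cards.items():
    # Higher fps per watt means more performance for each watt drawn.
    efficiency = data["avg_fps"] / data["system_watts"]
    print(f"{name}: {efficiency:.3f} fps per watt")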

Nvidia seems to be pinning most of its grandiose performance claims on its third-generation DLSS technology. Dubbed DLSS 3, it isn’t widely available yet, currently works with only 35 games, and is a 40-series exclusive. However, it promises huge frame rate gains with minimal drawbacks in graphical quality and latency. Unfortunately, the team didn’t have a chance to test it fully, but it does show some potential based on their limited hands-on time.

The bottom line is pretty clear: If you’re not gaming at 4K, the RTX 4090 seems unnecessary. Nvidia appears to have pulled out all the stops in an effort to prevent AMD from taking the crown this time around. We still don’t know what AMD has in store with its RDNA3 GPUs, but we’ll find out soon enough, as that launch is scheduled for Nov. 3. Nvidia is also launching its RTX 4080 16GB and 12GB GPUs “in November.”

Still, if you’ve got a 4K 120Hz monitor and want to play newer titles, this is the GPU you’ve been waiting for. Nvidia’s partner cards are coming out soon as well, so you should have plenty of options as long as bots don’t ruin the party. Benchmarks for partner boards are imminent, so stay tuned. Be sure to have your loan officer/kidney donation clinic on speed dial.
