Nvidia GPU Performance Craters When G-Sync, SLI Are Used Together

A few days ago, one of our readers raised the possibility that G-Sync carries a performance penalty. It was a bit surprising to see, given that Nvidia has previously assured me there's no performance penalty for using G-Sync, but the reports on the Nvidia forums include users with benchmark results comparing G-Sync On to G-Sync Off. The reports are extensive and date back months, including posts in the Nvidia reddit community.

The performance declines reported in the forums appear to be tied to SLI. There’s some ambiguity about how long the problem has existed — I’ve seen some people claim it’s tied to drivers released after the 391.xx family, and others who said the issue has existed for the entire time Nvidia has had Pascal GPUs on the market. The issue is simple: According to posts on Nvidia’s forums and in the Nvidia subreddit, activating G-Sync at the same time as SLI results in significant performance drops in many games.

SLI is Nvidia’s technology for using more than one GPU to render a scene at the same time, while G-Sync is a technology that smooths frame delivery compared with standard V-Sync by keeping the monitor synchronized to the GPU’s frame output and presenting each frame as quickly as the display is capable of delivering it. And while I can’t speculate on exactly what the problem is, we can intuit a reason: Timing.

Both G-Sync and SLI have major timing implications. Keeping the GPU and monitor running in sync takes time. Moving data from one GPU to another and then displaying the frames rendered by that same GPU takes time. If you target a 30fps frame rate, you need to deliver a new frame every 33.3ms. If you want to run at 60fps, that means you need a new frame every 16.6ms. If you target 120fps, that means you need a new frame every 8.3ms. At a certain point, you’ve got a very limited window with which to work.
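To put those budgets in concrete terms, here's a minimal sketch (in Python, using the same frame-rate targets mentioned above) of the per-frame time window the GPUs and display have to hit; it's an illustration, not anything Nvidia's driver actually runs:

```python
# Minimal sketch: per-frame time budget at a given frame-rate target.
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available to render and present each frame."""
    return 1000.0 / target_fps

for fps in (30, 60, 120):
    print(f"{fps} fps -> {frame_budget_ms(fps):.2f} ms per frame")
# Output: 33.33 ms, 16.67 ms, and 8.33 ms respectively.
```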

That’s our theory, at least, after spending some time exploring this issue. To start, let’s look at the results user octiceps posted on the Nvidia forums:

Data by Nvidia users.

There are some huge drops in this graph. Rising Storm 2 going from over 200fps to just 75? And then, conversely, there are some very small gaps. A 10 percent drop in Witcher 3 is still significant — nobody wants to drop frames — but if we’re being honest, it’s not a gap you’d notice unless you were running a benchmark or knew a specific area with a mammoth frame drop in it.

I decided to test the situation myself with a pair of GTX 1080 GPUs, the highest-end cards I have in a matched pair. The optics of the issue are pretty bad. While there aren’t many SLI owners, much less SLI owners with G-Sync monitors, the few that exist likely represent the biggest consumer spenders in the enthusiast ecosystem. If you bought into G-Sync a few years ago, you could easily have dropped $600 per GTX 1080 (or $700 per 1080 Ti) plus a $500 monitor. Given that the most common Steam Survey configuration is a GTX 1060 paired with a 1080p monitor, these are people dropping serious cash on Nvidia hardware.

Test Setup

There’s a substantial configuration difference, I suspect, between the G-Sync monitor we have available and the hardware these enthusiasts are using. The only G-Sync monitor I have is an Acer XB280HK, and while it’s 4K capable, it supports a maximum refresh rate of 60Hz. That means I can’t use G-Sync in tests that run at frame rates above 60fps, so if this is a timing issue that only occurs above that frame rate, I won’t see it by definition.

Spoiler alert: It isn’t, and I did.

What made this easier to check is that our Core i7-8086K testbed we used for the RTX 2080 and 2080 Ti review is still fully assembled. The entire system was a brand-new Windows 10 install with just one Nvidia driver ever installed (411.63). It’s a pristine configuration — even the games were downloaded from scratch rather than copying over an archived Steam library.

We tested two games that enthusiasts specifically cited as being impacted and one title that was not. Our target frame rates were as close to 60fps as possible without going over, and we ran each title at multiple resolutions to test how much the problem appeared or disappeared depending on the GPU’s average frame rate. We also tested a host of other potential variables, including whether G-Sync was set to fullscreen-only or also active in borderless mode, whether V-Sync was enabled or disabled, and whether it made a difference to use DX11 versus DX12 in the titles that supported both.

We observed no performance impact whatsoever from changing G-Sync to operate in borderless mode versus only fullscreen mode, and we saw no performance change from changing between V-Sync enabled and V-Sync disabled. Keep in mind we only tested a few games — we were looking for potential evidence of a problem, not trying to perform an exhaustive evaluation of all the conditions under which G-Sync performance can vary on 10-15 games.

Because a pair of 1080s is, in some cases, capable of boosting modern games to above 60fps even at 4K, we also checked games that allow for supersampling. Finally, we ran these tests on both a single GTX 1080 Ti as well as a pair of GTX 1080s. All of the issues we observed appear to be unique to GPUs running in SLI. The 1080 Ti had no problem in any test configuration and we saw no difference between G-Sync enabled and G-Sync disabled when testing one GPU.
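For readers who want to reproduce this, here's a minimal sketch of the configuration matrix described above, expressed in Python; the variable names and values are illustrative, not the harness we actually used:

```python
from itertools import product

# Illustrative sweep of the variables discussed above (not an actual test harness).
# Resolution and supersampling were chosen per game to keep the average frame rate under 60fps.
gpu_configs = ["2x GTX 1080 (SLI)", "1x GTX 1080 Ti"]
gsync_states = ["G-Sync on", "G-Sync off"]
gsync_modes = ["fullscreen only", "fullscreen + borderless"]
vsync_states = ["V-Sync on", "V-Sync off"]
apis = ["DX11", "DX12"]  # DX12 only where the game supports it

for combo in product(gpu_configs, gsync_states, gsync_modes, vsync_states, apis):
    print(", ".join(combo))
```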

Deus Ex: Mankind Divided

Deus Ex: Mankind Divided shows a small-but-definite performance loss when G-Sync is enabled, as shown below. Interestingly, this gap appears only in DX11 mode. When we switched to DX12, it vanished.

[Charts: Deus Ex: Mankind Divided, G-Sync on vs. off, DX11 and DX12]

Far Cry 5

Because we have to keep the frame rate under 60fps at all times to see the true impact of turning G-Sync off or on, we tested FC5 in four different resolutions: 4K, 4K+1.1x supersampling, 4K+1.3x supersampling, and 4K+1.5x supersampling. The results and the performance gap between having G-Sync on versus G-Sync off are shown below:

[Chart: Far Cry 5 at 4K with 1.1x, 1.3x, and 1.5x supersampling, G-Sync on vs. off]

The closer the GPU runs to 60fps, the larger the gap; as the frame rate decreases, the gap between the two configurations shrinks. This seems to imply that our timing theory could be correct. At the very least, our findings confirm Nvidia forum user shaunwalsh’s observation that “running Gsync lowers FPS by 10-20% depending on the title. I tested this in FC5 and Ghost Recon Wildlands.” Our results show that FC5 is between 1.09x and 1.20x faster with G-Sync disabled, depending on the resolution target. FC5 does not support DX12, so we were unable to test that mode.
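For clarity on how we express those gaps, here's a minimal sketch of the arithmetic behind the "x-times faster" figures; the frame rates in the example are hypothetical, not our measured results:

```python
# Minimal sketch: turning a G-Sync off/on frame-rate pair into an "x-times faster" figure.
def gsync_off_speedup(fps_gsync_off: float, fps_gsync_on: float) -> float:
    """Ratio of average frame rate with G-Sync disabled to G-Sync enabled."""
    return fps_gsync_off / fps_gsync_on

# Hypothetical example pair at one resolution target:
print(f"{gsync_off_speedup(57.0, 47.5):.2f}x faster with G-Sync off")  # 1.20x
```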

Up to this point, we’ve seen two distinct findings.

1) Turning on DX12 may eliminate the penalty.
2) The penalty size increases as the frame rate rises.

Got that? Good. We’re throwing it out the window.

Hitman (2016)

Hitman wasn’t listed as an impacted title, but I took it for a spin anyway. I’m glad I did, because it provided a useful third example. Here are our Hitman results in both DX11 and DX12. As with Far Cry, we tested Hitman with 1.3x and 1.5x supersampling to push the frame rate fully into our monitor’s G-Sync range. This time we had to use supersampling at 4K from the start, which is why you don’t see a plain “4K” result.

[Chart: Hitman (2016), DX11 and DX12 with 1.3x and 1.5x supersampling, G-Sync on vs. off]

The good news is that, once again, you can eliminate the DX11 performance penalty associated with using G-Sync and SLI at the same time by switching to DX12. The bad news is that you’ve got to accept a roughly 30fps frame rate to do it. In DX11, the performance hit is a static 1.27x in both supersampling modes, which bucks the trend we saw with FC5 and Deus Ex.

We were unable to benchmark games like Sleeping Dogs because we couldn’t pull the GPU frame rate down sufficiently to bring it below our 60fps requirement. We observed a 2fps difference in this title between having G-Sync enabled and disabled (71 fps versus 73 fps) but that could fall within a reasonable margin of error.

Nonetheless, the pattern here is clear: turning on G-Sync while using SLI is not guaranteed to tank your frame rate. We also tested Metro Last Light Redux with a forced AFR2 rendering profile, and that game showed no performance drop at all between G-Sync enabled and disabled. Deus Ex: Mankind Divided showed a small penalty that vanished in DX12, while Hitman takes too much of an overall performance hit in DX12 with SLI to ever justify using that mode. Three games, three different performance models.

It’s always dicey to try and test forum reports, not because forum commenters are liars, but because most don’t provide sufficient technical information to be sure you’re reproducing a problem correctly. I don’t have an explanation for this issue at the moment and I realize that “timing problem” is extremely vague. It’s a theory that happens to fit some common-sense facts. Our results collectively suggest that the issue is real and that the performance gaps could be as large as Nvidia users say, particularly if they continue to worsen as frame rate increases.

Nvidia Needs to Get in Front of This


This issue may not affect very many people, but it very much affects the people who have poured the most money into Nvidia’s products and who represent its biggest fans. If you’ve got a pair of GTX 1080 Tis and a G-Sync monitor, you likely spent nearly $2,000 on your GPUs and display alone. Even a brace of GTX 1070s could have run $700 for the pair, plus another $422 for the cheapest Dell G-Sync display on Newegg. The minimum buy-in for this technology is over a grand, assuming folks bought at 2018 prices. That’s enough money to buy a new gaming PC, and your top-spending customers are dropping it on GPUs. Not taking care of your halo customers can be a very expensive mistake, and while there aren’t a ton of people talking about this topic on reddit, the conversation threads go back for months. A lot of folks have been both patient and resourceful in trying to collaborate, share information, and find workarounds.

If Nvidia can’t guarantee that SLI will work with G-Sync without mild-to-massive performance losses, and with no way to predict in advance what you’ll see, it needs to communicate that fact. It’s entirely possible that the complex intersection between game engines, GPUs, and monitors creates situations in which G-Sync can’t be effectively employed without a massive performance hit no matter what Nvidia does. The fact that we see different performance models from different games suggests the game engine plays a role here. There’s also the fact that G-Sync is intended to shine when frame rates are low; that’s where it makes its largest difference. The higher the frame rate, the smaller the impact of missing a V-Sync refresh cycle. If you’re playing at 30fps, missing a frame presentation means your current frame stays on screen for roughly 66.6ms. If you play at 120fps, a missed presentation results in a frame repeat of about 16.6ms. As the frame rate rises, the difference between using and not using G-Sync (or FreeSync) shrinks. At a certain frame rate, you wouldn’t be able to tell the difference, because not enough time has passed for you to detect it.
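Here's a minimal sketch of that arithmetic, assuming a conventional V-Sync model in which a missed presentation holds the current frame on screen for one extra refresh interval; the function name is ours, purely for illustration:

```python
# Minimal sketch: how long a frame stays on screen when a presentation deadline is missed,
# assuming the display refreshes at the target frame rate (standard V-Sync behavior).
def frame_hold_ms(target_fps: float, missed_refreshes: int = 1) -> float:
    refresh_interval = 1000.0 / target_fps
    return refresh_interval * (1 + missed_refreshes)

print(f"30 fps, one missed refresh:  {frame_hold_ms(30):.2f} ms")   # ~66.7 ms
print(f"120 fps, one missed refresh: {frame_hold_ms(120):.2f} ms")  # ~16.7 ms
```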

But if it’s true that SLI and G-Sync are more of a peanut butter-mayonnaise sandwich than a PB-chocolate combo, that needs to be communicated. That $422 monitor is only a 1440p display, and you can buy a 4K IPS panel without G-Sync for the same amount of money. Or a 21:9 ultra-wide monitor. Or a monitor with FreeSync support, which doesn’t carry a price premium. You get the point.

I don’t blame Nvidia for being unable to fix problems that aren’t of its making, or those mandated by the pesky laws of physics. But the company owes its customers an explanation for this behavior and an honest update on its ability to solve it, including the fact that it may never be solved, if that’s the case. The current situation reflects poorly on Nvidia.

For the record, it’s entirely possible AMD has this issue as well. If it does, the same applies. These are complex capabilities that end users, in some cases, pay heavy premiums to access. All customers deserve honesty, period, without exception, but customers poised to spend thousands of dollars on your products deserve to know whether they’re buying marketing claptrap or actual performance.
