Intel Confirms It Intends to Support VESA Adaptive Sync in Future GPUs

Adaptive Sync, the VESA standard AMD brands as FreeSync (Nvidia addresses the same problem with its proprietary G-Sync), is a technique for improving perceived GPU performance by ensuring that each frame is presented to the monitor as soon as it's ready. Without it, games will either tear (if V-Sync is disabled), drawing a noticeable line across the screen, or repeat frames of animation while waiting for a frame that isn't finished yet. Showing the same frame multiple times in a row and then jumping ahead to the next completed frame creates a visible stutter effect.

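To make the trade-off concrete, here's a minimal Python sketch (our own illustration with invented frame times, not code from any GPU vendor) that simulates when frames actually reach the screen. The model is deliberately simplified and ignores buffering and GPU back-pressure, but it shows why a fixed 60 Hz refresh turns a slow frame into a visible hitch, while an adaptive-sync display simply presents each frame as soon as it's finished.

```python
import math

REFRESH_MS = 1000 / 60  # a fixed 60 Hz display scans out every ~16.7 ms

# Hypothetical render times (ms) for six consecutive frames; the values are
# invented for illustration. Frames 2 and 4 miss the ~16.7 ms budget.
render_ms = [14.0, 15.0, 21.0, 16.0, 25.0, 15.0]

def fixed_refresh_presents(render_times):
    """V-Sync on a fixed-refresh display: a finished frame waits for the next
    refresh boundary, and until then the previous frame is scanned out again.
    That repeat-then-jump is the stutter described above. (Simplified: a real
    pipeline would also stall the GPU while its buffers are full.)"""
    now, presents = 0.0, []
    for r in render_times:
        now += r  # the moment this frame finishes rendering
        presents.append(math.ceil(now / REFRESH_MS) * REFRESH_MS)
    return presents

def adaptive_sync_presents(render_times):
    """Adaptive Sync: the display refreshes the moment a frame is ready
    (within the panel's supported range), so no frame waits or repeats."""
    now, presents = 0.0, []
    for r in render_times:
        now += r
        presents.append(now)
    return presents

for i, (fixed, adaptive) in enumerate(
        zip(fixed_refresh_presents(render_ms), adaptive_sync_presents(render_ms))):
    print(f"frame {i}: fixed {fixed:6.1f} ms | adaptive {adaptive:6.1f} ms "
          f"| extra wait {fixed - adaptive:4.1f} ms")
```

Run it and frame 4 finishes at the 91 ms mark but doesn't appear until the 100 ms refresh tick on the fixed display; for those 9 ms, the viewer is looking at a repeat of frame 3.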

This jump between frames reads visually as stuttering. AMD and Nvidia have both built solutions that avoid the problem, but the state of their ecosystems is quite different. G-Sync typically carries a significant price premium ($100 – $150 is the figure we've heard in the past), while AMD's FreeSync support is free. That gap has a practical impact on the number of displays available: Newegg currently lists 180 FreeSync panels for sale, compared with just 38 G-Sync panels. The cheapest FreeSync panel is $110, while the cheapest G-Sync panel is $329 (though, in fairness, it's also a much nicer monitor).

Three years ago, Intel promised it would support VESA Adaptive Sync (the open standard on which AMD's FreeSync is built) in future products, and promptly didn't. But earlier today, a prospective fan of Intel's discrete graphics efforts asked Intel's Chris Hook, Director of Marketing, Discrete Graphics and Visual Technologies, whether Intel still plans to keep that promise and support Adaptive Sync. Hook responded that yes, it will.

Image by reddit user dylan522p

This is an important development for several reasons. Let's be clear about something up front: when Nvidia first developed G-Sync, it had to use custom silicon to introduce the capability, because desktop monitors didn't use display controllers that accepted variable refresh rates at all. By building its own silicon, Nvidia was able to bring the capability to market before AMD and VESA.

But times have changed. Today, there's no reason for Nvidia to keep pushing its own dedicated solution except that it dominates the gaming market, and can therefore pressure gamers toward branded monitors that make the company more money. In fact, there's no technical reason Nvidia can't support Adaptive Sync on its existing hardware, right now; it simply doesn't want to. And since no robust market competition is forcing the issue, we have an absurd situation in which you can choose from nearly 200 displays at every price point and feature level known to man, none of which will be compatible with the GPUs most enthusiasts are buying.

That's going to be the status quo for a while longer, but a determined push from Intel to support Adaptive Sync could make the long-term difference in pushing Nvidia to offer something besides G-Sync. Obviously, much depends on the final quality of Intel's graphics solutions; if the cards aren't good enough to win market share, they won't do anything to help the adoption of Adaptive Sync. But assuming they are good enough to win Intel some space in the graphics business, having two companies backing an open standard that anyone can use is intrinsically better than each company maintaining its own proprietary standard. And while Intel's original comments in 2015 applied only to its future integrated GPUs, it seems unlikely that Intel would add Adaptive Sync support to its integrated parts but refuse to support the same standard when it launches its desktop (and mobile?) discrete GPUs.

It's ridiculous that, at the moment, you have to choose your monitor based not just on which GPU you prefer today, but on which GPU you think you'll prefer in three to five years. It's great for Nvidia. It's not great for anybody else. And it's an entirely artificial barrier: Nvidia has never pointed to a single benefit of G-Sync that depends on its particular silicon solution. That silicon absolutely served a purpose at the time of introduction, but that time is over, and proprietary G-Sync should go with it. Hopefully Intel, when it launches its GPUs, will help drive that shift, and we can all look forward to buying a GPU knowing it will work with whatever monitor we plunk down in front of it.
