Intel Confirms It Intends to Support VESA Adaptive Sync in Future GPUs

Adaptive Sync, branded by AMD as FreeSync and matched by Nvidia's proprietary G-Sync, is a technique for improving perceived GPU performance by ensuring that each frame of data is presented to the monitor as soon as it's ready. Without this approach, games either tear (if V-Sync is disabled), creating a noticeable line across the screen, or repeat frames of animation (if V-Sync is enabled) while waiting for a frame that isn't finished yet. Displaying the same frame of animation multiple times in a row creates a visible stutter effect, because the game then skips the frame it didn't have a chance to display and moves ahead to the next one.
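To make the difference concrete, here's a rough Python sketch (not from the article; the 60 Hz refresh rate and a hypothetical ~48 fps frame time are illustrative assumptions) comparing when frames reach the screen with V-Sync on a fixed-refresh monitor versus an adaptive one:

```python
import math

REFRESH_MS = 1000 / 60   # fixed 60 Hz panel: a refresh tick every ~16.7 ms
FRAME_MS = 21.0          # hypothetical GPU frame time (~48 fps), for illustration only

def vsync_presentation(n):
    """V-Sync on a fixed-refresh panel: each frame waits for the next refresh
    tick, so a late frame means the previous one is shown twice (stutter)."""
    times = []
    for i in range(1, n + 1):
        ready = i * FRAME_MS
        next_tick = math.ceil(ready / REFRESH_MS) * REFRESH_MS
        times.append(next_tick)
    return times

def adaptive_presentation(n):
    """Adaptive Sync: the panel refreshes the moment a frame is ready."""
    return [i * FRAME_MS for i in range(1, n + 1)]

def gaps(times):
    """Time between consecutive on-screen frames, in milliseconds."""
    return [round(b - a, 1) for a, b in zip([0.0] + times[:-1], times)]

print("v-sync gaps  :", gaps(vsync_presentation(6)))    # uneven: 16.7 / 33.3 ms
print("adaptive gaps:", gaps(adaptive_presentation(6))) # steady 21.0 ms
```

The uneven 16.7 ms / 33.3 ms cadence in the fixed-refresh case is exactly the repeated-frame judder described above; the adaptive case delivers a new frame at a steady pace even though the GPU never hits 60 fps.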


This jump between frames reads visually as stuttering. Companies like AMD and Nvidia have built solutions that avoid this problem, but the state of their ecosystems is rather different. Specifically, G-Sync typically carries a significant price premium ($100 to $150 is the figure we've heard in the past); AMD's FreeSync support, in contrast, is free. This has a practical impact on the number of displays available, with 180 new FreeSync panels listed for sale at Newegg compared with just 38 G-Sync panels. The cheapest FreeSync panel is $110; the cheapest G-Sync panel is $329 (though in fairness, it's also a much nicer monitor).

Three years ago, Intel promised it would support AMD's FreeSync (also known as VESA Adaptive Sync) in future products, and promptly didn't. But earlier today, a potential fan of Intel's discrete graphics efforts asked Intel's Chris Hook, Director of Marketing, Discrete Graphics and Visual Technologies, whether Intel still planned to support Adaptive Sync. Hook responded that yes, it will.

Image by reddit user dylan522p

This is an important development for several reasons, but let's be clear about something up front. When Nvidia first developed G-Sync, it had to use custom silicon to introduce the capability, because desktop monitors at the time didn't ship with controllers that could accept variable refresh rates at all. By building its own silicon, Nvidia was able to bring this capability to market before AMD and VESA.

But those times have changed. Today, there is no reason for Nvidia to keep pushing its own dedicated solution except that it dominates the gaming market and can therefore steer gamers toward branded monitors that generate more money for the company. In fact, there's no reason Nvidia can't support Adaptive Sync on its existing hardware right now, except that it doesn't really want to. And since robust market competition isn't forcing it to, we have the absurd situation in which you have your choice of nearly 200 displays at every price point and feature level known to man, and they won't be compatible with the GPUs that most enthusiasts are buying.

That's going to be the status quo for a little while longer, but a determined push from Intel to support Adaptive Sync could make the long-term difference in pushing Nvidia to offer support for something besides G-Sync. Obviously, much depends on the final quality of Intel's graphics solutions; if the cards aren't good enough to win market share, they won't do anything to help the adoption of Adaptive Sync. But assuming they are good enough to win Intel some space in the graphics business, having two companies backing an open standard that any company can use is intrinsically better than each company maintaining its own proprietary standard. And while Intel's original comments in 2015 only applied to its future integrated GPUs, it seems unlikely that Intel would add Adaptive Sync support to its integrated parts but refuse to support the same standard when it launches its desktop (and mobile?) discrete GPUs.

It's ridiculous that, at the moment, you have to choose your monitor based not just on which GPU you prefer today, but on which GPU you think you'll prefer in 3-5 years. It's great for Nvidia. It's not great for anybody else. And it's an entirely artificial barrier: Nvidia has never pointed to a single benefit of G-Sync that depends on its particular silicon solution. The custom module absolutely served a purpose at the time of its introduction, but that time is over, and proprietary G-Sync should go with it. Hopefully Intel, when it launches its GPUs, will help drive that shift, and we can all look forward to just buying a GPU and knowing it'll work fully with whatever monitor we plunk down in front of it.
