Back in 2013, Nvidia introduced a new technology called G-Sync to eliminate screen tearing and stuttering effects and reduce input lag when playing PC games. The company accomplished this by tying your display’s refresh rate to the actual frame rate of the game you were playing, and similar variable refresh-rate (VRR) technology has become a mainstay even in budget monitors and TVs today.
The issue for Nvidia is that G-Sync isn’t what has been driving most of that adoption. G-Sync has always required extra dedicated hardware inside of displays, increasing the costs for both users and monitor manufacturers. The VRR technology in most low-end to mid-range screens these days is usually some version of the royalty-free AMD FreeSync or the similar VESA Adaptive-Sync standard, both of which provide G-Sync’s most important features without requiring extra hardware. Nvidia more or less acknowledged that the free-to-use, cheap-to-implement VRR technologies had won in 2019 when it announced its “G-Sync Compatible” certification tier for FreeSync monitors. The list of G-Sync Compatible screens now vastly outnumbers the list of G-Sync and G-Sync Ultimate screens.
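At a basic level, all of these VRR schemes work the same way: instead of refreshing on a fixed clock, the display refreshes when the GPU presents a new frame, as long as the frame time falls inside the panel's supported range. Here's a minimal sketch of that idea, assuming a hypothetical 48–144 Hz panel (the numbers are illustrative, not from any specific monitor or driver):

```python
# Minimal sketch of the variable refresh rate (VRR) idea: the refresh interval
# follows the frame time, clamped to what the panel supports. Hypothetical
# panel limits for illustration only.

PANEL_MIN_HZ = 48    # assumed VRR floor
PANEL_MAX_HZ = 144   # assumed VRR ceiling

def refresh_interval_ms(frame_time_ms: float) -> float:
    """Clamp the requested refresh interval to the panel's supported window."""
    min_interval = 1000.0 / PANEL_MAX_HZ   # ~6.9 ms at 144 Hz
    max_interval = 1000.0 / PANEL_MIN_HZ   # ~20.8 ms at 48 Hz
    return min(max(frame_time_ms, min_interval), max_interval)

# A frame delivered every 13 ms (~77 fps) is scanned out as soon as it's ready,
# with no tearing and no waiting for the next fixed vsync tick.
print(refresh_interval_ms(13.0))  # 13.0
```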
This is silly.
G-Sync solved a problem that couldn't be solved before Nvidia built it. They stayed committed to that solution until an alternative reached a reasonable level of performance, then supported both until they could get close without the expensive extra hardware.
Was it worth it? For most people, no. But it's still technically superior today, and there are loads of options without the extra cost.
Take notes, Apple Metal!
They could literally just transition to Vulkan with a Metal wrapper for pre-existing software at any time, but no, they have to keep their ecosystem locked down for some reason.
VESA Adaptive-Sync goes back to the eDP standard from 2009. AMD simply took that and said, "Hey, why aren't we doing that over external DisplayPort?" And they did.
So instead of over-engineering a solution nobody asked for to create vendor lock-in that nobody (but fanboys with Stockholm Syndrome) wants, they exposed functionality that many, many panels already had anyway, because manufacturers don't use completely different control circuitry for laptop (eDP) and stand-alone monitors.
And, no, Nvidia's tech is not superior. From what I gather they have stricter certification requirements, but that's it.
G-Sync modules have a lower sync floor before LFC kicks in (usually around 30 Hz), and faster pixel response (overdrive) anywhere in the sync window. Those are benefits for both high-framerate and low-framerate content.
Even today FreeSync usually bottoms out around 48 Hz. That constantly puts you at the LFC boundary in a lot of AAA games if you're on a popular midrange graphics card and aiming for a 60 fps average.
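For context on why that floor matters: LFC (low framerate compensation) works by showing each frame more than once, so the panel keeps refreshing inside its supported window even when the game drops below it. Here's a rough sketch of the multiplier logic, assuming hypothetical panel limits (an illustration of the idea, not any vendor's actual driver code):

```python
# Sketch of low framerate compensation (LFC): below the panel's VRR floor,
# each frame is repeated enough times that the effective refresh rate stays
# inside the supported window. Numbers are illustrative only.

def lfc_multiplier(fps: float, panel_min_hz: float, panel_max_hz: float) -> int:
    """Smallest whole number of repeats that lifts the refresh rate above the floor."""
    n = 1
    while fps * n < panel_min_hz and fps * (n + 1) <= panel_max_hz:
        n += 1
    return n

# A dip to 40 fps on a 48-144 Hz panel needs LFC (each frame shown twice,
# panel refreshing at 80 Hz); a panel or module with a ~30 Hz floor can
# still track 40 fps natively, without switching in and out of LFC.
print(lfc_multiplier(40, 48, 144))  # 2
print(lfc_multiplier(40, 30, 144))  # 1
```

The handoff in and out of LFC around that boundary is exactly where a 48 Hz floor hurts compared with a lower one, since frame rate dips in the 40s keep crossing it.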
Just to address this from a high level, I see this as typical of the Nvidia and AMD approaches. Nvidia makes something that's engineered to perfection, but adds a bunch of requirements that make it expensive and support vendor lock-in. Even if you're willing to put up with that to have The Best, you might hesitate when you find out what assholes Nvidia are about everything.
AMD then makes something 95% as good, and it’s cheap and you can work with them without yelling.
See also: FSR vs DLSS.
Is it a problem that LFC is used? It only duplicates frames.
https://www.amd.com/en/products/graphics/technologies/freesync.html
That constantly puts you at the point where you should lower graphics settings. Average fps might be a thing to put on benchmarks, but for actual playing you want to go by minimum fps (non-cutscene, if necessary). And it's not like Adaptive-Sync can't go down that low protocol-wise; it's that monitor producers don't care to.
Overdrive, too, is a matter of implementation, not of the sync protocol.
The problem was solved by Nvidia; then AMD made it cheap and accessible, without requiring a dedicated hardware module.
For years and years, Nvidia artificially inflated the price of many G-Sync screens by up to 150 euros, for no legitimate reason. Initially there was NO compatibility with FreeSync at all.
Nvidia wasn't kindly solving a gamer's problem, at least not after the first year of that tech's release. They were forcibly selling expensive hardware modules nobody needed or wanted, long after FreeSync had shown you could do it just as well without that expensive requirement.
The hardware module they insisted on selling wasn't solving a technical problem, but a money one.
I don't think anyone was ever able to tell the different "sync techs" apart in practice.
There absolutely was a legitimate reason. The display hardware of the time was not capable of processing the signals. They didn't use FPGAs on a whim; they used them because they were necessary to handle the signals properly.
And you just haven't followed the tech if you think they were indistinguishable. G-Sync has supported a much wider range of frame times over its entire lifespan.