Understand everything about variable refresh rate technologies

G-Sync, FreeSync, Adaptive Sync, V-Sync, VRR: you’ve probably heard of these screen refresh technologies. Developed for gamers and imaging professionals, they are of increasing interest to the general public. But why? What do they really bring? We will explain everything.

Why develop such technologies?

To this question, the answer is often the same: needs. Over time, users acquire increasingly powerful hardware, so the quality of screens must keep up, and not only in terms of graphics quality. Smoothness and refresh rate also rank among users’ top needs, especially for gamers. Sometimes faced with problems of input lag, stuttering or even tearing, they turn to increasingly sophisticated monitors and TVs in order to make full use of their skills (and to avoid fits of rage against their hardware).

To meet their expectations, the industry has therefore developed several technologies, often for the better, sometimes for the worse. And while many screens today are able to live up to the demands of this audience, it is sometimes difficult to find one’s way among designations like V-Sync, G-Sync, FreeSync or Adaptive Sync.

V-Sync: the foot in the stirrup

As a first step, the industry chose to correct the problem of tearing (or screen tearing). Behind this term lies the image-tearing phenomenon that occurs when the video stream is out of sync with the screen’s refresh rate.

In summary, tearing appears when the graphics card and screen are not synchronized. A 90 Hz screen, for example, refreshes 90 times per second with the regularity of a Swiss clock. The GPU, on the other hand, produces frames at a more variable rate, as it adapts to camera movements, visual effects and so on. The power of your computer therefore plays a direct role in the fluidity of the display.
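To make the mismatch concrete, here is a toy Python model of unsynchronized double buffering: the GPU swaps buffers whenever a frame finishes, and any swap that lands mid-scanout produces a visible tear line. The frame times below are invented purely for illustration.

```python
# Toy model of screen tearing: without synchronization, the GPU swaps
# buffers whenever a frame finishes, even while the screen is mid-scanout.
# Frame times are hypothetical.

REFRESH_HZ = 90
SCANOUT = 1 / REFRESH_HZ            # time to draw one full frame, top to bottom

# Variable GPU frame times (seconds): the GPU adapts to scene complexity
frame_times = [0.010, 0.014, 0.009, 0.020, 0.011]

t = 0.0
tear_lines = []
for ft in frame_times:
    t += ft                          # moment the GPU swaps buffers
    phase = (t % SCANOUT) / SCANOUT  # how far down the screen the scanout is
    if 0.0 < phase < 1.0:            # swap mid-scanout => visible tear line
        tear_lines.append(round(phase, 2))

print(tear_lines)  # fraction of screen height where each tear appears
```

With a fixed 90 Hz scanout and variable frame times, almost every swap lands somewhere inside a refresh, which is why tearing is so visible without synchronization.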


To get around this phenomenon, graphics card manufacturers developed V-Sync technology. Thanks to it, the GPU synchronizes with the screen’s refresh rate and thus avoids tearing.

While V-Sync seemed to establish itself as a silver bullet when it was launched, the technology quickly showed its limits. Tearing may have become ancient history, but V-Sync fixed neither the input lag caused by GPU computation delays nor the stuttering (micro-stutters in which the same image is displayed several times, reducing the impression of fluidity).
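The stutter problem can be illustrated with a quick calculation. Under V-Sync with classic double buffering, a frame that misses a refresh deadline has to wait for the next one, so the displayed rate snaps to integer divisors of the panel rate. A minimal sketch, with illustrative numbers:

```python
# Why V-Sync trades tearing for stutter: with double buffering, a frame
# that misses a refresh must wait for the NEXT refresh, so the displayed
# rate snaps to integer divisors of the panel rate.
import math

REFRESH_HZ = 60
VBLANK = 1 / REFRESH_HZ  # ~16.7 ms between refreshes

def effective_fps(frame_time):
    """Displayed frame rate under V-Sync with double buffering."""
    refreshes_waited = math.ceil(frame_time / VBLANK)
    return REFRESH_HZ / refreshes_waited

print(effective_fps(0.015))  # GPU at ~66 fps -> shown at 60 fps
print(effective_fps(0.018))  # GPU at ~55 fps -> shown at 30 fps (stutter)
```

A GPU rendering only slightly slower than the panel thus sees its displayed rate halved, which is exactly the juddering effect described above.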

Nvidia and AMD then tried to correct the weaknesses of V-Sync, the former with Adaptive V-Sync and Fast Sync, the latter with Enhanced Sync. Dissatisfied with the results, the two main graphics card manufacturers launched a war to win the hearts of gamers.

G-Sync vs FreeSync: the war between Nvidia and AMD

In 2013, Nvidia and AMD made the decision to move beyond V-Sync. Each developed its own solution, the former with G-Sync, the latter with FreeSync. This was the birth of the variable refresh rate: now the screen adapts in real time to what the graphics card can produce. A historic turning point, since the two giants succeeded in imposing their technologies on monitor and TV manufacturers. Well, almost… Because while G-Sync and FreeSync technically seem to have everything a gamer could want, they still had one flaw: their cost.
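In other words, with a variable refresh rate the panel refreshes when a frame is ready, within the limits it supports. A minimal sketch, assuming a hypothetical panel with a 48–144 Hz VRR range:

```python
# Variable refresh rate in a nutshell: instead of forcing the GPU onto the
# panel's fixed clock, the panel refreshes when a frame is ready, as long
# as the interval stays inside its supported range. Hypothetical panel.

VRR_MIN_HZ, VRR_MAX_HZ = 48, 144

def refresh_for_frame(frame_time):
    """Refresh rate the panel uses for one frame, clamped to its range."""
    wanted = 1 / frame_time
    return max(VRR_MIN_HZ, min(VRR_MAX_HZ, wanted))

for ft in (0.006, 0.012, 0.025):  # fast, medium and slow frames
    print(round(refresh_for_frame(ft), 1))
```

Inside the range, every frame is shown the instant it is ready (no tearing, no waiting); only outside the range does the panel have to clamp, which is where LFC comes in later.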

G-Sync: Nvidia plays the premium card

With its dominant market share in the graphics card segment (around 80%), Nvidia opted for a proprietary technology. Problem: G-Sync relies on a hardware solution that monitor and TV manufacturers must integrate into their devices. The high cost of integrating G-Sync was, logically, passed on to the final price of monitors and TVs. For the consumer, the price difference between a FreeSync (AMD) monitor and a G-Sync one was then around 30%!


To justify this price, Nvidia played the premium-quality card. Not only does G-Sync certification require the integration of a proprietary module, but compatible devices must also pass a battery of more than 300 tests to earn the Grail. In addition, the G-Sync module performs more tasks than FreeSync, reducing motion blur (or ghosting) and better handling FPS drops, a big weakness of AMD’s technology. At least in the early days (we will come back to this).

Despite the arguments put forward by Nvidia, monitor and TV makers didn’t really take the bait. Indeed, it would be an understatement to say that the FreeSync monitor catalog was much larger than the catalog of G-Sync monitors. And this is explained by AMD’s technological choice.

FreeSync: AMD, the other school

Unlike Nvidia, AMD opted for a solution that does not require additional hardware. FreeSync is based on a VESA standard from 2009 aimed at lowering energy consumption by reducing the display frequency when appropriate. Although it uses this Adaptive-Sync technology (incorporated into the DisplayPort 1.2a standard in 2014), AMD chose to keep the FreeSync trade name for monitors and TVs compatible with Adaptive Sync. Still following? All this to say that FreeSync only requires software support to work on any compatible display, as long as you are using a Radeon GPU.

Credit: MSI

At the time, this technical choice allowed AMD to offer display technology that met the needs of gamers while limiting manufacturing costs (although these were still higher than for a conventional screen). But over time, specialists and users recognized that G-Sync performed better than FreeSync, especially, as we said above, in handling drops in FPS. While AMD subsequently announced support for LFC (Low Framerate Compensation) to correct this weakness, G-Sync remained a cut above. On the one hand, Nvidia had the better technology; on the other, AMD offered a more affordable software solution that would democratize this kind of monitor and TV.
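For the curious, the principle behind LFC is simple to sketch: when the game’s frame rate falls below the panel’s minimum VRR rate, each frame is shown several times so that the panel keeps refreshing inside its range. The 48–144 Hz range below is a made-up example.

```python
# Low Framerate Compensation (LFC) sketch: below the panel's minimum VRR
# rate, each game frame is repeated so the effective refresh rate climbs
# back into the supported range. Hypothetical 48-144 Hz panel.
import math

VRR_MIN_HZ, VRR_MAX_HZ = 48, 144

def lfc(fps):
    """Return (multiplier, panel refresh rate) for a given game frame rate."""
    if fps >= VRR_MIN_HZ:
        return 1, fps                   # already inside the VRR range
    mult = math.ceil(VRR_MIN_HZ / fps)  # show each frame this many times
    return mult, fps * mult

print(lfc(25))  # 25 fps -> each frame shown twice, panel runs at 50 Hz
```

This is why a wide VRR range matters: the wider the range, the easier it is to find a multiple of the game’s frame rate that the panel can actually display.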

G-Sync Compatible: the best of both worlds

After improving their technologies on their own (notably with the adoption of HDR via FreeSync 2 and G-Sync HDR), the two giants found a compromise, under the leadership of Nvidia. In 2019, the company announced that G-Sync would support the VESA Adaptive-Sync standard, which meant the end of the obligation to integrate the G-Sync module into monitors and TVs. A blessing for manufacturers, since FreeSync displays can now work with Nvidia graphics cards and therefore with G-Sync.


To identify compatible devices, Nvidia therefore launched the “G-Sync Compatible” certification. The number of G-Sync Compatible screens has risen from 12 to 84 since January 2019. However, the test batteries on FreeSync devices remain demanding. Nvidia’s specifications are very strict (flickering, ghosting, pulsing, etc.) and the manufacturer imposes a VRR (Variable Refresh Rate) range of at least 2.4:1. This gives manufacturers a hard time, but does not seem to discourage them, as evidenced by the growing number of screens available on the market.
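That 2.4:1 requirement is simply the ratio between the panel’s maximum and minimum VRR rates. A trivial check, with made-up panel specs:

```python
# G-Sync Compatible requires a VRR range of at least 2.4:1,
# i.e. max refresh rate / min refresh rate >= 2.4. Panels are invented.

def meets_gsync_compatible_range(vrr_min_hz, vrr_max_hz):
    return vrr_max_hz / vrr_min_hz >= 2.4

print(meets_gsync_compatible_range(48, 144))  # 3.0:1  -> True
print(meets_gsync_compatible_range(48, 100))  # ~2.1:1 -> False
```

A wide ratio matters in practice because it leaves room for LFC to multiply low frame rates back into the panel’s range.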

G-Sync vs FreeSync: the nomenclatures

AMD and Nvidia each offer their G-Sync and FreeSync technologies in three “formulas”. At Nvidia, we find:

  • G-Sync: the hardware solution. Manufacturers must comply with specifications requiring more than 300 tests, and models are factory calibrated.
  • G-Sync Ultimate: Nvidia’s most upscale offering. These displays take the strengths of G-Sync and add HDR1000 compatibility, a high refresh rate, excellent resolution, a wide color gamut such as DCI-P3, and other equally demanding criteria.
  • G-Sync Compatible: the Nvidia solution that does not require integration of the G-Sync module. To meet its criteria, Nvidia requires a battery of tests and a VRR range of at least 2.4:1.

At AMD, FreeSync is also available in three versions:

  • FreeSync: guarantees latency of less than 100 ms and eliminates tearing.
  • FreeSync Premium: incorporates the requirements of FreeSync and adds LFC compatibility on Full HD displays with a refresh rate of at least 120 Hz.
  • FreeSync Premium Pro: AMD’s most premium solution, which replaces FreeSync 2 HDR. In addition to the requirements of FreeSync Premium (a Full HD 120 Hz panel and LFC compatibility), it requires HDR400 compatibility as well as low latency in both HDR and SDR.

There you have it: if you’ve made it this far, you should now see more clearly through this ocean of new technologies and standards. Perhaps you’ve already fallen for a FreeSync and/or G-Sync compatible monitor or TV. If so, don’t hesitate to share your impressions with us.
