“Adaptive-Sync” is a widely used display technology, found especially on gaming monitors. Manufacturers and the gaming community often praise it for improving the overall gaming experience.

But grasping the actual concept underneath the jargon-heavy layer of variable refresh rates, frame caps and drops, input latency, stuttering, screen tearing, and so on is not so easy. So whether it should be enabled or disabled remains perplexing for many gamers.

This article addresses what exactly adaptive sync is, how it helps your gameplay, and whether you should turn it on or off.

[Image: adaptive sync]

What is Adaptive Sync?

Adaptive sync is a standard protocol that lets a display synchronize its refresh rate with the incoming frame rate. It provides a royalty-free framework for displays to transition across a range of refresh rates according to the content being shown. The technology promises to overcome visual artifacts and has helped commercialize (and standardize) variable refresh rate (VRR) monitors.

Adaptive sync was originally developed and is regulated by VESA (the Video Electronics Standards Association), which provides it royalty-free to member companies. Building on it, companies like AMD and NVIDIA have extended the adaptive syncing tech to work more optimally with their own GPUs.

How Does Adaptive Sync Smooth Out Visuals?

Before you decide whether to use adaptive sync, it is worth understanding how it keeps the display and the GPU in step. So, let us begin with how a display works.

[Image: screen tearing in Horizon Zero Dawn]

Everything shown on your screen is a sequence of rapidly refreshing frames (pictures). The rate at which a monitor can refresh these frames is known as the monitor’s refresh rate. In other words, a monitor with a 30 Hz refresh rate can change the frame/picture on its display 30 times in a second.
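
As a quick back-of-the-envelope check, the refresh rate translates directly into a per-frame time budget. A small Python snippet:

```python
# A panel's refresh rate fixes how long each frame stays on screen:
# a 60 Hz panel redraws every 1/60 s, i.e. roughly every 16.7 ms.
for hz in (30, 60, 144):
    frame_time_ms = 1000 / hz
    print(f"{hz} Hz -> one frame every {frame_time_ms:.1f} ms")
```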

But the content a monitor displays has its own frame rate. Videos might have a 60 fps playback speed, meaning 60 frames are to be shown each second. And for games, a GPU renders frames in real time, so the frame rate can vary widely depending on the game, the scene within it, and the capacity of the GPU itself.

A mismatch between the monitor’s refresh rate and the content’s fps creates visual artifacts like screen tearing and stuttering. If the fps is higher than the refresh rate, the monitor ends up displaying parts of two different frames at once, separated by a hard horizontal seam. This is called screen tearing.
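
To see why the mismatch produces a tear, here is a minimal, illustrative Python sketch (a simplification, not how real scanout hardware works) that flags a refresh as torn whenever the GPU finishes a new frame partway through it:

```python
REFRESH_HZ = 60                 # the panel redraws 60 times per second
FPS = 90                        # the GPU finishes a frame every 1/90 s

refresh_interval = 1 / REFRESH_HZ
frame_interval = 1 / FPS

# Times at which the GPU hands over a finished frame.
frame_times = [n * frame_interval for n in range(10)]

for r in range(5):
    start = r * refresh_interval
    end = start + refresh_interval
    # A frame landing mid-refresh splits the picture into two halves.
    torn = any(start < t < end for t in frame_times)
    print(f"refresh {r}: {start*1000:5.1f}-{end*1000:5.1f} ms ->",
          "TEAR" if torn else "clean")
```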

[Image: GPU limiting FPS]

To cope with this, V-Sync (Vertical Sync) was introduced; it limits the GPU so that its render rate matches the monitor’s refresh rate. But it costs gamers increased input latency, as the GPU won’t deliver frames any faster than the refresh rate, regardless of user input. NVIDIA later improved on V-Sync with Fast Sync to lower that input latency.
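
In real code, V-Sync is usually toggled through the graphics API’s swap interval. Here is a minimal sketch using the pyGLFW bindings (assuming `pip install glfw`); a swap interval of 1 makes each buffer swap wait for the next screen refresh, which is exactly where the extra input latency comes from:

```python
import glfw

if not glfw.init():
    raise RuntimeError("GLFW failed to initialize")

window = glfw.create_window(640, 480, "V-Sync demo", None, None)
glfw.make_context_current(window)

# 1 = wait for the next vertical blank before swapping (V-Sync on);
# 0 = swap immediately (V-Sync off, tearing possible).
glfw.swap_interval(1)

while not glfw.window_should_close(window):
    # ... render the frame here ...
    glfw.swap_buffers(window)   # blocks here while V-Sync is on
    glfw.poll_events()

glfw.terminate()
```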

Synchronizing techniques like V-Sync and Fast Sync work well for videos and other content with a constant frame rate. But as mentioned earlier, frame rates are far from consistent while gaming. Simple or partially pre-rendered scenes render swiftly, whereas a 3D scene with a high polygon count takes the GPU longer to render.

As the fps drops, the monitor holds on some frames, displaying a single frame multiple times to keep up its refresh rate, which is perceived as stuttering. Adaptive sync is the technology created to overcome exactly these problems: it lets companies build displays whose refresh rate changes on the fly.
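
The same kind of toy simulation shows the stutter case: with the GPU rendering slower than a fixed-rate panel refreshes (here a hypothetical 40 fps source on a 60 Hz panel), some frames inevitably get shown twice:

```python
REFRESH_HZ = 60
FPS = 40                        # the GPU is slower than the panel

refresh_interval = 1 / REFRESH_HZ
frame_interval = 1 / FPS

shown = []
for r in range(9):
    t = r * refresh_interval
    # The panel can only show the latest frame that is already finished.
    shown.append(int(t / frame_interval))

print(shown)   # [0, 0, 1, 2, 2, 3, 4, 4, 5] -- the repeats are the stutter
```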

[Image: FreeSync and G-Sync]

Now, a capable monitor can match its refresh rate to the rendering speed of the GPU and shift along with it as it varies. This avoids stuttering at lower fps and even saves power, since the monitor can drop to a minimal refresh rate when the on-screen content is static.
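
Conceptually, the panel’s behavior under adaptive sync boils down to tracking the GPU’s frame rate within the range of refresh rates the hardware supports. A tiny illustrative sketch (the 48–144 Hz range is a made-up example, and real panels handle out-of-range frame rates in more elaborate ways):

```python
def adaptive_refresh_hz(gpu_fps, panel_min_hz=48, panel_max_hz=144):
    """Refresh rate a VRR panel would pick for a given render rate.

    Illustrative only: within its supported range the panel simply
    mirrors the GPU's frame rate; outside it, it pins to the limit.
    """
    return max(panel_min_hz, min(gpu_fps, panel_max_hz))

for fps in (30, 55, 90, 160):
    print(f"GPU at {fps} fps -> panel refreshes at {adaptive_refresh_hz(fps)} Hz")
```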