Last Updated on January 9, 2020 by Husain Parvez
VSync is a feature that you can find in almost every single game out there. In fact, some titles can’t even work properly without it. One of the best-known examples is Skyrim – its physics engine is tied to the frame rate, so forcefully disabling VSync can make objects fly around.
Then again, most PC gamers will tell you that it’s something that you should always disable. What’s up with that? Well, if you want to learn more about it, feel free to keep on reading!
What is VSync?
To understand how VSync works, you must first learn about a few other things. Those are:
- FPS (frames per second)
- Monitor refresh rates
- Screen tearing
FPS stands for Frames Per Second and refers to the number of frames your PC can “push” to the display each second – which is what creates the illusion of motion.
Your monitor’s refresh rate, on the other hand, refers to how many times per second it redraws the image, measured in Hertz (Hz). If your computer is pushing more frames than the monitor can display, you get what’s called “screen tearing”.
Screen tearing is when your display shows parts of two different frames at once, split by a visible horizontal line – which is generally quite an unpleasant sight.
VSync, which stands for Vertical Sync, is a feature that fixes this issue by capping the frame rate at your monitor’s refresh rate. For example, if the PC can push 120 FPS but the monitor runs at 75Hz, VSync will restrict your FPS to 75 – even if the PC is capable of performing much better than that.
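If you like seeing the relationship spelled out, here’s a toy sketch (not actual driver code – just the arithmetic) of how VSync picks the effective frame rate:

```python
def vsync_capped_fps(gpu_fps: float, refresh_hz: float) -> float:
    # With VSync on, the GPU may present at most one frame per
    # refresh cycle, so whichever number is lower wins.
    return min(gpu_fps, refresh_hz)

# A 120 FPS-capable PC on a 75Hz monitor ends up at 75 FPS:
print(vsync_capped_fps(120, 75))  # → 75
```

The numbers here match the example above; swap in your own GPU output and monitor refresh rate to see where your cap lands.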
So, TL;DR: VSync is a feature that can be found under a game’s graphical settings or under a GPU’s control panel (Nvidia Control Panel/AMD Radeon Software). Its purpose is to synchronize the game’s FPS with your monitor’s refresh rate in order to eliminate screen tearing – and that’s more or less all there is to it.
Don’t be in a hurry to leave it enabled, though. VSync comes with both advantages and disadvantages. Read through them before making your decision.
Let’s start with the pros. First of all, as we mentioned above, VSync eliminates screen tearing. This is quite possibly the biggest advantage of enabling VSync as many people truly can’t stand screen tearing, which, by the way, is more evident on some GPUs and monitors than others.
Other than that, thanks to its different way of delivering frames, VSync can also make your gameplay appear much smoother. And that’s especially true for those who are using 60 or 75Hz monitors.
Furthermore, since your graphics card doesn’t need to draw more frames than the monitor can display, you’re also saving a bit of power. Thanks to that, GPU temperatures should be lower – which helps extend the lifespan of most PC components.
Now, let’s move on to the cons. The very first thing you’ll notice after enabling VSync is input lag. Simply put, when this feature is enabled, every input from your controller, mouse, or keyboard takes longer to show up on your display. While this isn’t a big deal for RPGs and similar single-player titles, it can be a huge deal-breaker in competitive titles such as PUBG, Overwatch, or CS:GO.
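To get a rough feel for why that lag happens: with VSync on, finished frames sit in a queue until the next refresh. A simplified back-of-the-envelope estimate (real latency depends on the game, driver, and display, so treat this purely as an illustration) looks like this:

```python
def added_latency_ms(refresh_hz: float, queued_frames: int = 1) -> float:
    # Each frame waiting in the queue sits for roughly one full
    # refresh interval (in milliseconds) before reaching the screen.
    return queued_frames * 1000.0 / refresh_hz

# At 60Hz with two pre-rendered frames queued, inputs can trail
# the screen by around 33ms on top of everything else:
print(round(added_latency_ms(60, queued_frames=2), 1))  # → 33.3
```

That extra ~33ms is imperceptible in a slow-paced RPG but very noticeable in a twitchy shooter.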
The other disadvantage that this feature brings to the table is stuttering. When VSync is enabled, your computer needs to deliver at least as many frames as the monitor displays. If it misses even one frame, that frame has to wait for the next refresh cycle – and sustained misses can cut the frame rate in half, making the game stutter quite a lot.
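Here’s a toy model of that behavior for classic double-buffered VSync (a simplification – modern triple-buffered setups behave more gracefully): a late frame waits for the next refresh, so the delivered rate snaps down to the refresh rate divided by a whole number.

```python
def double_buffered_fps(gpu_fps: float, refresh_hz: float) -> float:
    # A frame that misses a refresh waits for the next one, so the
    # output rate snaps to refresh/1, refresh/2, refresh/3, ... —
    # the largest such value the GPU can actually sustain.
    divisor = 1
    while refresh_hz / divisor > gpu_fps:
        divisor += 1
    return refresh_hz / divisor

# Rendering just one FPS short of 60 halves the delivered rate:
print(double_buffered_fps(59, 60))  # → 30.0
```

That sudden 60-to-30 drop is exactly the stutter described above – there’s no “59 FPS” middle ground with this kind of VSync.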
Other versions of VSync, such as Adaptive VSync, can automatically enable or disable themselves depending on whether your PC is keeping up with the monitor or not. So, if your GPU offers such an option (typically found in the GPU’s control panel/software), we’d highly recommend using it instead of traditional VSync.
Depending on what GPU and monitor you own, there is also a chance that you’ll be able to use Freesync or GSync, both of which are greatly superior. However, they generally come with a higher price tag.
VSync can be used with pretty much any game and monitor, but Freesync and GSync only work under specific circumstances.
What are Freesync and GSync?
Freesync and GSync are more advanced alternatives to VSync. Freesync is AMD’s technology, while GSync is Nvidia’s. The main difference between the two is that GSync relies on a dedicated hardware module inside the monitor, while Freesync is generally software-based.
The concept is still the same. The main purpose of these two features is to sync the GPU’s FPS output to the refresh rate of the monitor. However, both GSync and Freesync can communicate with the monitor and adjust its refresh rate on the fly!
This means that the monitor decreases or increases its refresh rate depending on the FPS that you’re getting. That’s how you get the advantages of VSync while avoiding big stutters as well. Not to mention that the input lag should generally be much lower.
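A simple way to picture variable refresh rate: inside the monitor’s supported range, the refresh rate just follows the frame rate. The 48–144Hz window below is a made-up example – real monitors advertise their own VRR ranges.

```python
def vrr_refresh_hz(gpu_fps: float, vrr_min: float = 48, vrr_max: float = 144) -> float:
    # Inside the monitor's VRR window the panel refreshes exactly
    # when a frame arrives; outside it, the rate is clamped.
    return max(vrr_min, min(gpu_fps, vrr_max))

print(vrr_refresh_hz(52))   # → 52: the panel simply matches the frame rate
print(vrr_refresh_hz(200))  # → 144: capped at the panel's maximum
```

Because the panel waits for a frame instead of the frame waiting for the panel, a 59 FPS dip shows up as 59Hz – not as a hard drop to 30 FPS like traditional VSync.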
GSync will only work with Nvidia cards and supported monitors. Freesync, on the other hand, works with both AMD and Nvidia cards. The only catch is that not all Nvidia GPUs support Freesync, and those that do can only do so through DisplayPort at the moment (AMD cards have also been able to use HDMI for a while now).
Your monitor needs to be Freesync or GSync compatible as well. There used to be a time when only pricey displays would come with those, but nowadays even some of the cheapest offerings are at least Freesync enabled. GSync is definitely more expensive – but at the very least it’s now affordable.
That’s all for now. TL;DR:
- VSync: Limits the maximum frame rate of a graphics card to the refresh rate of a monitor, but it also introduces input lag. A single missed frame will also introduce stuttering. VSync works with all monitors and graphics cards
- Freesync: AMD’s solution which allows a supported monitor to sync its refresh rate with the frame rate of a graphics card. Works with AMD cards, supported monitors, and certain Nvidia cards over DisplayPort
- GSync: Nvidia’s proprietary solution which does the same thing. The main difference is that it relies on dedicated hardware on top of software, and it’s strictly compatible with Nvidia cards only
If you’ve got any more questions, feel free to let us know about them in the comments section down below and the WTLS team will get back to you!