Vsync has been a thorn in many a gamer's side for some time now, posing a genuine dilemma for gamers seeking the best in performance.

Unlike many other graphical options, it's not a case of simply turning a slider up to max and getting the best results; with vsync, screen-tearing and buffering can all cause problems for even the henchest of rigs. So what is vsync, and is it really as bad as it's sometimes made out to be?

Vsync Explained

So, what is Vsync? It stands for vertical synchronisation, and it's designed to get your graphics card and monitor working in time together. It ensures they march to the beat of the same drum, rather than letting your GPU create frames quicker than your monitor can display them.

Your monitor is always set to a specific refresh rate; no matter what your PC or GPU is doing at any one time, it will always output images at that rate. A monitor's refresh rate is measured in Hz (hertz), which tells you how many times the image is updated every second. So, a 60Hz monitor is capable of displaying a maximum of 60 frames per second.
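If it helps to see the numbers, here's a quick Python sketch of what a refresh rate means in terms of time per image (purely an illustration of the arithmetic, not anything your monitor actually runs):

```python
# Rough illustration: how long each refresh lasts at a given refresh rate.
def frame_budget_ms(refresh_hz: float) -> float:
    """Time between refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 100, 120, 144):
    print(f"{hz:>3} Hz monitor -> a new image every {frame_budget_ms(hz):.2f} ms, "
          f"so at most {hz} frames per second")
```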

It sounds great in theory, but of course it's not without its caveats. Because vsync is tied directly to your monitor's refresh rate, it puts a hard cap on your PC's output. Without vsync enabled, though, your graphics card could be pumping out more images than your monitor can actually display at any one time, potentially leading to overlapping images and causing the effect known as screen-tearing.

As an example, let's say you've got a 100 Hz monitor, and you're playing a game that your GPU is rendering at a rock-solid 120 frames per second. Because the monitor is only capable of displaying a maximum of 100 frames in a second, the other twenty are redundant. The problem is that the GPU is still pumping out these images, meaning that during roughly 1 in every 5 refreshes, parts of two different frames will end up on screen at the same time.
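For the number-crunchers, that example works out roughly like this (a simplified Python sketch of the mismatch, using the tidy "one affected refresh in every five" model described above):

```python
# Simplified model of the 100 Hz monitor / 120 fps GPU example above.
refresh_hz = 100   # refreshes the monitor performs per second
render_fps = 120   # frames the GPU produces per second

surplus = render_fps - refresh_hz   # 20 frames per second with nowhere to go
torn_every = refresh_hz / surplus   # 100 / 20 = one affected refresh in every 5

print(f"{surplus} surplus frames per second")
print(f"roughly 1 in every {torn_every:.0f} refreshes shows parts of two frames")
```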

The visual effect is a disjointed image: anything that has moved rapidly during that 1/100th of a second appears to have torn, because parts of two different frames are shown in a single refresh. The tear can appear anywhere on the screen, and it isn't a screen-wide occurrence, but for many gamers it can be pretty off-putting. The more out of sync your monitor and GPU are, the worse the screen-tearing will be.

In the link below you can see a basic example of how this looks...

http://imgur.com/jfvzdc1

So that explains why you'd have vsync on, but why does it get so much hate? Take the red pill, ladies and gents, because we're going deeper down the rabbit hole…


Double Buffering

Double buffering is a technique created to lessen the effects of screen-tearing. It basically creates a frame buffer and a back buffer - your monitor grabs the image from the frame buffer, while the graphics card copies the next image from the back buffer to the frame buffer and creates a new image for the back buffer. Think of it like a daisy chain from image creation to image display, ensuring each step is finished before moving onto the next. If the copying isn't quick enough there can still be torn images, but it's still an improvement.
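If you want a feel for how that daisy chain fits together, here's a bare-bones Python sketch of the idea. The buffers here are just placeholders - real double buffering lives in GPU memory and is handled by the driver, and some implementations swap pointers rather than copying the image across:

```python
class DoubleBuffer:
    """Toy model of double buffering - not real GPU code."""
    def __init__(self):
        self.frame_buffer = None   # what the monitor reads from
        self.back_buffer = None    # what the GPU draws into

    def render(self, image):
        # The GPU always draws into the hidden back buffer.
        self.back_buffer = image

    def present(self):
        # At the sync point, the finished back buffer is copied to the
        # frame buffer, so the monitor only ever grabs a complete image.
        self.frame_buffer = self.back_buffer

buffers = DoubleBuffer()
for n in range(3):
    buffers.render(f"image {n}")   # step 1: create the image off-screen
    buffers.present()              # step 2: hand it over at the sync point
    print("monitor displays:", buffers.frame_buffer)
```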

The downside, though, is that double-buffered vsync can only output at set fractions of the refresh rate. For example, on a 120 Hz monitor, double-buffered vsync can output at 120fps, 60 (½ of 120), 40 (⅓ of 120), 30 (¼ of 120), 24 (⅕ of 120) etc. Imagine that you're playing a game and it's usually hitting about 70 fps on your rig, meaning that double-buffered vsync automatically caps it at 60 frames per second. If you hit a graphically intense section that drops your rendering rate to 59 fps, the output will automatically fall all the way down to the next available step, 40 fps. What would have been a simple and barely noticeable drop of around 15% suddenly becomes a drop of over 40%, creating a massive lurch in frame rates.
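To see why that drop is so brutal, here's a small Python sketch of how double-buffered vsync snaps the displayed framerate to whole fractions of the refresh rate (again, a simplified model rather than how any particular driver actually does it):

```python
def effective_vsync_fps(refresh_hz: int, render_fps: float) -> float:
    """Highest refresh_hz / n that the GPU's render rate can keep up with."""
    n = 1
    while refresh_hz / n > render_fps:
        n += 1
    return refresh_hz / n

for fps in (130, 70, 60, 59, 40, 39):
    print(f"GPU renders {fps:>3} fps on a 120 Hz panel -> "
          f"you see {effective_vsync_fps(120, fps):.0f} fps")
```

Note the 59 fps case: one rendered frame fewer than 60, and the displayed rate falls all the way to 40.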

To Vsync or not to Vsync, that is the question

What you're looking at with vsync is a choice, a personal preference that comes entirely down to you. For many, switching vsync on will make for a noticeably prettier game, particularly for those afflicted few who can spot an image tear from 20 paces, blindfolded. Unfortunately, the very same people who are likely to notice screen-tearing are also the ones who will notice the framerate hitches when vsync is turned on. While double or even triple buffering can greatly reduce this effect, for many the huge percentage hits to framerates make it a road not worth travelling.

What's crucial to remember, though, is that no matter how fast your GPU can pump out images, what you see is always going to be limited by the tech in your monitor. As long as your GPU can create images faster than your monitor can display them, you're not going to lose out in any way from vsync limiting your framerate.

So, gamers, do you turn on vsync, or would you rather have your GPU working to the limit?

Does screen-tearing bother you? And can you even tell the difference when the framerate drops, for that matter?

Let us discuss