Originally Posted by frankietown
Originally Posted by Gunderman456
In addition to all that, VSync will also cap your frame rate at the monitor's refresh rate, so on a 60Hz panel, even if your CPU/video card(s) can push more frames, it will be limited to 60fps.
so would that mean that my GTX 680 along with a 3770K rig would be overkill? benchmarks for 1080p with everything on ultra and AA enabled show ~80fps for BF3. would it be beneficial for me to get a 120Hz monitor for that, or should i just stick with 60Hz and use VSync if there's screen tear? if the card is pushing more frames than the monitor can display, does that mean the GPU is being wasted? so would getting a 680 at this point be overkill?
Originally Posted by axipher
On a basic level, what VSync does is synchronize frame delivery with your monitor's refresh rate to reduce screen tear. Essentially, it limits the number of frames rendered so that they line up with the refresh rate. So on a 60Hz monitor, if your card can sustain 60+ FPS you stay locked at 60; if it can only manage somewhere in the 31 - 59 FPS range, double-buffered VSync drops you down to the next even divisor of the refresh rate, 30 FPS (then 20, 15, and so on as performance falls).
You will notice a little less GPU usage at this point, but the reduction in screen tear is normally worth it.
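To make that lock-down concrete, here's a small sketch (the function name and numbers are just illustrative, assuming classic double-buffered VSync where the effective rate snaps to the largest divisor of the refresh rate you can sustain):

```python
# Sketch: double-buffered VSync snaps the frame rate down to the
# largest divisor of the refresh rate (60, 30, 20, 15, ...) that the
# GPU can actually sustain. Hypothetical helper, not a real API.
def vsync_fps(raw_fps, refresh_hz=60):
    """Return the frame rate double-buffered VSync would lock to."""
    divisor = 1
    while refresh_hz / divisor > raw_fps:
        divisor += 1
    return refresh_hz / divisor

print(vsync_fps(47))  # 30.0 -- can't hold 60, so it halves to 30
print(vsync_fps(25))  # 20.0 -- can't hold 30 either, down to 20
print(vsync_fps(80))  # 60.0 -- capped at the refresh rate
```

(Triple buffering and adaptive VSync relax this behavior, which is why they're often recommended when the GPU hovers just under the refresh rate.)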
On the other hand, though, VSync can add input lag in some games.
what is input lag?
In some game engines, input from the mouse, keyboard, and other input devices is collected just before each frame is generated. Most games are pretty good about moving input processing onto a thread separate from frame rendering, but that separation isn't always perfect.
When input is sampled once per frame, a lower frame rate means a lower input polling rate.
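A toy single-threaded game loop shows why (the loop structure and helper names are hypothetical; `time.sleep` stands in for rendering plus the wait on VSync):

```python
import time

# Toy single-threaded game loop: input is polled once per frame, so the
# input polling rate is tied directly to the frame rate.
polls = []

def poll_input():
    polls.append(time.monotonic())  # record when input was sampled

def run(frame_time_s, n_frames):
    for _ in range(n_frames):
        poll_input()              # input read once per frame
        time.sleep(frame_time_s)  # stand-in for render + VSync wait

run(1 / 30, 3)  # at 30 FPS, input samples arrive ~33 ms apart
gap_ms = (polls[-1] - polls[-2]) * 1000
print(round(gap_ms))  # roughly 33
```

Engines that poll input on a separate thread avoid this coupling, which is the "separate thread" point above.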
As a very basic example, someone running at 120+ FPS would see next to no input lag, since frames (and the input reads tied to them) come around faster than the delay is noticeable. Someone VSync-locked at 30 FPS, on the other hand, might notice that their mouse and keyboard actions aren't being picked up as fast as they would like, because VSync stretches the time between frames, and therefore between input samples, to line up with a rate the monitor can sync to.
To break it down a little (again, this is a really basic look at it): going from even 60 FPS to 30 FPS takes you from a frame time of ~17 milliseconds to ~33 milliseconds. Of course, some of that time is spent on CPU work, then network processing, GPU rendering, and input processing. By halving the FPS, you double the time between input samples, to a point that starts to become noticeable to some gamers.
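The frame-time arithmetic above is just the reciprocal of the frame rate:

```python
# Frame time in milliseconds for a given frame rate: 1000 ms / FPS.
def frame_time_ms(fps):
    return 1000 / fps

print(round(frame_time_ms(60), 1))  # 16.7 ms per frame at 60 FPS
print(round(frame_time_ms(30), 1))  # 33.3 ms per frame at 30 FPS
```

Which is also why the jump from 60Hz to 120Hz (16.7 ms down to 8.3 ms) is smaller in absolute terms than the drop from 60 to 30 FPS.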