Originally Posted by SeeThruHead
But when you drop below 30fps you don't just mysteriously lose variable refresh rate control. On a normal system, under 30fps V-Sync turns off completely. On these monitors you can still have vsync at 25fps, by reducing the display's refresh rate to 50hz and displaying each image twice, not something you could do with a normal 60hz monitor.
Let me clarify here for you because I'm having a hard time telling exactly what you mean.
If your frame rate drops below the refresh rate, the frame can only be displayed on the next Vsync cycle. On a 60hz display that's ~16.7 ms per Vsync cycle. If your frame takes longer than that, your frame time doubles to 33 ms per frame, even if it only took 17 ms to render. So even if you can render at 59fps, you'll only get 30fps. If a frame takes longer than 33 ms to render, it jumps to 50 ms (3 refresh cycles) to display, i.e. 20fps. Longer than 50 ms and it jumps to 66 ms (4 Vsync cycles), which is 15fps.
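That quantization is easy to sanity-check with a few lines of Python. This is just an illustrative model of "the frame waits for the next refresh boundary", not anything a driver actually runs:

```python
import math

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.7 ms per Vsync cycle

def vsync_fps(render_time_ms):
    # Number of whole refresh cycles the frame ends up occupying:
    # miss one boundary and you wait for the next one.
    cycles = max(1, math.ceil(render_time_ms / REFRESH_INTERVAL_MS))
    return REFRESH_HZ / cycles

print(vsync_fps(17))  # 30.0 - missed one refresh, so 2 cycles per frame
print(vsync_fps(34))  # 20.0 - 3 cycles, ~50 ms per frame
print(vsync_fps(51))  # 15.0 - 4 cycles, ~66 ms per frame
```

Note how a 17 ms frame (capable of 59fps) lands on exactly the same 30fps as a 33 ms frame, which is the whole problem with regular Vsync.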
With adaptive Vsync, if your frame rate drops below the refresh rate, Vsync is dynamically disabled and you get tearing, but you avoid the massive, discrete performance drops I detailed above with regular Vsync. So dropping from 60fps to 59fps runs at 59fps with tearing instead of 30fps without tearing. Once you hit 60fps again, Vsync is re-enabled.
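The mode switch itself is dead simple. A hypothetical sketch (the function and return values are mine, not NVIDIA's implementation):

```python
REFRESH_HZ = 60

def adaptive_vsync(render_fps):
    """Return (displayed fps, tearing?) under adaptive Vsync."""
    if render_fps >= REFRESH_HZ:
        return REFRESH_HZ, False  # Vsync on: capped at refresh, no tearing
    return render_fps, True       # Vsync off: full render rate, tearing

print(adaptive_vsync(59))  # (59, True) - 59fps with tearing, not 30fps
print(adaptive_vsync(75))  # (60, False) - capped at the refresh rate
```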
With G-Sync, the refresh is initiated at the exact moment the frame is done rendering, instead of happening on a fixed schedule. This works as long as your frametime is between...
A) The fastest a display can possibly refresh its pixels (i.e. doing full refresh cycles immediately back to back)
B) The amount of time that LCD pixel technology can keep a pixel holding its color.
Let's use a 120hz display for ease of numbers. The refresh cycle of a 120hz display is around 8 ms. So with gsync on, the highest frame rate you can get is 120fps, just like regular Vsync. The lowest frame rate at which gsync can work its "rendered-frame-triggers-the-refresh-cycle" magic is set by how long the pixel technology can hold its color per refresh... I think that's somewhere just north of 33 ms.
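Plugging in those two limits gives the operating window. The ~33 ms pixel-hold figure is the post's assumption (it varies by panel), so treat the floor as a ballpark:

```python
MIN_FRAMETIME_MS = 1000 / 120  # ~8.3 ms: fastest back-to-back refresh on a 120hz panel
MAX_HOLD_MS = 33               # assumed LCD pixel-hold limit (vendor-dependent)

max_fps = 1000 / MIN_FRAMETIME_MS  # ceiling: same cap as regular Vsync
min_fps = 1000 / MAX_HOLD_MS       # floor: below this, gsync can't wait for the GPU

print(round(max_fps), round(min_fps))  # 120 30
```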
So below ~30fps (that ~33 ms pixel color limitation), the LCD pixels can't wait any longer and the display has to trigger a refresh itself. This "emergency" refresh will probably start as late as it can while still avoiding corruption of pixel color, because the display wants to give the GPU as much time as possible to finish rendering and be the one to trigger the refresh. Let's pick an "emergency pixel refresh deadline" of 32 ms (this may vary per LCD vendor) and say the frame finishes rendering at 36 ms. The emergency refresh runs from 32 ms to 40 ms, so the frame is ready right in the middle of it and has to wait. As soon as the emergency refresh is done at 40 ms, the finished frame triggers another immediate refresh. This is exactly what regular Vsync would do... it effectively makes the frametime longer than it otherwise would be because the rendered frame "missed the bus".
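Here's that timeline as a toy calculation. All the numbers (32 ms deadline, 8 ms refresh, 36 ms frame) are the illustrative values from above, not measured panel behavior:

```python
DEADLINE_MS = 32  # assumed point where the panel must refresh to avoid pixel decay
REFRESH_MS = 8    # assumed duration of one full refresh cycle
frame_done_ms = 36

if frame_done_ms <= DEADLINE_MS:
    # Frame made the deadline: it triggers the refresh itself.
    shown_at = frame_done_ms + REFRESH_MS
else:
    # Frame missed the bus: the emergency refresh runs 32-40 ms,
    # the late frame waits for it, then triggers its own refresh.
    emergency_end = DEADLINE_MS + REFRESH_MS  # 40 ms
    shown_at = max(frame_done_ms, emergency_end) + REFRESH_MS

print(shown_at)  # 48: the 36 ms frame isn't fully on screen until 48 ms
```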
So below a certain fps it is bound to fall back to regular Vsync behavior, but this doesn't matter as long as the fallback point is below frame rates you'd willingly play at anyway.

Edited by Seven7h - 10/19/13 at 10:17am