Originally Posted by ZealotKi11er
They said in the video that when they develop games for consoles, since they know the hardware, if there is a place in the game that causes an fps drop they go and fix the problem; on PC, fps drops are normal. I am still curious how this will work when you are above the refresh rate. Simply saying it will act like V-Sync is horrible. 95% of the games I play don't drop below 60 fps 95% of the time @ 1440p.
If you are getting 100 fps on a 60 Hz monitor, G-Sync does nothing, because its whole point is having the monitor wait for the game. At 100 fps the game has to wait for the monitor instead, hence input lag for the time you spend waiting for the next scan to happen, since a frame takes about 10 ms instead of 16 ms to render. It could be more detailed than that and I could be wrong. They said V-Sync adds ~20-80 ms of input lag, which is huge. Maybe the way they implement V-Sync is different.
If someone knows more or can explain it better, feel free to post.
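A rough way to put numbers on that (my own back-of-the-envelope, not anything NVIDIA stated): with V-Sync on a 60 Hz panel, a frame that finishes early just sits there until the next scheduled scan, so a game capable of 100 fps eats the leftover time as latency, and the buffering in real V-Sync pipelines stacks more on top, which is where figures like the quoted 20-80 ms come from.

Code:
# Illustrative V-Sync wait on a fixed 60 Hz monitor (made-up but typical numbers)
refresh_interval_ms = 1000.0 / 60    # ~16.7 ms between scheduled scans
frame_render_ms = 1000.0 / 100       # game could finish a frame every 10 ms

# With V-Sync, a finished frame waits for the next scan before it is shown
wait_ms = refresh_interval_ms - frame_render_ms
print(f"frame ready after {frame_render_ms:.1f} ms, shown at {refresh_interval_ms:.1f} ms "
      f"-> ~{wait_ms:.1f} ms spent waiting (before any extra buffering)")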
With G-Sync, if the game drops to 40 fps, the monitor refreshes at exactly 40 Hz. It won't miss any frames due to timing, or need to double any up to maintain a 60 Hz schedule.
With G-Sync enabled, many viewers could barely tell the difference between 40-45 fps and 60 fps!
So even if your frame rate on the PC were to fall for a second or two (a nearby explosion in BF3, say), you'd barely notice, even if it dropped from 60 fps to 40 fps!
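To put some numbers on why 40 fps feels so much better with G-Sync (my own sketch, not from the article): a fixed 60 Hz panel can only start a scan every 16.7 ms, so 40 fps frames arriving 25 ms apart end up on screen for alternating one and two refreshes, and that uneven cadence is the judder; a G-Sync panel scans out whenever the frame is ready, so every frame gets an even 25 ms.

Code:
# Where 40 fps frames land on a fixed 60 Hz schedule vs. a G-Sync panel (illustrative)
frame_time_ms = 1000.0 / 40     # 25 ms per game frame
refresh_ms = 1000.0 / 60        # ~16.7 ms per 60 Hz scan

# Fixed 60 Hz: each frame has to wait for the next scheduled scan
fixed_times = []
next_scan = 0.0
for i in range(6):
    frame_ready = i * frame_time_ms
    while next_scan < frame_ready:      # skip scans that happened before the frame was ready
        next_scan += refresh_ms
    fixed_times.append(round(next_scan, 1))

# G-Sync: the panel scans out the moment each frame is ready
gsync_times = [round(i * frame_time_ms, 1) for i in range(6)]

print("fixed 60 Hz scan-out:", fixed_times)   # uneven ~16.7 / 33.3 ms gaps = judder
print("G-Sync scan-out:     ", gsync_times)   # even 25 ms gaps = smooth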
Digital Foundry has a great article about it: Nvidia G-Sync: the end of screen-tear in PC gaming
The first demo took the form of a stone gazebo with a swinging pendulum in the middle. Zooming in on the pendulum we see some text. First up, we saw the demo operating at the optimal 60fps on both systems and the result was - as expected - identical. After that, the frame-rate was artificially lowered on the traditional system, first to 50fps, then to 40fps. The result was as expected: judder, caused by repeat frames being packed into the 60Hz time-frame. Then the same artificial frame-rate caps were introduced on the G-Sync system next door, the result being no perceptual difference in fluidity - the demo remained super-smooth.
After this, v-sync was disabled on the traditional system, causing annoying screen-tear at 50fps, and a cycling tear from top to bottom at 40fps that was simply unacceptable for anyone with even a passing interest in image integrity. The scene panned back, taking in the whole gazebo, after which the scene was spun, highlighting just how bad screen-tear can be in fast-moving scenes with plenty of panning. Meanwhile, on the G-Sync side, the same frame-rate reductions caused no undue impact to fluidity at all. The same exercise was then repeated on both systems using the recent Tomb Raider reboot, the result being exactly the same: tearing and/or stutter on the traditional system, silky smooth fluidity with the G-Sync set-up.
Digital Foundry thought it was good enough that it raises quite a valid point: if G-Sync is as great as it sounds, and can make 45-50 fps look and feel like 60 fps, then what's going to keep the consumer from getting the 670 instead of a 680, or a 760 instead of a 670?
Pretty much saying that you can actually get away with less hardware thanks to this new feature... which is sort of a double-edged sword.