Overclock.net › Forums › Components › Monitors and Displays › G-Sync Monitor Decision

G-Sync Monitor Decision

post #1 of 13
Thread Starter 
Tremendously sorry if this is a repeat thread; a quick search of the forums didn't turn up anything that matched my query. I'm getting one of the new G-Sync monitors when they come out and am hung up on one single aspect of the decision. I currently have a GTX 780 Ti and I'm torn between the Asus ROG Swift and the BenQ 27" G series. My apprehension is the 1440p resolution: I'm worried that with a single GTX 780 Ti I wouldn't get competitive framerates in current and upcoming games, but if I skip 1440p I may regret it soon when the standard begins to shift. The BenQ and ROG Swift will likely only be separated by $150, which isn't a concern to me if the Asus is truly the better monitor overall for my setup. I may add another GTX 780 Ti down the road, but I generally prefer a single card to SLI and don't want to bank on a new video card to make the choice since I just got the 780 Ti. Thoughts?
post #2 of 13
Get the Asus ROG.

I currently use a Dell 30" and game at 1600p. I do have SLI 780 Tis, but in most games at this point I'm only using one of the two cards. I plan to get the same monitor you're looking at, as I suggested, though most likely I'll end up with three monitors for surround view (the one concern is whether I'd need a third card). But you should be fine with the 780 Ti you have at 1440p with good-to-high settings in games.
post #3 of 13
I am in the exact same boat. I'm about to build a gaming rig with one 780 Ti and am contemplating which G-Sync monitor to buy to go with it.
The Asus ROG Swift sounds like it's going to be epic, but the 1440p resolution scares me as well. There's no way to know beforehand how many fps I'll get at that resolution in modern games. I guess it will all depend on how well we can tweak the graphics settings to keep the game running at either:
120fps stable for ULMB
or 144fps avg for gsync
while still keeping the game looking decent.

I have a feeling, though, that a single 780 Ti will not cut it. Maybe SLI 780 Tis could.
And what if the game is CPU-limited? Then I guess we're out of luck.

So yes, the BenQ one at 1080p sounds more reasonable.

But then we would miss out on enjoying 1440p in older games.
Also, Asus has teased the overall quality of the panel itself, which should be better than your regular gamer TN panel.
I doubt BenQ went that way, though we're still waiting for more info.

Tough decision!
post #4 of 13
Thread Starter 
So maybe the best way to look at it is that you're paying a $150 premium for a better-quality panel and future-proofing? It really seems like the Asus is the way to go. Most people I know, myself included, have been using Asus monitors for gaming for some time and have had great experiences. Asus is one of three or four companies in the PC industry that I feel some sense of loyalty to. Corsair is number 1, EVGA is number 2, and XSPC has joined the list now.

The other thought I had is that in four months or so, when the GTX 880 (or whatever they call it) comes out, I will still likely be able to get $400-500 for the 780 Ti from people looking to SLI. That would enable me to buy a GTX 880 outright, and the likelihood is that it will have 6-8 GB of VRAM and be more than enough to cover 1440p gaming.

Thanks for the thoughts. I'll follow up on the post once the monitor is released and I make my purchase.
post #5 of 13
I'm sure that monitor will be a beast.
But whatever you decide, make sure you get high frame rates. And if you use ULMB, you need fps = refresh rate, or you will get stuttering (stuttering is a lot easier to detect on a strobed-backlight LCD since it isn't "hidden" by motion blur as much). And if your fps drops into the 60s, you will start getting the dreaded double-image effect, which honestly is just as bad as sample-and-hold motion blur.
post #6 of 13
Thread Starter 
Hi Hasty,

Aren't those concerns exactly what G-Sync was made to correct?
post #7 of 13
Quote:
Originally Posted by finaljason823 View Post

Hi Hasty,

Aren't those concerns exactly what G-Sync was made to correct?

No. ULMB and G-Sync can't be active at the same time. It's either one or the other.
So if you want to have low motion blur, you need 120fps stable.
post #8 of 13
Thread Starter 
That's interesting, I wasn't aware of that. Which would be more favorable for gaming if you had to choose? My guess is G-Sync, but I'm not as familiar with ULMB.
post #9 of 13
Quote:
Originally Posted by finaljason823 View Post

That's interesting, I wasn't aware of that. Which would be more favorable for gaming if you had to choose? My guess is G-Sync, but I'm not as familiar with ULMB.

The recommendation is:

- ULMB => games you play competitively.
- G-Sync => games you play casually and want to push to the max in terms of graphical settings for the wow effect.

ULMB stands for Ultra Low Motion Blur. It's an official implementation of the LightBoost hack. (LightBoost hack info here: http://www.blurbusters.com/zero-motion-blur/lightboost-faq/ )

On a modern sample-and-hold LCD, the motion blur perceived (when tracking a moving object, panning the camera in an FPS, scrolling in an RTS, etc.) can be calculated easily.

Image persistence (in milliseconds) = Motion blurring (in milliseconds)

(image persistence means the amount of time a static image is displayed)

For example,

60Hz without ULMB => image persistence = 16.7ms => 16.7ms of motion blur

120Hz without ULMB => image persistence = 8.3ms => 8.3ms of motion blur

144Hz without ULMB => image persistence = 6.9ms => 6.9ms of motion blur

Another interesting and simple calculation can be done to determine the length of the motion blurring trail.

When tracking a moving object at 1000 pixels per second, the length of the motion blur trail (in pixels) equals the image persistence (in milliseconds).

For example,

60Hz without ULMB => image persistence = 16.7ms => 16.7 pixels of motion blur when moving at 1000 pixels/s

120Hz without ULMB => image persistence = 8.3ms => 8.3 pixels of motion blur when moving at 1000 pixels/s

144Hz without ULMB => image persistence = 6.9ms => 6.9 pixels of motion blur when moving at 1000 pixels/s

You can see how hitting higher refresh rates and frame rates helps to reduce the amount of motion blur.
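The persistence arithmetic above can be sketched in a few lines of Python (the function names are mine, purely illustrative):

```python
# Sample-and-hold motion blur math from the post above.
def persistence_ms(refresh_hz: float) -> float:
    """Sample-and-hold persistence: each frame stays on screen for a full refresh."""
    return 1000.0 / refresh_hz

def blur_px(refresh_hz: float, speed_px_per_s: float = 1000.0) -> float:
    """Blur trail length = persistence (ms) * tracking speed (px per ms)."""
    return persistence_ms(refresh_hz) * speed_px_per_s / 1000.0

for hz in (60, 120, 144):
    print(f"{hz}Hz -> {persistence_ms(hz):.1f}ms persistence, "
          f"{blur_px(hz):.1f}px blur at 1000 px/s")
```

Running it reproduces the 16.7 / 8.3 / 6.9 figures listed above.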

But the true magic happens when you use a strobed backlight. When a display strobes the image, the persistence is only equal to the amount of time the static image is flashed on the display.

In the case of the ULMB mode included in G-Sync monitors (such as the Asus ROG Swift and the BenQ G series), the image persistence is 2 milliseconds.

If you remember the formula, Image persistence (in milliseconds) = Motion blurring (in milliseconds)

Therefore,

85Hz with ULMB => image persistence = 2ms => 2ms of motion blur => 2 pixels of motion blur when moving at 1000 pixels/s

100Hz with ULMB => image persistence = 2ms => 2ms of motion blur => 2 pixels of motion blur when moving at 1000 pixels/s

120Hz with ULMB => image persistence = 2ms => 2ms of motion blur => 2 pixels of motion blur when moving at 1000 pixels/s

This is quite the improvement in motion quality.
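A minimal sketch of the strobed case, assuming the 2ms pulse width quoted above (the constant and helper names are mine):

```python
# Strobed-backlight (ULMB) variant: persistence equals the strobe pulse width,
# independent of the refresh rate.
ULMB_PULSE_MS = 2.0  # pulse width figure quoted in the post

def ulmb_blur_px(speed_px_per_s: float = 1000.0) -> float:
    """Blur trail length with ULMB at a given tracking speed."""
    return ULMB_PULSE_MS * speed_px_per_s / 1000.0

for hz in (85, 100, 120):
    print(f"{hz}Hz with ULMB -> {ULMB_PULSE_MS}ms persistence, "
          f"{ulmb_blur_px():.0f}px blur at 1000 px/s")
```

Note how the refresh rate drops out of the blur calculation entirely, which is why every ULMB line above reads 2ms / 2 pixels.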


It should be noted that for an optimal experience with ULMB, it's highly recommended to have:

refresh rate = frame rate

If the frame rate is not matching the refresh rate, you will notice stuttering. (stuttering is very easy to notice in ULMB because it isn't 'hidden' by the motion blur)

And at low frame rate you'll start seeing one or several duplicates of the moving object you are tracking with your eyes.

For example:

60fps @ 120Hz with ULMB => You see two objects (as if a ghost were following the object)

40fps @ 120Hz with ULMB => You see three objects
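The duplicate-image rule above can be sketched like so (a rough model, assuming the frame rate evenly divides the refresh rate; the helper name is mine):

```python
# With a strobed backlight, each rendered frame is flashed refresh/fps times,
# so a tracked object appears that many times.
def visible_copies(fps: int, refresh_hz: int) -> int:
    """Number of distinct copies of a tracked object (rough model)."""
    assert refresh_hz % fps == 0, "sketch assumes fps evenly divides refresh"
    return refresh_hz // fps

print(visible_copies(60, 120))  # two objects: the ghosting case above
print(visible_copies(40, 120))  # three objects
```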

Ok. At this point, you will probably tell me:

"Well that's fine, I can set ULMB at 85Hz to play that demanding game that I manage to run at 85fps but I can't get to run at 100fps or 120fps."

And that would be a valid point.
But it must be noted that by its nature, ULMB makes the screen flicker.

Flickering becomes visible at refresh rates below the "flicker fusion threshold" (the threshold at which you stop noticing the flickering).

Every individual has their own sensitivity to flickering and therefore their own flicker fusion threshold.
It's possible you'd be fine with only 85Hz, but it's also possible you wouldn't be fine below 100Hz, or you might need 120Hz to stop noticing the flickering. That's something you'll need to fiddle with.

In case 85Hz and 100Hz with ULMB give you too much eyestrain,
the only optimal option you'll be left with for competitive gaming would be outputting 120fps @ 120Hz with ULMB.

Hope that clears things up a bit for the ULMB part.




Now for G-Sync: what it does is synchronize the refresh rate to the frame rate (as long as it's between 30fps and 144fps).

So G-Sync should feel like what good, stable V-sync feels like (no stutter, no tearing).

The advantage of G-Sync over V-sync is that it doesn't need a stable frame rate to produce that effect. And if the frame rate is below 144fps, it won't add input lag like V-sync would.
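The refresh-follows-framerate behavior described above can be sketched in a couple of lines (this is my own toy model, not an NVIDIA API):

```python
# Toy model of G-Sync: the panel refreshes whenever a frame is ready,
# clamped to the panel's supported variable refresh range.
def gsync_refresh(fps: float, min_hz: float = 30.0, max_hz: float = 144.0) -> float:
    """Refresh rate the panel would run at for a given frame rate."""
    return max(min_hz, min(fps, max_hz))

for fps in (45, 90, 144):
    print(f"{fps}fps -> panel refreshes at {gsync_refresh(fps):.0f}Hz")
```

Inside the 30-144fps window the panel simply tracks the GPU, which is why neither tearing nor V-sync-style waiting occurs.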

Therefore G-Sync is nice for playing a casual single-player campaign at max graphical settings in a very demanding game.

I haven't got a G-Sync monitor to test (yet), so I can't give you my subjective personal opinion on exactly how it feels. It's a very interesting and welcome feature, and it's one of the reasons I decided to go with an NVIDIA graphics card for my upcoming build.

But you must be aware that:
- in no way is it a substitute for getting a high frame rate.
- it can't be used together with the ULMB low-persistence mode.

For more reading about G-Sync:

- http://www.blurbusters.com/gsync/preview/ (overview of the advantages of G-Sync)

- http://www.blurbusters.com/gsync/preview2/ (input lag measurements with G-Sync)


If you need more help with displays, G-Sync, input lag, motion blur, etc.,
I highly recommend posting in the Blur Busters forum: http://forums.blurbusters.com/
post #10 of 13
Asus ROG.

Look, you have the single most powerful graphics card in the world and you are "scared" you won't get competitive frame rates?

I game at native 1440p on a GTX 670 and easily get 60+ fps, though it depends on the game. Even Battlefield 4 on high will push 60 fps or more with a 780 Ti. And even if it drops below 60, why get G-Sync if you're never going to game below 60 fps? Seriously, this thread is bizarre.

Are you a professional gamer or just an amateur?