
bakageta · Old School · 1,636 Posts · Discussion Starter #1
So, my boyfriend was one of the lucky few to win a DIY G-Sync kit. We put it all together last night, and have spent the better part of today playing with it. First off, a gallery of some of the pictures I took while installing:

Unboxing and install gallery

Some observations first: the kit came with a DisplayPort cable. This was unexpected, and we had already purchased one, but hey, I can always use more cables. The manual is very detailed, and you can tell they mean for this to go into wider production rather than being handed out to only a handful of people - I expected just a link to a website with instructions. There are some noticeable differences between the PCB I received and the one inside the review monitor that AnandTech received, but nothing TOO major. Both PCBs appear to have a location for an HDMI port, though neither has the port soldered in. The LVDS cables mount 180 degrees from the original PCB; the manual notes this, but I wasn't exactly reading the manual until I got to that point and saw it wasn't quite the same. This was rather unexpected - there is SO MUCH empty room on the PCB that they easily could've mounted the connectors in the same direction as the original. I'd really like to see the inside of the BenQ monitor that uses the same panel, as I think this kit could work on it as well, but I don't own one to take apart.

Now, since the kit scraps the ASUS monitor's entire PCB, we lose everything associated with it. That means the whole OSD, including things like the crosshair overlays. The G-Sync OSD is very minimal, with just an info panel, a brightness control, and a button to enable/disable ULMB (the improved LightBoost mode). ULMB looked much less washed out than the hacked LightBoost modes we ran before; however, there are really no options for it - it's either on or off. ULMB won't work while you're in a G-Sync mode, either, though hopefully that changes in the future.

There does look to be a JTAG port (which wouldn't surprise me with the hardware they're using), and I fully intend to hook up my usbjtagnt to it and see if I can dump anything meaningful, but that'll have to wait until the shiny wears off and he lets me tear it apart again.

As for actual performance, I'm impressed, and I was a bit skeptical at first. He was off playing Max Payne 3 and called me over to check it out. He's running a single GTX 680, and normally plays with some of the settings lowered so he can come close to 100-120fps. I came over and couldn't believe how smooth everything looked, and then he pulled up EVGA Precision on his phone, showing me it was running between 50 and 70 fps. Without seeing it in person, I wouldn't have believed it could've made that large a difference. Now, on games where he held 120 fps easily no matter what, it made virtually no difference. Our general conclusion: if you can hold 120 fps, stick with ULMB mode, but if you fluctuate around 60, G-Sync looks amazing.
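
If you're curious why those fluctuating 50-70 fps look so much better with G-Sync, here's a minimal simulation of the difference (a sketch with illustrative numbers, not measurements from our setup): with VSYNC on a fixed 120Hz panel, a finished frame has to wait for the next refresh tick, so on-screen times quantize to multiples of 8.3ms and judder; with G-Sync, the panel refreshes the moment a frame is ready.

Code:
import math
import random

REFRESH_HZ = 120
TICK = 1.0 / REFRESH_HZ

random.seed(1)
# Simulated render times for a game fluctuating between roughly 50 and 70 fps.
renders = [1.0 / random.uniform(50, 70) for _ in range(9)]

# Time each frame finishes rendering (back-to-back, ignoring pipelining).
finish, t = [], 0.0
for r in renders:
    t += r
    finish.append(t)

# Fixed 120Hz + VSYNC: a frame becomes visible at the first tick after it
# finishes, so it stays on screen for a whole number of 8.3ms ticks.
visible = [math.ceil(f / TICK) * TICK for f in finish]
vsync_ms = [(b - a) * 1000 for a, b in zip(visible, visible[1:])]

# G-Sync: each frame stays up exactly until the next one is ready.
gsync_ms = [r * 1000 for r in renders[1:]]

for v, g in zip(vsync_ms, gsync_ms):
    print(f"vsync {v:5.1f} ms   g-sync {g:5.1f} ms")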

Anyone else get their DIY kits in yet and want to chime in? Anyone with questions?
 

bakageta · Old School · 1,636 Posts · Discussion Starter #3
I certainly can't seem to find them; it seems like they intend for you to use the NVIDIA Control Panel for all of this.
 

bakageta · Old School · 1,636 Posts · Discussion Starter #5
I think it's safe to assume that the retail monitors with G-Sync support will have proper OSDs with full configuration options, and likely inputs like DVI/HDMI that bypass G-Sync. While it seems like they plan to distribute this kit pretty widely, it very much feels like a beta. Since it's controlled by an FPGA, it's also possible in theory for NVIDIA to push updates to it as time goes on. These really are just the first steps in this direction, and I can't imagine future monitors ignoring the typical adjustments that everyone has come to expect.
 

bakageta · Old School · 1,636 Posts · Discussion Starter #7
EVGA Precision X has an Android app that can monitor many things in real time, including fps.
 

bakageta · Old School · 1,636 Posts · Discussion Starter #10
The brick doesn't actually come with a plug for the wall; you reuse the monitor's old power cable to feed the brick, and the brick's output then feeds the G-Sync board.
 

crun · Registered · 223 Posts
huuuge +rep, especially for the:
Quote:
ULMB looked much less washed out than the hacked LightBoost modes we ran before
could you check how G-Sync affects game performance? I hope it is minimal, but it would be good to know
 

ispano · Registered · 20 Posts
Would it be safe to assume that you can calibrate the DisplayPort input with the ASUS OSD before replacing the board with the G-Sync module? I definitely like my settings at the moment, with the white point at 6500K (D65). Or does swapping in the G-Sync module make all prior calibration moot? At least I can swap back if I feel the need to recalibrate. I'm not a fan of software controls, since they don't hold in-game in full screen - at least when it's an ICC-type profile. I'm enthusiastic about this product, but this is the sort of thing that curbs my eagerness.
 

mdrejhon · Registered · 1,255 Posts
Quote:
Originally Posted by crun View Post

huuuge +rep, especially for the:
could you check how G-Sync affects game performance? I hope it is minimal, but it would be good to know
I have a G-SYNC monitor with ULMB sitting here, too (due to the Blur Busters preview).

As the cat is now out of the bag, I can finally begin talking about ULMB (Ultra Low Motion Blur).

Framerate performance is unaffected by strobe backlights. The LightBoost interference with performance only occurred with non-ToastyX approaches due to driver...

120Hz non-ULMB versus 120Hz ULMB has the same framerate performance. That said, with all strobe backlights, the motion clarity means tearing/stutters can become more visible, so you want to reach framerate=stroberate.

ULMB works at 85Hz, 100Hz and 120Hz. For the least GPU power while still getting zero motion blur, you can try 85fps@85Hz for the full CRT effect. Even if you prefer VSYNC OFF during competitive play, remember to test VSYNC ON with a strobe backlight, due to the stellar motion performance at framerate=stroberate.

Motion quality (motion smoothness/clarity) goes as follows:
#1: Strobed (ULMB, LightBoost, Turbo240) with VSYNC ON if no framedrops, stroberate==framerate
#2: G-SYNC at 40fps-144fps, for anything that tends to drop in framerates
#3: Strobed (ULMB, LightBoost, Turbo240) with VSYNC OFF, stroberate != framerate

Especially during strobed operation (ULMB, LightBoost, Turbo240), the higher the framerate, the better VSYNC OFF looks, even at framerate > stroberate -- e.g. 300fps@120Hz strobed looks noticeably better than 150fps@120Hz strobed, because of smaller-offset tearlines and less microstutter caused by framerate-vs-refreshrate aliasing. Microstutter errors go down the higher the framerate: 150fps has a time-basis error of 1/150sec, while 300fps has a time-basis error of only 1/300sec.

However, nothing beats framerate==stroberate==refreshrate strobed, so VSYNC ON always looks the best visually (even if it doesn't always "feel" the best with the mouse, due to a few milliseconds of extra lag), provided you are guaranteed to never drop a frame.

Which means Strobed VSYNC ON (perfect full framerate) is better than G-SYNC, but G-SYNC is better than Strobed VSYNC OFF (variable framerate). The problem with approach #1 is that VSYNC ON adds input lag, but VSYNC ON during a perfect 120fps@120Hz makes motion look amazing (no stutter, no tearing, no motion blur). That's almost impossible in Crysis 3 and Battlefield 4, but very easy to achieve in older games such as Counter-Strike, and very possible in games like Borderlands 2 and BioShock Infinite (with a Titan and slight settings adjustments).

-- For competitive gameplay, #2 (G-SYNC) is usually best, because it has nearly no input lag difference from VSYNC OFF, and it can eliminate erratic stutters during erratic framerates (the most impressive attribute of G-SYNC)
-- For solo gameplay, or if you are a true super-duper motion fluidity nut, #1 is usually best (VSYNC ON during ULMB, LightBoost, Turbo240), provided you have a GPU that can maintain a perfect sync of stroberate==framerate==refreshrate at all times (85fps@85Hz, 100fps@100Hz, or 120fps@120Hz).

The improved motion clarity of strobed modes (ULMB, LightBoost, Turbo240, BENQ Blur Reduction) makes the human detectability threshold for microstutters MUCH higher. During strobed modes, humans can detect microstutters caused by time-basis errors of only a few milliseconds -- fast, rapid microstutters manifest indirectly as motion blur (because the microstutters vibrate very fast, they create a minor amount of motion blur). With VSYNC ON at framerate==stroberate==refreshrate, you have zero microstutters, and you thus get the best zero-motion-blur effect. A 1ms time-basis error during 1000 pixels/second of motion can mis-position objects by 1 pixel (and if this happens rapidly enough, vibrating, it creates 1 pixel of extra motion blurring). So a 1 millisecond time-basis error -- a 1 millisecond microstutter -- can still be humanly noticeable if microstutters occur continually, to the point where all edges vibrate fast enough to create additional motion blur.

It's amazing how sensitive human eyes are, especially during the extreme motion clarity within the strobed modes of these several new strobe backlights that have finally hit the market (LightBoost, ULMB, Turbo240, and BENQ Blur Reduction).
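
Here's the same arithmetic as a tiny script, if you want to play with the numbers yourself (illustrative values only):

Code:
# Positional error (px) = motion speed (px/s) * time-basis error (s).
def position_error_px(speed_px_per_s, time_error_s):
    return speed_px_per_s * time_error_s

# 1 ms of time-basis error at 1000 px/s mis-positions an edge by 1 pixel;
# repeated rapidly, that vibration reads as ~1 px of extra motion blur.
print(position_error_px(1000, 0.001))  # -> 1.0

# The worst-case one-frame time error shrinks as framerate rises.
for fps in (150, 300):
    print(fps, "fps:", position_error_px(1000, 1.0 / fps), "px at 1000 px/s")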
 

arsn · Registered · 189 Posts
Quote:
Originally Posted by bakageta View Post

So, my boyfriend [...]
Your boyfriend is dreamss? ;)
How's the input lag for G-Sync and ULMB? Is there tearing without V-Sync when framerates go above the refresh rate with ULMB? If there is, I suppose capping the framerate at the driver level doesn't remove tearing with ULMB?
 

bakageta · Old School · 1,636 Posts · Discussion Starter #18
Quote:
Originally Posted by mdrejhon View Post

Motion quality (motion smoothness/clarity) goes as follows:
#1: Strobed (ULMB, LightBoost, Turbo240) with VSYNC ON if no framedrops, stroberate==framerate
#2: G-SYNC at 40fps-144fps, for anything that tends to drop in framerates
#3: Strobed (ULMB, LightBoost, Turbo240) with VSYNC OFF, stroberate != framerate

-- For competitive gameplay, #2 (G-SYNC) is usually best, because it has nearly no input lag difference from VSYNC OFF, and it can eliminate erratic stutters during erratic framerates (the most impressive attribute of G-SYNC)
-- For solo gameplay, or if you are a true super-duper motion fluidity nut, #1 is usually best (VSYNC ON during ULMB, LightBoost, Turbo240), provided you have a GPU that can maintain a perfect sync of stroberate==framerate==refreshrate at all times (85fps@85Hz, 100fps@100Hz, or 120fps@120Hz).
For my gaming, G-Sync wins out pretty easily, though I do miss the low persistence of a strobed backlight and can't wait until the two work together. ULMB comes on the instant I'm not gaming; the low persistence is excellent for the actual work I do. I normally can't stand playing games under 60 fps, and can still pretty easily see a difference up to 75+, which means turning down settings in high-end games. With G-Sync, I can crank the settings up and 45 fps looks as smooth or better. I tend to play quite a bit of rhythm and fighting games, and even tiny input lag is brutal there, so that usually rules out vsync.

My biggest excitement for G-Sync is that you can get a smooth gaming experience without needing a top-end setup. I'm a budget gamer at heart - I grew up poor, but even now that I can afford most things I want, I find myself constantly sticking with the best bang for the buck and holding onto hardware as long as it performs reasonably well. I still rock a pair of GTX 460s because they still pack a respectable punch, but they're showing their age now. I'm thrilled that I won't need to go huge with the next upgrade.

Sure, the monitor will be a bit pricey, but my 2 displays are both 5+ years old, and I have no problem replacing capacitors as monitors age and die (old-school arcade tech - I can't count how many cap jobs I've done). I've been wanting 120Hz+ ever since dreamss got his and I got to play with it, and G-Sync is the tipping point for me.
Quote:
Originally Posted by arsn View Post

Your boyfriend is dreamss? ;)
How's the input lag for G-Sync and ULMB? Is there tearing without V-Sync when framerates go above the refresh rate with ULMB? If there is, I suppose capping the framerate at the driver level doesn't remove tearing with ULMB?
Yeah, he had the hardware to enter the contests; I'm still running 460s in SLI and a mediocre display that I clock up to 80Hz. Zero input lag with ULMB, compared just by disabling it, as I didn't feel like taking the kit out to compare against the stock ASUS internals. G-Sync does seem to have a TINY amount of input lag, but nothing like vsync - it's less input lag than several cheap LCDs I've owned, and I consider it a non-issue. Going over the refresh rate with ULMB does result in tearing, but the higher you go, the smaller the tears get. It's the same effect you can see on any LCD; I just find that strobed backlights make it easier to notice - I've always seen tearing fairly easily, but dreamss had never noticed it until he enabled LightBoost.
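
If you want rough numbers for why the tears shrink (illustrative arithmetic, not measurements): the visible step at a tearline is how far the scene moved between the two frames that meet there, i.e. pan speed divided by framerate.

Code:
# Tearline offset (px) ~= pan speed (px/s) / framerate (fps).
speed = 2000  # a fast horizontal pan, in pixels/second

for fps in (120, 240, 480):
    print(f"{fps:3d} fps: tearline step ~ {speed / fps:5.1f} px")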
Quote:
Originally Posted by crun View Post

huuuge +rep, especially for the:
could you check how G-Sync affects game performance? I hope it is minimal, but it would be good to know
G-Sync has very minimal impact on performance. I haven't taken exact measurements yet, but from a casual glance there's almost no performance hit, if any. It's hard to pry dreamss away for a few hours of benchmarking, but we do have a second GTX 680 on the way, and he assures me I can bench away when it gets here. I want to do some tests at higher fps, to see how much benefit G-Sync is to those with overkill systems, and to see its effect on SLI microstutter. I'll do both single-card and SLI runs with and without G-Sync, ULMB, and vsync, to offer up some actual numbers as well as my opinions on the visual aspect.
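
When I do bench, I'll likely boil the logs down along these lines - a sketch assuming a FRAPS-style log with one milliseconds-per-frame value per line ("frametimes.csv" is a hypothetical filename):

Code:
def fps_stats(frametimes_ms):
    # Average fps over the whole run.
    avg = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    # "1% low": average fps over the slowest 1% of frames.
    slowest = sorted(frametimes_ms)[-max(1, len(frametimes_ms) // 100):]
    low1 = 1000.0 * len(slowest) / sum(slowest)
    return avg, low1

with open("frametimes.csv") as f:
    times = [float(line) for line in f if line.strip()]

avg, low1 = fps_stats(times)
print(f"avg {avg:.1f} fps, 1% low {low1:.1f} fps")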
Quote:
Originally Posted by TrevJonez View Post

it uses Bluetooth to get the info/control to/from the PC.
The latest version has wifi support, which is MUCH more convenient. You need to make a slight tweak on the PC side of things, but it's an official EVGA tweak, and there are pretty good instructions on their forums.

Now, on another topic...

Speaking of proper testing, does anyone have requests? I've got a pretty respectable Steam library here, as well as dreamss' library. I'm particularly interested in older titles and indie games, but the usual suspects will make appearances. See something obscure that I don't have? Message me and I'll likely pick it up.

Also, sorry for the horrendous typos, I'm on my phone in bed.
 

bakageta · Old School · 1,636 Posts · Discussion Starter #20
Quote:
Originally Posted by ispano View Post

Would it be safe to assume that you can calibrate the DisplayPort input with the ASUS OSD before replacing the board with the G-Sync module? I definitely like my settings at the moment, with the white point at 6500K (D65). Or does swapping in the G-Sync module make all prior calibration moot? At least I can swap back if I feel the need to recalibrate. I'm not a fan of software controls, since they don't hold in-game in full screen - at least when it's an ICC-type profile. I'm enthusiastic about this product, but this is the sort of thing that curbs my eagerness.
Well, in theory DisplayPort supports MCCS (monitor control commands carried over the DDC/CI channel), which can take the place of an OSD. I'll look into it more.
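
If the board does answer DDC/CI, something like the third-party monitorcontrol Python package should be able to poke it - a sketch, entirely untested against the G-Sync module:

Code:
from monitorcontrol import get_monitors

# Walk every DDC/CI-capable monitor the OS exposes.
for monitor in get_monitors():
    with monitor:
        print("luminance:", monitor.get_luminance())
        monitor.set_luminance(40)  # MCCS VCP code 0x10 (brightness)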
 