Overclock.net › Forums › Components › Monitors and Displays › [ASUS] Update: 27" 1440P GSYNC 120Hz RoG Monitor at CES

[ASUS] Update: 27" 1440P GSYNC 120Hz RoG Monitor at CES - Page 106

post #1051 of 4310
Quote:
Originally Posted by Heracles View Post

So ULMB vs G-Sync with these monitors at 7680x1440 - which will be more beneficial? I'm assuming ULMB, as long as you keep the FPS around or above 120?
Not necessarily.
You need framerate == refreshrate == stroberate for the sweet spot in motion clarity.

So 85fps if you use 85Hz ULMB
So 100fps if you use 100Hz ULMB
So 120fps if you use 120Hz ULMB

And then use either an fps_max setting, the VSYNC ON setting, or the Adaptive VSYNC setting, in order to keep the frame rate reasonably closely matched to the refresh rate, if you want the sharpest possible LightBoost effect.
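The arithmetic behind matching the frame cap to the strobe rate can be sketched in a few lines. This is just an illustrative helper (nothing the monitor or driver exposes): it shows the per-frame render budget your GPU has to meet at each of the ULMB strobe rates mentioned above.

```python
def frame_budget_ms(strobe_hz: float) -> float:
    """Milliseconds available to render each frame when the fps cap
    matches the strobe rate (framerate == refreshrate == stroberate)."""
    return 1000.0 / strobe_hz

# The three ULMB strobe rates discussed above:
for hz in (85, 100, 120):
    print(f"{hz} Hz ULMB -> fps_max {hz}, budget {frame_budget_ms(hz):.2f} ms/frame")
```

The lower the strobe rate, the more headroom per frame, which is why 85Hz ULMB is easier to sustain on a midrange GPU than 120Hz.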
Edited by mdrejhon - 1/15/14 at 2:02pm
post #1052 of 4310
Quote:
Originally Posted by SeeThruHead View Post

LightBoost is for games that can reach a consistent 120fps.
Not necessarily. You simply need framerate == refreshrate == stroberate for the magic point of maximized motion clarity. Same reason why CRT 60fps @ 60Hz has less motion blur than old-fashioned non-strobed LCD 120fps @ 120Hz.

Strobe backlights ARE now available at lower strobe rates.

100Hz LightBoost strobing -> Use 100fps
120Hz LightBoost strobing -> Use 120fps
85Hz ULMB strobing -> Use 85fps
100Hz ULMB strobing -> Use 100fps
120Hz ULMB strobing -> Use 120fps
75Hz BENQ Z-series Blur Reduction strobing -> Use 75fps
96Hz BENQ Z-series Blur Reduction strobing -> Use 96fps
144Hz BENQ Z-series Blur Reduction strobing -> Use 144fps
(Note: the XL2720Z needs a firmware upgrade to fix BenQ strobing bugs, recently reported on the Blur Busters Forums.)

Panning motion tests (e.g. www.testufo.com/photo) automatically run at framerate == refreshrate, so they always look amazingly crisp and sharp on a CRT and on strobe backlights. You do not inherently require 120 frames per second; that requirement was simply a LightBoost vendor limitation (like a fixed-frequency CRT that only runs at 120Hz and not 60Hz). Newer strobe backlights have more flexible refresh rates.

Lower strobe rates are more flickery, though, so you adjust the refresh rate as a compromise setting: low enough to be easy on the GPU, high enough to avoid flicker/lag. If you don't have a Titan, the sweet spot is often around 85Hz, as was often found back in the CRT days.

________

In situations that frequently drop below the minimum refresh rate of a specific strobe backlight, G-SYNC is vastly superior. It provides a greatly superior experience at low frame rates, eliminating stutters and tearing so well in demanding games such as Crysis 3 -- stutter-free 45fps often looks better than very stuttery/teary 75fps. However, you do have to live with the motion blur, which sometimes bothers me quite a bit.

I also think that variable-rate strobing, which gradually disables itself as the framerate falls below flicker fusion thresholds, would be a very reasonable solution to the variable-rate strobing problem.
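That fade-out idea can be sketched as a blend factor between a steady backlight and full strobing. The 45-60 fps fade band below is an arbitrary illustrative choice, not a measured flicker-fusion threshold, and no shipping monitor works this way:

```python
def strobe_blend(framerate_hz: float,
                 fade_lo: float = 45.0,
                 fade_hi: float = 60.0) -> float:
    """Return 0.0 for a steady (non-strobed) backlight, 1.0 for full
    strobing, ramping linearly in between so strobing fades out
    gradually as the framerate drops toward flicker territory."""
    if framerate_hz >= fade_hi:
        return 1.0
    if framerate_hz <= fade_lo:
        return 0.0
    return (framerate_hz - fade_lo) / (fade_hi - fade_lo)

print(strobe_blend(120.0))  # full strobing at high framerates
print(strobe_blend(30.0))   # steady backlight during deep dips
```

A ramp like this avoids an abrupt mode switch, which would itself be visible as a sudden brightness/blur change.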
Edited by mdrejhon - 1/15/14 at 2:07pm
post #1053 of 4310
Quote:
Originally Posted by mdrejhon View Post

Not necessarily.
You need framerate == refreshrate == stroberate for the sweet spot in motion clarity.

So 85fps if you use 85Hz ULMB
So 100fps if you use 100Hz ULMB
So 120fps if you use 120Hz ULMB

And then use either an fps_max setting, the VSYNC ON setting, or the Adaptive VSYNC setting, in order to keep the frame rate reasonably closely matched to the refresh rate, if you want the sharpest possible LightBoost effect.

Hmmm, but Vsync adds input lag.... why oh why couldn't they support ULMB and G-Sync at the same time, and work in Surround :/
post #1054 of 4310
I hope there are some magical dudes who can fix it, but strobing at a variable refresh rate seems like an understandable challenge. It's still better than nothing to be able to choose between one awesome thing and another, depending on how well you can run the game and whether the engine sucks or not.
Insert Name Here (14 items)
CPU: 4770k · Motherboard: z87x-ud3h · Graphics: Gigabyte Windforce 770 · RAM: Samsung Green 2x4gb
Hard Drive: Old Seagate HDD · Hard Drive: Crucial c300 · Cooling: Thermalright Silver Arrow SB-E SE · OS: Windows 7 Home Premium 64 bit
Monitor: Asus VG248QE · Keyboard: WASDKeyboards.com v1 semi custom w/ mx browns, ... · Power: Superflower Golden Green HX550 · Case: Air540
Mouse: Deathadder 3.5g · Mouse Pad: Qck+
post #1055 of 4310
Quote:
Originally Posted by Cyro999 View Post

I hope there are some magical dudes who can fix it, but strobing at a variable refresh rate seems like an understandable challenge.

Wat
     
CPU: Core i7 920 3.8Ghz · Motherboard: Asus Rampage II Extreme · Graphics: Asus GTX470 SLI · RAM: 12GB Corsair Dominator 1600Mhz
Hard Drive: 3x 300GB Velociraptors RAID 0, WD 1TB, Hitachi 2TB · OS: Windows 7 x64 Pro · Monitor: 52" and 32" Samsung 1080p LCD TVs · Power: Corsair HX850
Case: Cooler Master Cosmos 1000 · Mouse: Razer Imperator
post #1056 of 4310
Quote:
Originally Posted by littledonny View Post

Wat

If you have a static 120Hz refresh rate, you can just strobe the backlight for 1.4ms every 8.33ms (120 times per second).

If you have G-Sync, the refresh rate is variable. If you hit a performance dip, your screen won't refresh for 33.3ms. When do you strobe the backlight? If you dip to 30fps, do you just wait 4x as long between strobes? Keeping the same strobe length gives you 1/4 of the brightness on screen, etc. Obviously, a brightness level proportional to FPS, with at best noticeable flicker whenever your FPS drops, is not a great solution -- it's pretty laughably bad.
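The brightness arithmetic above can be checked in a couple of lines, assuming one fixed-length 1.4ms strobe per displayed frame:

```python
def perceived_brightness(strobe_len_ms: float, frame_interval_ms: float) -> float:
    """Average light output as a fraction of an always-on backlight,
    assuming one fixed-length strobe per displayed frame."""
    return strobe_len_ms / frame_interval_ms

# 120 fps: one 1.4 ms strobe every 8.33 ms
b120 = perceived_brightness(1.4, 1000 / 120)
# 30 fps dip: same 1.4 ms strobe, but only every 33.3 ms
b30 = perceived_brightness(1.4, 1000 / 30)
print(b30 / b120)  # 0.25 -> one quarter the brightness
```

So with a fixed strobe length, screen brightness really does scale in direct proportion to frame rate, which is exactly the problem described above.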
post #1057 of 4310
Quote:
Originally Posted by mdrejhon View Post

Not necessarily. You simply need framerate == refreshrate == stroberate for the magic point of maximized motion clarity. Same reason why CRT 60fps @ 60Hz has less motion blur than old-fashioned non-strobed LCD 120fps @ 120Hz.

Strobe backlights ARE now available at lower strobe rates.

100Hz LightBoost strobing -> Use 100fps
120Hz LightBoost strobing -> Use 120fps
85Hz ULMB strobing -> Use 85fps
100Hz ULMB strobing -> Use 100fps
120Hz ULMB strobing -> Use 120fps
75Hz BENQ Z-series Blur Reduction strobing -> Use 75fps
96Hz BENQ Z-series Blur Reduction strobing -> Use 96fps
144Hz BENQ Z-series Blur Reduction strobing -> Use 144fps
(Note: the XL2720Z needs a firmware upgrade to fix BenQ strobing bugs, recently reported on the Blur Busters Forums.)

Panning motion tests (e.g. www.testufo.com/photo) automatically run at framerate == refreshrate, so they always look amazingly crisp and sharp on a CRT and on strobe backlights. You do not inherently require 120 frames per second; that requirement was simply a LightBoost vendor limitation (like a fixed-frequency CRT that only runs at 120Hz and not 60Hz). Newer strobe backlights have more flexible refresh rates.

Lower strobe rates are more flickery, though, so you adjust the refresh rate as a compromise setting: low enough to be easy on the GPU, high enough to avoid flicker/lag. If you don't have a Titan, the sweet spot is often around 85Hz, as was often found back in the CRT days.

________

In situations that frequently drop below the minimum refresh rate of a specific strobe backlight, G-SYNC is vastly superior. It provides a greatly superior experience at low frame rates, eliminating stutters and tearing so well in demanding games such as Crysis 3 -- stutter-free 45fps often looks better than very stuttery/teary 75fps. However, you do have to live with the motion blur, which sometimes bothers me quite a bit.

I also think that variable-rate strobing, which gradually disables itself as the framerate falls below flicker fusion thresholds, would be a very reasonable solution to the variable-rate strobing problem.

That's actually a great compromise. I didn't know you could set a G-Sync monitor's refresh rate while in ULMB mode. Still, the main reason I'm interested in G-Sync is that it removes the need to aim your GPU setup at a minimum fps. You can now aim for an average.

For myself, almost every single second I spend gaming falls into the second scenario you mention. Even with SLI 780's at 1400mhz that would still be the case. So what happens is I spend hours fiddling with settings instead of playing, in order to reach the highest graphical settings that will enable a minimum of 60fps. With G-Sync I no longer have to worry about doing that, because the minimum frame rate is no longer significant.

Another thing to note is that at 1440p, a minimum of 85 fps would likely require two or three 780's or greater. My guess is probably 3 if you wanted to turn the settings all the way up. Which is still the same problem that LightBoost strobing has always had.

Would it be possible for the monitor to receive the amount of time it took the GPU to render the frame, and then adjust the length of the strobe accordingly, to maintain a consistent brightness?
Edited by SeeThruHead - 1/15/14 at 5:40pm
NanoBench (21 items)
CPU: 2600k · Motherboard: maximus iv gene · Graphics: Gtx 780 Classified · RAM: G.Skill Trident X
Hard Drive: adata sx900 · Cooling: Watercool MO-RA3 420 · Cooling: bitspower tank · Cooling: Swiftech MCP35x2
Cooling: EK-Ram Monarch x4 Clean · Cooling: Ek MIV-Gene Fullboard · Cooling: EK 780 Classified VGA Cooler · Cooling: EK Supremacy Clean
Cooling: Aquaero 5 LT · Monitor: Crossover 1440p · Keyboard: QuickFire Tenkeyless · Power: Seasonic x-750
Case: Dimastech Nano · Mouse: g700 · Mouse Pad: Mionix Propus 380 · Audio: Onkyo HT-R290
Other: MDPC Sleeve

STHTV (8 items)
CPU: 2500k · Motherboard: asrock z77e-itx · Graphics: gtx 650 ti boost · RAM: gskill ripjaws x
Hard Drive: adata sx900 256 · OS: antec khuler · Monitor: Sony Bravia · Case: silverstone sg05
post #1058 of 4310
Quote:
Originally Posted by Heracles View Post

Hmmm but Vsync adds input lag....
I did say you can use either a fps_max setting OR the VSYNC ON setting OR the Adaptive VSYNC setting.

For example, you can use VSYNC OFF, and use the fps_max setting.

Also, don't forget that strobing sometimes adds more input lag than fully framerate-locked VSYNC ON (no framerate slowdowns) does. And VSYNC ON isn't necessarily _always_ evil -- we had Super Mario Brothers (VSYNC ON) and Mortal Kombat (VSYNC ON) -- Nintendo games were always VSYNC ON back in the 1980's -- yet we never complained about input lag on those consoles. VSYNC ON only became evil with 3D gaming because of framebuffers and the way things are architected nowadays. Some people play at fps_max 59 during VSYNC ON 60Hz, or use high-framerate triple-buffered VSYNC ON, to bring input lag in Counter-Strike down to almost the same as VSYNC OFF. It's funny how VSYNC ON got a bad reputation, when it's really not that evil of a beast, if the beast is properly understood & tamed.

Meanwhile, G-SYNC looks like VSYNC ON, but behaves like VSYNC OFF in input lag -- I'm glad the beautiful VSYNC ON look-and-feel is coming back (in a reborn variant called G-SYNC), without the input lag and stutter issues. A few days ago, I measured the input lag of G-SYNC for the recent Blur Busters article (Preview #2), and found nearly zilch difference from VSYNC OFF, especially if you don't bang against the G-SYNC frame cap...
Edited by mdrejhon - 1/15/14 at 8:06pm
post #1059 of 4310
Uh..... all this talk makes me want to just find an old CRT and get back to using it for gaming. I want less motion blur than what I currently have, and this display will offer that, but then I won't be using G-Sync, so why pay for it?

I guess it would be nice if they offered a version of this display without G-Sync, at a slightly lower cost to compensate; I wouldn't have much issue maintaining 85+ FPS with two 780 Tis. For some reason, in my excitement over this display, I thought they had managed to get a variable strobe rate in with it all.

I kind of wish someone would start making CRTs again and selling them. I would imagine with current tech and materials they could get the footprint down a little bit from the old days.
Edited by PostalTwinkie - 1/15/14 at 8:12pm
Scavenged (18 items)
CPU: Intel Core i5 2500K @ 4.6Ghz · Motherboard: MSi Z77 MPower · Graphics: Crossfire 7970 · RAM: 8GB Patriot @ 1600Mhz
Hard Drive: Mushkin Enhanced Chronos 128GB SSD · Hard Drive: Seagate Barricuda 250G 7200RPM · Optical Drive: LG DVD-RW & Lite-On DVD-ROM · Cooling: Noctua NDH
OS: Windows 7 Home Premium 64-bit · Monitor: Achieva Shimian QH270-Lite · Monitor: Overlord Computer Tempest X27OC · Keyboard: Filco Majestouch 2 Ninja
Power: Seasonic X-1250 · Case: NZXT Phantom · Mouse: Razer Naga · Mouse Pad: Razer Goliathus Alpha
Audio: AKG K702 65th Anniversary Edition · Audio: Creative Sound Blaster Zx
post #1060 of 4310
Quote:
Originally Posted by SeeThruHead View Post

Would it be possible for the monitor to receive the amount of time it took the gpu to render the frame and then adjust the length of the strobe accordingly? To maintain a consistent brightness?
There's some information in Electronics Hacking: Creating a Strobe Backlight about this theoretical possibility.
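As a rough sketch of the idea in that question: scale the strobe pulse with the frame interval so the duty cycle, and therefore average brightness, stays constant at variable refresh. The 10% duty figure below is an arbitrary illustrative value, and longer pulses at low framerates would trade back some motion blur:

```python
def strobe_length_ms(frame_interval_ms: float, duty: float = 0.10) -> float:
    """Strobe pulse length that keeps average brightness (the duty
    cycle) constant regardless of how long the frame took to render.
    Note: longer pulses mean more sample-and-hold motion blur, so
    low-framerate frames get blurrier as well as flicker-free."""
    return duty * frame_interval_ms

# Fast frame (144 fps) -> short pulse; slow frame (45 fps) -> longer pulse
print(strobe_length_ms(1000 / 144))  # ~0.69 ms
print(strobe_length_ms(1000 / 45))   # ~2.22 ms
```

The brightness stays constant, but motion clarity now varies with framerate, which is one reason variable-rate strobing is a genuinely hard problem rather than a simple firmware tweak.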