
NVIDIA G-SYNC Display +VG248QE  

post #1 of 7
Thread Starter 
This is very exciting news for everyone who owns this monitor or is looking to buy one thumb.gif
Quote:
Later this year, our first G-SYNC modules will be winging their way to professional modders who will install G-SYNC modules into ASUS VG248QE monitors, rated by press and gamers as one of the best gaming panels available. These modded VG248QE monitors will be sold by the modding firms at a small premium to cover their costs, and a 1-year warranty will be included, covering both the monitor and the G-SYNC module, giving buyers peace of mind.

http://www.geforce.com/whats-new/articles/introducing-nvidia-g-sync-revolutionary-ultra-smooth-stutter-free-gaming

EDIT: Don't know how I missed this (@ mods: you can delete this thread since there is already another one open in the news section ... sorry)
http://www.overclock.net/t/1435176/nvidia-announces-the-revolutionary-g-sync
Edited by coolhandluke41 - 10/18/13 at 10:26am
    
CPU: i7 4930K | Motherboard: Rampage IV Gene | Graphics: SLI Evga GTX 780 Classy | RAM: 16Gb TeamX 2400c9
Hard Drive: Samsung 840 Pro / Hitachi Deskstar 1TB | Cooling: H2O (NexXxoS 420+280) | OS: W7 x64 | Monitor: VG248QE
Keyboard: Logitech G110 | Power: SuperNOVA 1300 G2 | Mouse: G502 Proteus Core | Audio: Foobar2K/WASAPI > NuForce Icon HDP > Crack with S...
post #2 of 7
Well it certainly sounds interesting. I'm curious as to how much it will cost.

While it's good to hear that the top dogs in the industry think highly of it, I'm most interested in hearing what the invited CS:GO and TF2 players have to say about it.

I wonder if it will work with BenQ models too, like my XL2420T, and how much tinkering is involved to "install" it.
Rig: 2600k (18 items)
CPU: 2600k | Motherboard: Asus P8Z68-V/GEN3 | Graphics: GTX 460 SC | RAM: G.SKILL Ripjaws Z
Hard Drives: Crucial M4, Crucial M4, OCZ Agility 3, Seagate
Cooling: Corsair H100 | OS: Win7 | Monitors: BenQ XL2420T, Qnix QX2710
Keyboard: Leopold Otaku | Power: Corsair AX850 | Case: Corsair Obsidian 650D | Mouse: Zowie EC1 eVo & Razer Mamba
Mouse Pad: SteelSeries QCK+ & SteelSeries Experience I-2 | Audio: Harmon Kardon 2.1 & Logitech G930
post #3 of 7

New monitors will be required for nVidia G-Sync.
It is a variable-refresh rate technology (asynchronous monitor refreshing!).
Refresh rates are no longer a discrete schedule with nVidia's G-Sync.

Blur Busters has commented on its pros:
http://www.blurbusters.com/nvidia-g-sync-variable-refresh-rate-monitors/

AnandTech has the best explanations (great screenshots of nVidia's PowerPoint presentation):
http://www.anandtech.com/show/7432/nvidia-montreal-event-live-blog

Pros:
* The pros of VSYNC ON combined with the pros of VSYNC OFF -- best of both worlds
* Lower input lag at all framerates
* Eliminates stutters during varying framerates
* Eliminates tearing
* Varying framerates now look much better

Interesting Behavior:
* Display motion blur now becomes directly tied to framerate (in non-strobed mode). Sample-and-hold motion blur is proportional to how long each frame stays on screen, so it gradually shrinks the higher the framerate goes, up to a certain limit (144Hz). Just as 120fps@120Hz has half the display motion blur of 60fps@60Hz, you now get a continuously variable amount of display motion blur, all the way up to the display's maximum framerate/refresh rate. It's like displays finally got a CVT (continuously variable transmission) that runs at all times while you play a game, instead of "gears" (60Hz, 75Hz, 85Hz, 100Hz, 120Hz) that require you to pause the game to switch. (A quick numeric sketch of the blur scaling follows below.)
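Here is that back-of-envelope sketch (my own illustration; the 960 px/s motion speed is an arbitrary example, and it assumes a full-persistence, non-strobed display and steady eye tracking of the moving object):

Code:
# Sample-and-hold motion blur is proportional to frame persistence (how long
# each frame stays on screen), so with a variable refresh rate the blur trail
# shrinks continuously as the framerate rises. Assumes a non-strobed display.

def sample_and_hold_blur_px(framerate_fps, motion_speed_px_per_sec):
    frame_period_sec = 1.0 / framerate_fps
    return motion_speed_px_per_sec * frame_period_sec

speed = 960  # pixels/second of on-screen motion (arbitrary example value)
for fps in (60, 85, 100, 120, 144):
    print(f"{fps:>3} fps -> ~{sample_and_hold_blur_px(fps, speed):.1f} px blur trail")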

Cons:
* Motion blur won't be better than LightBoost. At best, it's similar to 144Hz.
..... (until the G-Sync max framerate limit is raised, e.g. future 240Hz/480Hz monitors)
* nVidia Lock-in (which may not be a problem for some)

Wishlist:
* LightBoost combined with G-Sync. Variable-rate strobing is reasonably practical above a certain frame rate (it requires ultra-precise strobe modulation to prevent brightness undulations during variable frame rates).
---OR---
* A variable refresh rate monitor with a higher frame rate limit than 144Hz, for PWM-free, flicker-free, LightBoost-like motion clarity.
This is harder because flicker-free LightBoost-like clarity won't occur until approximately 400fps@400Hz (and up), and current LCD panels cannot yet be refreshed at that frequency (see the quick arithmetic below).
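A quick back-of-envelope calculation behind that 400Hz figure (my own arithmetic; the ~1.4-2.4 ms LightBoost persistence figures are approximate):

Code:
# Persistence (ms) of a non-strobed, sample-and-hold display is simply one
# frame period. Compare against approximate LightBoost strobe lengths
# (~1.4 ms at LightBoost=10%, ~2.4 ms at LightBoost=100% -- rough figures).
for hz in (144, 240, 400, 480):
    persistence_ms = 1000.0 / hz
    print(f"{hz} Hz sample-and-hold -> {persistence_ms:.2f} ms persistence")
# 144 Hz is still ~6.9 ms; only around 400-480 Hz does persistence drop into
# the ~2.1-2.5 ms range, i.e. LightBoost-like motion clarity without strobing.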

John Carmack actually mentioned combining strobing and G-Sync, so it might eventually be possible too. Blur Busters approves of the G-Sync methodology. In the long term, it's a good stepping stone toward tomorrow's "Holodeck" (unlimited-refresh-rate displays that no longer require the "CRT bandaid" of strobing to eliminate motion blur).
Edited by mdrejhon - 10/18/13 at 9:36pm
post #4 of 7
Photographs from the liveblog are very self-explanatory for the technologically minded (the Blur Busters Squad fully understands, as do display engineers and people with a good understanding of displays).

[Photographs from the AnandTech liveblog]

Variable Refresh Rate Monitors (e.g. G-Sync) let the monitor synchronize to the variable framerate of the game, completely eliminating stutter, completely eliminating tearing, and keeping input lag low.
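To make the stutter part concrete, here is a toy comparison (my own illustration, not nVidia code; the render times are hypothetical) of when frames become visible on a fixed 60Hz VSYNC schedule versus a variable-refresh display that refreshes the moment each frame finishes rendering:

Code:
import math

render_times_ms = [14.0, 18.5, 22.0, 16.0, 25.0, 15.5]  # hypothetical per-frame render times

def present_fixed_vsync(render_times, refresh_ms=16.7):
    # Each frame waits for the next fixed refresh slot -> uneven on-screen cadence (stutter).
    t, shown = 0.0, []
    for r in render_times:
        t += r                                                # frame finishes rendering
        shown.append(math.ceil(t / refresh_ms) * refresh_ms)  # snapped to next refresh
    return shown

def present_variable_refresh(render_times):
    # The display refreshes whenever a frame is ready -> cadence matches the game.
    t, shown = 0.0, []
    for r in render_times:
        t += r
        shown.append(t)
    return shown

for name, shown in (("Fixed 60Hz VSYNC", present_fixed_vsync(render_times_ms)),
                    ("Variable refresh", present_variable_refresh(render_times_ms))):
    gaps = [round(b - a, 1) for a, b in zip(shown, shown[1:])]
    print(f"{name:17}: on-screen frame-to-frame gaps = {gaps} ms")

With fixed VSYNC, one of the gaps doubles to two refresh periods (a visible stutter) even though the game rendered every frame; with variable refresh, the on-screen cadence simply follows the game's own frame times.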

And, technologically, it's theoretically strobe-compatible (with special LightBoost modifications), assuming a minimum framerate to avoid repeat strobes. I'm not sure whether variable-rate strobing will be included as a feature (yet), but the fact that a few big names such as John Carmack have already talked about it is damn exciting to me as Chief Blur Buster. Clever engineering of the strobe length allows variable strobing to look unnoticeable (at least above the flicker fusion threshold), with no brightness or flicker undulations, but it's quite difficult monitor engineering (e.g. Electronics Hacking: Creating a Strobe Backlight shows the complex engineering that goes into high-efficiency strobe backlights).
post #5 of 7
I have quickly come up with a new idea for combining PWM-free operation with LightBoost, while having G-Sync:
New Section Added to "Electronics Hacking: Creating a Strobe Backlight"

To the best of my knowledge, no patents exist on this, and not even John Carmack appears to have mentioned this in his twitch.tv video when he mentioned combining LightBoost with G-Sync. So I'm declaring it as my idea of a further improvement to nVidia G-Sync:
Quote:
From: New Section in "Electronics Hacking: Creating a Strobe Backlight"

With nVidia’s G-Sync announcement, variable refresh rate displays are now a reality today. Refresh rates can now dynamically vary with frame rates, and it is highly likely that nVidia has several patents on this already. If you are a monitor manufacturer, contact nVidia to license this technology, as they deserve kudos for this step towards tomorrow’s perfect Holodeck display.

However, one additional idea that Mark Rejhon of Blur Busters has come up with is a new creative PWM-free-to-strobing dynamic backlight curve manipulation algorithm, that allows variable-rate backlight strobing, without creating flicker at lower frame rates.

It is obvious to a scientist/engineer/vision researcher that to maintain constant perceived brightness during variable-rate strobing, you must keep the strobing duty cycle percentage constant when varying the strobe rate. This requires careful and precise strobe-length control during variable-refresh operation, as the display now refreshes dynamically on demand rather than at discrete scheduled intervals. However, a problem occurs at lower framerates: strobing at low refresh rates causes uncomfortable flicker.

Mark Rejhon has invented a solution: dynamic shaping of the strobe curve, from PWM-free mode at low framerates all the way to square-wave strobing at high framerates. The monitor backlight runs in PWM-free mode at low refresh rates (e.g. 30fps@30Hz, 45fps@45Hz), gradually transitions to soft gaussian/sine-wave undulations in backlight brightness (bright-dim-bright-dim) at 60fps@60Hz, and the curves become sharper (fullbright-off-fullbright-off) as framerates head higher, toward 120fps@120Hz. At the monitor's maximum framerate, the strobing more closely resembles a square wave with large, totally-black gaps between strobes.

Example:
10fps@10Hz — PWM-free backlight
30fps@30Hz — PWM-free backlight
45fps@45Hz — PWM-free backlight
60fps@60Hz — Minor backlight brightness undulations (bright / dim / bright / dim)
80fps@80Hz — Sharper backlight brightness undulations (very bright / very dim)
100fps@100Hz — Starts to resemble rounded-square-wave (fullbright / fulloff)
120fps@120Hz and up — Nearly square-wave strobing like original LightBoost

This would be a dynamically variable continuum all the way in between, too, much like an automobile CVT instead of discrete gears in a transmission. You avoid flicker at lower frame rates, and you get the full strobing benefits at higher frame rates.

Simpler algorithm variations are also possible (e.g. keeping a square wave and using only pulse-width / pulse-height manipulation to achieve the blending effect, without curve-softening). This is included as part of my general idea of blending from PWM-free at lower refresh rates to strobing at higher refresh rates. The trigger framerates may differ from the example above (or may even be adjustable via a user flicker-threshold setting), but the concept is the same.

If nVidia or any monitor manufacturer uses this idea (and no patent application dated before October 19, 2013 covers it), please give Mark Rejhon / Blur Busters appropriate credit. It is recognized that nVidia has several patents, but none appears to cover this additional improvement, suggested here for combining strobing with variable refresh rates. As of this writing, prior-art research is underway to determine whether anyone has previously considered dynamically blending from PWM-free to square-wave strobing. If anyone else already came up with this idea and documented it in a patent application prior to October 19, 2013, please let me know and due credit will be given here.
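Purely to illustrate the concept above, a minimal sketch (my own toy code; the 50/120 fps trigger points, the cosine undulation shape, and the 25% duty cycle are arbitrary assumptions, not a real backlight controller) of a backlight waveform that blends from PWM-free output at low framerates to square-wave strobing at high framerates, while rescaling so perceived brightness stays constant:

Code:
import math

def backlight_waveform(framerate_fps, duty_cycle=0.25, samples=16):
    """One refresh cycle of relative backlight brightness (list of samples).

    <= ~50 fps : flat, PWM-free output (no flicker)
    ~50-120 fps: progressively deeper bright/dim undulation
    >= ~120 fps: approaches a square-wave strobe (LightBoost-like)
    """
    blend = min(max((framerate_fps - 50.0) / 70.0, 0.0), 1.0)  # 0 = flat, 1 = square strobe
    waveform = []
    for i in range(samples):
        phase = i / samples
        flat = 1.0                                          # PWM-free level
        soft = 0.5 + 0.5 * math.cos(2 * math.pi * phase)    # gentle undulation
        hard = 1.0 if phase < duty_cycle else 0.0           # square strobe pulse
        if blend < 0.5:
            k = blend / 0.5          # first half of the blend: flat -> soft undulation
            waveform.append((1 - k) * flat + k * soft)
        else:
            k = (blend - 0.5) / 0.5  # second half: soft undulation -> square strobe
            waveform.append((1 - k) * soft + k * hard)
    # Rescale so every mode has the same average output, i.e. constant
    # perceived brightness (constant effective duty cycle) at every framerate.
    mean = sum(waveform) / samples
    return [v * duty_cycle / mean for v in waveform]

for fps in (30, 60, 80, 120):
    print(f"{fps:>3} fps:", " ".join(f"{v:.2f}" for v in backlight_waveform(fps)))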

Edited by mdrejhon - 10/19/13 at 1:00am
post #6 of 7
Thread Starter 
Thanks mdrejhon thumb.gif
    
post #7 of 7
Since there is a news thread on this, and per the thread starter's request in the OP, I will lock this thread.

http://www.overclock.net/t/1435176/geforce-nvidia-announces-the-revolutionary-g-sync
     
This thread is locked  