Overclock.net › Forums › Graphics Cards › NVIDIA › Will Maxwell be postponed? UPDATE: G-SYNC & GTX 780Ti ANNOUNCED AT NVIDIA'S MONTREAL CONFERENCE!

Will Maxwell be postponed? UPDATE: G-SYNC & GTX 780Ti ANNOUNCED AT NVIDIA'S MONTREAL CONFERENCE!

Poll Results: Do you believe Nvidia will postpone their Maxwell architecture?

Poll expired: Apr 7, 2014  
  • 35% (6)
    Yes! I believe Nvidia will postpone.
  • 29% (5)
    Yes! Nvidia will probably re-brand many of their current models. Top-tier models may be a revised version of Kepler.
  • 29% (5)
    No! Nvidia may re-brand select Kepler GPUs. The top-tier models will feature Nvidia's Maxwell architecture.
  • 5% (1)
    No! Nvidia will deliver as I expected!
17 Total Votes  
post #1 of 9
Thread Starter 
AMD's re-branding scheme was not what I had hoped for. Other than the R9 290 and R9 290X, we are stuck choosing between AMD's older series of GPUs. It is not too big a deal breaker; the R9 280X has more than enough processing power for a lot of gamers, and 4K monitors are still relatively new and impractical for average users on a modest budget. I have a feeling Nvidia will postpone their newest architecture until late 2014, potentially early 2015. I would rather it not happen, of course. Do you think they will? This thread is not meant for a versus discussion. Please remember this post is mere speculation.


___
Edited by Geek Branden - 10/18/13 at 10:43am
post #2 of 9
Thread Starter 
NEW!

I am watching day two of Nvidia's Montreal conference right now. They announced G-Sync (not a V-Sync variant) for Kepler GPUs. An additional proprietary PCB is added to the monitor so that a Kepler or newer GPU can control it. One of the main features is keeping the monitor's refresh rate synchronized with the GPU's frame rate, which eliminates the tearing you see with traditional V-Sync turned off and the lag you get with it turned on. Another benefit of letting the GPU control the monitor is that it can improve color accuracy. Monitor manufacturers are already lined up to add this to future models; it will not work with existing ones. I very highly doubt this would work with AMD Radeon GPUs. They also announced a new Kepler GPU: the GeForce 780 Ti! I apologize for the vague details; G-Sync took me by complete surprise. Stay tuned for more specific details elsewhere on the web!
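To make the V-Sync comparison concrete, here is a rough sketch (not Nvidia's actual implementation; all numbers are illustrative) of why a fixed-refresh panel with V-Sync adds wait time that a variable-refresh scheme like G-Sync avoids:

```python
import math

REFRESH_INTERVAL_MS = 1000 / 60  # fixed 60 Hz panel period (~16.67 ms)

def vsync_display_time(frame_done_ms):
    """With V-Sync on, a finished frame waits for the next fixed refresh tick."""
    ticks = math.ceil(frame_done_ms / REFRESH_INTERVAL_MS)
    return ticks * REFRESH_INTERVAL_MS

def gsync_display_time(frame_done_ms):
    """With variable refresh, the panel scans out as soon as the frame is ready."""
    return frame_done_ms

# Illustrative render times for four consecutive frames
frame_times = [12.0, 20.0, 25.0, 14.0]
done = 0.0
for ft in frame_times:
    done += ft
    wait = vsync_display_time(done) - gsync_display_time(done)
    print(f"frame ready at {done:5.1f} ms, V-Sync adds {wait:5.2f} ms of wait")
```

The added wait is what shows up as input lag and stutter when frame rates do not line up with the fixed refresh interval.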


___
Edited by Geek Branden - 10/18/13 at 10:01am
post #3 of 9
I can't stop wondering if there will be a non-reference 780 Ti. :D
post #4 of 9
Thread Starter 
Quote:
Originally Posted by FlankerWang View Post

I can't stop wondering if there will be a non-reference 780 Ti. :D

I am a hundred percent certain there will be non-reference 780 Ti GPUs. They gave few details about it, but it is going to be released in November.



___
Edited by Geek Branden - 10/18/13 at 10:01am
post #5 of 9
Thread Starter 
I am watching Linus's live stream of the conference. It is still going.



___
Edited by Geek Branden - 10/19/13 at 1:21am
post #6 of 9
Thread Starter 
Here is HardwareCanucks' video recording of the GeForce 780 Ti announcement: http://www.youtube.com/watch?v=M1bEJDzft-A
post #7 of 9
I actually tuned into Linus's live stream during the G-Sync presentation.

G-Sync sounds badass. It's about damn time someone did it this way; having a static refresh rate is kind of backward now that you think about it.

I just pray that Asus's 39" 4K monitor has this as a feature, or even as an optional DIY add-on. That would straight up rock and help soften the almost certainly exorbitant price tag Asus is likely to put on said monitor. It's apparently coming out sometime in Q1 of next year, so they have time to add it.
Stryker LGA 2011 (24 items)
CPU: 3930K @ 4.6 GHz & 1.376 V · Motherboard: P9X79 Pro · Graphics: GTX 780 Classified · RAM: G.Skill 16 GB 1866 MHz CL9
Storage: Samsung 850 Pro 1 TB · Corsair Performance Pro 128 GB · Optical: CD/RW, DVD/RW
Cooling: EKWB Supremacy Evo · EK-FC780 GTX Classy Acetal+Nickel waterblock · AlphaCool NexXxos Monsta 240 radiator (front) · XSPC D5 Photon 270 pump/res · XSPC AX240 white (top, push) · XSPC AX240 white (external rear, push)
OS: Windows 7 Premium 64 · Monitor: Panasonic 42" Plasma (60 Hz)
Keyboard: Logitech G15 · PSU: Seasonic Platinum 1000 W · Case: CM Storm Stryker · Mouse: Mad Catz R.A.T. 7 MMO · Mouse Pad: Mad Catz G.L.I.D.E. 7
Audio: Logitech Z5500 speakers · Asus Xonar D2X · FiiO X3K
post #8 of 9
Thread Starter 
Quote:
Originally Posted by SolarNova View Post

I actually tuned into Linus's live stream during the G-Sync presentation.

G-Sync sounds badass. It's about damn time someone did it this way; having a static refresh rate is kind of backward now that you think about it.

I just pray that Asus's 39" 4K monitor has this as a feature, or even as an optional DIY add-on. That would straight up rock and help soften the almost certainly exorbitant price tag Asus is likely to put on said monitor. It's apparently coming out sometime in Q1 of next year, so they have time to add it.

I would be tempted more than ever to upgrade to a 4K monitor with G-Sync in it. lol I use my monitor(s) for more than just gaming. I hope Maxwell will push out 7+ teraflops; gaming at such a high resolution right now requires a dual- or tri-780 SLI setup for decent frame rates. A DIY feature is probably never going to happen. I think other manufacturers will hop onto the bandwagon with their own spin-off of G-Sync. Average gamers might see it as a gimmick.


___
Edited by Geek Branden - 10/19/13 at 1:50am
post #9 of 9
Crosspost, because people are asking about combining LightBoost with G-Sync:
Quote:
Originally Posted by Pocatello
Mark,
What is best right now for FPS gaming? Lightboost or G-sync?
Depends.
Variable framerates: use G-Sync.
Constant 120fps@120Hz: use LightBoost.

G-Sync is still limited by 144 Hz motion blur; it would take G-Sync at 400fps@400Hz to achieve flicker-free LightBoost/CRT motion clarity (2.5 ms sample-and-hold length), based on existing motion blur math directly correlating strobe length with motion blur. Frame durations would have to match today's LightBoost strobe lengths: at 2.5 ms, each frame lasts only 1/400th of a second, which requires 400fps@400Hz using nVidia G-Sync. If you want the clarity of LightBoost=10%, you need about 700fps@700Hz (1.4 ms frame duration) and upwards. That's not feasible, so we still need strobing, at least until we have a 1000 Hz LCD (for flicker-free CRT motion clarity). It's surprising that human eyes still see indirect display-induced motion blur (the sample-and-hold effect) even at 240 Hz, 480 Hz, and beyond; human eyes can tell apart LightBoost=10% (~1 to 1.4 ms frame duration) from LightBoost=100% (~2.5 ms frame duration) during motion tests such as www.testufo.com/photo#photo=toronto-map.png&pps=1440. Display motion blur is directly proportional to visible frame duration.
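The arithmetic above follows from blur being proportional to visible frame duration. A tiny sketch of that relationship (the function name is mine, not from the post):

```python
def required_rate_hz(persistence_ms):
    """Refresh/frame rate needed so each flicker-free sample-and-hold frame
    is visible for only `persistence_ms` milliseconds (no strobing)."""
    return 1000.0 / persistence_ms

print(required_rate_hz(2.5))  # 400.0 Hz, matching LightBoost's ~2.5 ms strobe
print(required_rate_hz(1.4))  # ~714 Hz, roughly the "700fps@700Hz" figure
```

This is why strobing remains necessary: halving perceived blur without strobes means doubling the sustained frame rate.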

LightBoost is great at triple-digit framerates. However, we already know LightBoost becomes terrible at lower framerates, and can become very stuttery below triple digits. G-Sync stays smooth during varying framerates; LightBoost does not. The solution is to combine LightBoost AND G-Sync. This solves the problem. However, new problems occur with variable-rate strobing. Fortunately, I've come up with a successful solution.

I've found a way to combine the two. John Carmack did say it was possible in his twitch.tv video, but it didn't appear that anyone had come up with the idea of blending PWM-free operation with LightBoost strobing, an enhancement that I have just come up with:

I have quickly come up with a new idea for combining PWM-free operation with LightBoost, while keeping G-Sync:
New Section Added to "Electronics Hacking: Creating a Strobe Backlight"

To the best of my knowledge, no patents exist on this, and not even John Carmack appears to have mentioned it in his twitch.tv video when he talked about combining LightBoost with G-Sync. So I'm declaring it as my idea for a further improvement to nVidia G-Sync: a workable method of fixing strobing flicker on variable-refresh-rate displays:
Quote:
From: New Section in "Electronics Hacking: Creating a Strobe Backlight"

With nVidia’s G-Sync announcement, variable refresh rate displays are now a reality today. Refresh rates can now dynamically vary with frame rates, and it is highly likely that nVidia has several patents on this already. If you are a monitor manufacturer, contact nVidia to license this technology, as they deserve kudos for this step towards tomorrow’s perfect Holodeck display.

However, one additional idea that Mark Rejhon of Blur Busters has come up with is a creative PWM-free-to-strobing dynamic backlight curve manipulation algorithm that allows variable-rate backlight strobing without creating flicker at lower frame rates.

It is obvious to a scientist/engineer/vision researcher that, to maintain constant perceived brightness during variable-rate strobing, you must keep the strobe duty cycle percentage constant while varying the strobe rate. This requires careful and precise strobe-length control during variable refresh, as the display now refreshes dynamically on demand rather than at discrete scheduled intervals. However, a problem occurs at lower framerates: strobing causes uncomfortable flicker at lower refresh rates.
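The constant-duty-cycle requirement can be sketched in a few lines (the 10% duty figure is an illustrative assumption, not a value from the post): the pulse length must scale with the refresh period so perceived brightness stays constant as the strobe rate varies.

```python
def strobe_pulse_ms(refresh_hz, duty_fraction=0.10):
    """Keep duty cycle constant: the strobe pulse length scales with the
    refresh period, so time-averaged brightness is the same at every rate."""
    period_ms = 1000.0 / refresh_hz
    return duty_fraction * period_ms

for hz in (60, 85, 100, 120, 144):
    print(f"{hz:3d} Hz -> {strobe_pulse_ms(hz):.2f} ms pulse")
```

With a variable-refresh display, the controller cannot precompute these lengths on a fixed schedule; it must derive the pulse from each frame's actual duration as it arrives.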

Mark Rejhon has invented a solution: dynamic shaping of the strobe curve, from PWM-free mode at low framerates all the way to square-wave strobing at high framerates. The monitor backlight runs in PWM-free mode at low refresh rates (e.g. 30fps@30Hz, 45fps@45Hz), gradually transitions to soft gaussian/sine-wave undulations in backlight brightness (bright-dim-bright-dim) at 60fps@60Hz, and the curves become sharper (fullbright-off-fullbright-off) as you head higher in framerates, toward 120fps@120Hz. At the monitor's maximum framerate, the strobing more closely resembles a square wave with large totally-black gaps between strobes.

Example:
10fps@10Hz — PWM-free backlight
30fps@30Hz — PWM-free backlight
45fps@45Hz — PWM-free backlight
60fps@60Hz — Minor backlight brightness undulations (bright / dim / bright / dim)
80fps@80Hz — Sharper backlight brightness undulations (very bright / very dim)
100fps@100Hz — Starts to resemble rounded-square-wave (fullbright / fulloff)
120fps@120Hz and up — Nearly square-wave strobing like original LightBoost

This would be a dynamically variable continuum all the way in between, too, much like an automobile CVT instead of discrete gears. You avoid flicker at lower frame rates, and you get full strobing benefits at higher frame rates.
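The continuum above can be sketched as a blend function. This is my own minimal interpretation of the idea (the thresholds, cosine shape, and 25% strobe window are illustrative assumptions, not values from the post): a "depth" factor grows with refresh rate, blending from a steady PWM-free level to an undulating wave, and snapping to a square-wave strobe at the top.

```python
import math

def backlight_level(phase, refresh_hz, low_hz=45.0, high_hz=120.0):
    """Backlight brightness (0..1) at a given phase (0..1) of a refresh cycle.
    At/below low_hz: steady PWM-free light. At/above high_hz: square-wave
    strobe. In between: sinusoidal undulation whose depth grows with rate."""
    # Blend factor: 0 at/below low_hz, 1 at/above high_hz
    depth = min(max((refresh_hz - low_hz) / (high_hz - low_hz), 0.0), 1.0)
    if depth >= 1.0:
        # Square-wave strobe: short full-bright pulse, then fully off
        return 1.0 if phase < 0.25 else 0.0
    # Soft bright-dim-bright undulation across the refresh cycle
    wave = 0.5 + 0.5 * math.cos(2 * math.pi * phase)
    # Blend from constant PWM-free light (depth=0) toward the undulation
    return (1.0 - depth) + depth * wave

print(backlight_level(0.5, 30))   # steady light at low refresh rates
print(backlight_level(0.5, 120))  # fully-off gap between strobes
```

A real controller would also rescale amplitude to hold perceived brightness constant across the continuum, per the constant-duty-cycle point above.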

Simpler algorithm variations are also possible (e.g. keeping a square wave and using only pulse-width/pulse-height manipulation to achieve the blending effect, without curve-softening). This is included as part of my general idea of blending from PWM-free at lower refresh rates to strobing at higher refresh rates. The trigger framerates may differ from the example above (or may even be adjustable via a user flicker-threshold setting), but the concept is the same.

If nVidia or any monitor manufacturer uses this idea (and no patent application dated before October 19, 2013 covers it), please give Mark Rejhon / Blur Busters due credit. It is understood that nVidia has several patents, but none appears to cover this additional improvement for combining strobing with variable refresh rates. As of this writing, research is being done into prior art, to determine whether anyone has considered dynamically blending from PWM-free to square-wave strobing. If anyone else already came up with this idea and documented it in a patent application prior to October 19, 2013, please let me know and due credit will be given here.