Originally Posted by Mand12
But even when this happens, because it's a G-Sync 120 Hz, as you drop in framerate from 120 to 110 to 100 to 90 as the system ages, you're still having an amazing, smooth, crisp experience, all the way down to the 40s.
Your tolerance for framerate will determine how often you upgrade, but the point is that this isn't only a high-end monitor - it is a high-end monitor, but it scales downward to lower FPS better than anything else in the world.
Framerate tolerance is true no matter what screen you have... I usually upgrade the moment I can't play a AAA title at its fullest with a smooth experience (and I research to make sure it's NOT an isolated issue - I remember building this rig with the original 570 in it around the time Dragon Age II came out, firing it up and getting 20fps; that turned out to be an engine issue with nVidia cards that a patch corrected). I also tend to consider whether a particular engine is a good example of current gen; if it's a decently well-optimized engine/what have you and I'm not getting the FPS I desire at the settings I want, I start looking to see if anything new is going to give me enough of a jump to be worth investing in.
If yes, I invest and enjoy my new GPU hardware. In general my cycle is roughly 3 years when I do it right, 1.5-2 if I do it wrong, and 3 weeks if I try AMD /rimshot
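Not to belabor the point, but here's roughly why variable refresh holds up so well at lower framerates. Quick Python sketch, purely illustrative - the fixed 60 Hz double-buffered vsync panel is just my assumed comparison point, nothing to do with the Swift's own spec:

# Simplified model: fixed-refresh vsync vs. variable refresh (G-Sync).
# With a fixed panel and vsync, a finished frame waits for the next refresh
# boundary, so the displayed rate snaps down in steps; with G-Sync the panel
# refreshes the moment the frame is ready.
import math

FIXED_REFRESH_HZ = 60  # assumed fixed-refresh baseline for comparison

def shown_fps_fixed_vsync(render_fps):
    """Displayed fps on a fixed-refresh panel with double-buffered vsync."""
    refresh_period = 1.0 / FIXED_REFRESH_HZ
    refreshes_waited = math.ceil((1.0 / render_fps) / refresh_period)
    return 1.0 / (refreshes_waited * refresh_period)

def shown_fps_gsync(render_fps):
    """With variable refresh the panel just follows the render rate."""
    return render_fps

for fps in (120, 90, 60, 45, 40):
    print(f"{fps:>3} fps rendered -> fixed 60 Hz vsync shows {shown_fps_fixed_vsync(fps):.0f} fps, "
          f"G-Sync shows {shown_fps_gsync(fps):.0f} fps")

Rendering at 45 fps snaps all the way down to 30 on the fixed panel, while G-Sync just shows 45 - that's the "scales downward better than anything else" part in practice.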
Originally Posted by Leviathan25
My feeling is that after this year, the industry is going to push towards 4k gaming. Video cards will rise to try to support this technology, because otherwise the monitors will not sell. A 1440p monitor would then get caught and "passed by" in their efforts to deliver single graphics cards that can do 4k @ 60hz.
Or the monitor makers won't care and will just push 4K anyway. 4K is going to be the new "Full HD", and EVERYONE who knows nothing about computers except what marketing tells them will want it... 4K will be everywhere; it's up to nVidia and AMD, more than the panel makers, to make sure we gamers have the power when that time comes.
And monitor makers might continue to push out existing 1440p screens; they just might not update them or release new models unless the market is good for it. While I think there's a large untapped market for the RoG Swift, I'm not sure the market would support, say, Eizo, BenQ, and Asus all fighting over it...
Originally Posted by Perfect_Chaos
I can indeed tolerate fps dips down to around 40-45fps; I really start to feel it once it goes lower than that.
Good point, and hopefully this is true; people with lower-resolution monitors will then have no problems. I'm currently running a Samsung 120Hz 1080p screen at 27", so the resolution bump should be perfect at this screen size. For games this one is great, but finer things like text aren't as clear (or as full?) as they could be; the bump to 1440p would probably cure that.
Yeah, 1440p will make the text sharper. I looked at 27" 1080p monitors and just went meh; at your average viewing distance everything's just too... big... Good if you're older, I suppose, and want the windows large, but for someone like me who wants lots of screen real estate, it's not ideal...
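If you want numbers for the text-sharpness bit, here's a quick pixel-density sketch (Python, just back-of-the-napkin math on a 27" diagonal and 16:9 resolutions):

# Pixel density for a 27" panel at 1080p vs 1440p.
# PPI = diagonal resolution in pixels / diagonal size in inches.
import math

DIAGONAL_INCHES = 27.0

def ppi(width_px, height_px, diagonal_in=DIAGONAL_INCHES):
    """Pixels per inch for a given resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
    print(f'{name}: {ppi(w, h):.1f} PPI at {DIAGONAL_INCHES:.0f}"')

Works out to roughly 82 PPI vs 109 PPI - about a third more pixel density at the same screen size, which is exactly why text stops looking so big and fuzzy.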
Originally Posted by Aemonn
I've got no evidence aside from my gut feeling, but the bump of memory for the 780 and 780 Ti means to me that Nvidia wants to keep up the sales momentum and give people something to continue buying while they delay the release of the 800 series.
Why would they saturate the market with a new/revised GPU and then promptly undercut it (technology-wise) with a new architecture a few months afterward? I mean, I can see it happening, but I also see it as possibly taking some of the attention away from the 800 series.
I predict we'll see the 880 sometime after September, possibly later unless AMD forces the move. Still, I'd wait for it (and am).
Isn't it just EVGA/card makers doing 6GB 780 Tis? This might not be anything more than a midlife bump to the 780 Ti to clear out remaining 780 Ti chips...