Originally Posted by Whitespider999
Hey hypermatrix, one thing I have noticed in the few years of having a 2560x1440 screen is the lack of understanding from the general gaming community (not the rare elites) about what more pixels actually translate to.
This is in Australia, perhaps it's true in other countries as well.
To elaborate, a large number of gamers with sub-1920x1080 monitors don't really see the big deal in getting a higher resolution, and it's rarely money holding them back. I understand this sentiment, as I had a 1920x1200 monitor for years prior to this upgrade.
Console devotees are just a more extreme version of this. And they own many-thousand-dollar TVs running on hardware that's about as productive as a 200 dollar PC conjured up with spare parts from some deranged factory with purple heroin needles on the ground and stray mutant children running around mumbling things about silk stockings and magic toads.
The end result kind of leaves me understanding why Microsoft and Sony have comfortably held back on a newer console, why higher resolution output has not really evolved, and why new methods of transferring high resolution signals and high refresh rates have not been expanded on or truly evolved in the span of a teenager's life.
If someone is happy with something, then throw in the occasional improvement/bone (The Witcher 2 on console looks better than a first-gen Xbox game, the GTX 580 is incrementally better than the GTX 280, etc.), but don't ever give them something that would actually change the very nature of the scales.
Obviously, I am not a hardware engineer, so there might be something that I am ignorantly omitting. But that's how it sometimes seems from my standpoint.
Few things. The most important of which (imo) is regarding consoles. Consoles actually do quite well. They provide better graphics per MHz, or better graphics per "performance mark." The reason is that every console is identical, so games can be heavily optimized. When someone makes a PS3 game, they know exactly how every single PS3 will perform, and they build the game around it accordingly. A PC developer, on the other hand, has to cover 5+ years of hardware and make sure the game runs across all of it, plus 2 large video card manufacturers with their differing technologies (e.g. PhysX and TXAA from Nvidia), plus onboard graphics like Intel's chips.
Then we take a look at a game like, say...Modern Warfare 3. Blockbuster hit. Not the prettiest graphics, but a $250 PS3 can play it better than a $500 PC can. So you have to hand it to the PS3. But at the same time, the Xbox/PS3 have held back the quality of PC games as well, because developers have toned down their system requirements so that games port easily to the consoles (e.g. Crysis 2, as opposed to Crysis 1). But despite all this...most console games don't even run at 720p, which is just a quarter of the resolution of these monitors. That explains a lot of the performance.

Now, when it comes to resolution...it's actually a lot tougher than you might think. Apple is pushing hard to expand resolution with all their talk of "retina" displays. The iPad's 2048x1536 resolution, at 3.14 MILLION pixels, has roughly 50% more pixels than your 1080p TV. But it's hard, because if you make a product that people don't know/think they want, you lose money. So only someone like Apple is ballsy enough to start a trend like this.
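If you want to show someone the raw pixel-count math behind those comparisons, here's a quick sketch. It's nothing but arithmetic on the standard resolutions mentioned above; no measured data involved:

```python
# Pixel-count comparison: plain arithmetic on standard resolutions.
resolutions = {
    "720p (typical console target)": (1280, 720),
    "1080p (your TV)":               (1920, 1080),
    "iPad 'retina'":                 (2048, 1536),
    "1440p (these monitors)":        (2560, 1440),
}

base = 1920 * 1080  # 1080p as the reference point
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:30s} {pixels / 1e6:5.2f}M pixels  ({pixels / base:.2f}x of 1080p)")
```

Run that and you'll see 720p is less than half of 1080p, while 1440p is getting on towards double it.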
Now let's talk about bandwidth for a second. A 2560x1440 picture is 3.7 million pixels. At a standard 60 frames per second, that means 221 million individual pixel signals per second. So whatever the interface is, it has to be able to dictate a specific colour to a specific pixel, at a rate of 221 million...every second. Let's put that in a size format we understand. If you treat uncompressed video as 30 individual pictures per second (the same way your monitor does), that works out to roughly 330 MB per second. Or, at the 120Hz some of us are running it at, about 1.3 gigaBYTES (not bits) per second. Meaning a terabyte hard drive's worth of video signal wouldn't even be enough for 15 minutes. Compare that to a 1080p Blu-ray encode doing 2 hours in under 20GB. So what ends up being the issue is that it takes a "lot" of bandwidth to push out this type of resolution. And because the DVI standard is designed around copper cables, the bandwidth requirement of higher resolutions/refresh rates starts to exceed what the copper link itself can handle.
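For anyone who wants to check my numbers, here's the same back-of-envelope math as a sketch. I'm assuming 24-bit colour (3 bytes per pixel) and ignoring blanking/protocol overhead, so real links need a bit more than this:

```python
# Uncompressed video bandwidth, assuming 24-bit colour (3 bytes/pixel), no overhead.
def raw_bandwidth(width, height, refresh_hz, bytes_per_pixel=3):
    """Bytes per second needed to redraw every pixel, every frame."""
    return width * height * refresh_hz * bytes_per_pixel

for hz in (30, 60, 120):
    bps = raw_bandwidth(2560, 1440, hz)
    minutes_per_tb = 1e12 / bps / 60
    print(f"2560x1440 @ {hz:3d}Hz: {bps / 1e6:7.1f} MB/s  "
          f"(~{minutes_per_tb:.0f} minutes of signal per 1TB drive)")
```

At 120Hz it comes out to roughly 1.3 GB/s, which is why a terabyte drive only buys you around 13 minutes of raw signal.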
So...why not make a new/better port? Well...each revision of HDMI, for example, has worked on that. DisplayPort has increased it and will do so even further. But you have to remember that things have to come down to a consumer-friendly price point before anyone makes them. And that's where it gets expensive. Because as far as the technical side goes...yeah, make your DVI cables out of pure gold. That'll help up that resolution and refresh rate. =D And then on the other end of it are the LCDs themselves. Miniaturization takes time. To make a higher resolution, you have to be able to manufacture a panel with, for example, 8.8 million pixels a la 4K. That's no easy task. And let's say that could be done. Games that required SLI to run at full 1080p (2 million pixels) would now have to process 4.4x more data. And as you mentioned yourself...graphics processing power never goes up that much.
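To give a rough sense of why the port matters, here's one more sketch. The link capacities below are approximate data rates from memory (single/dual-link DVI, HDMI 1.4, DisplayPort 1.2), and I'm again ignoring blanking overhead, so treat the whole thing as ballpark only:

```python
# Rough link-budget check: does the raw video payload fit on a given interface?
# Capacities are approximate usable data rates from memory -- ballpark figures only.
LINKS_GBPS = {
    "single-link DVI": 3.96,
    "dual-link DVI":   7.92,
    "HDMI 1.4":        8.16,
    "DisplayPort 1.2": 17.28,
}

def payload_gbps(w, h, hz, bits_per_pixel=24):
    """Raw video payload in Gbit/s, with no blanking or protocol overhead counted."""
    return w * h * hz * bits_per_pixel / 1e9

for hz in (60, 120):
    need = payload_gbps(2560, 1440, hz)
    fits = [name for name, cap in LINKS_GBPS.items() if cap >= need] or ["none of these"]
    print(f"2560x1440 @ {hz}Hz needs ~{need:.1f} Gbit/s -> {', '.join(fits)}")
```

By this rough math, 1440p at 60Hz fits comfortably on dual-link DVI, but 120Hz blows past it and really wants something like DisplayPort.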
Which brings us to video cards...other than optimizations and new technologies/etc...the biggest jumps happen with...again...miniaturization. Nvidia actually did a very good job this round. If you compare their old card to their new card, you're going to say...ok, it's only 40% more performance...but what you're not realizing is, it's 40% more performance on a much smaller die. The old 580, for example, had a die roughly 68% larger than the current 680's! So the new one is substantially smaller, and still up to 40% faster! Nvidia realized this, and intentionally held the power back here.
They had too big of a jump and trumped AMD pretty hard. But even if that extra 68% of die area translated to 68% more performance (it doesn't really work that way), it still wouldn't be enough to drive the higher-end displays. So Nvidia doesn't bother trying to.
So at the end of the day...it's a long cycle of displays, video cards, software, interface standards, etc...all waiting to see who makes the jump first, whether it's profitable, and whether they can even keep up with it before they try.
As for Microsoft and Sony, yeah. They'll take their time. The longer they wait, the better the performance they can put in their new consoles. If they were to build a system today...with a $250 price point, or even a $400 price point...honestly, they'd be hard pressed to get more than maybe 50% increased performance over their old product. And even that's being optimistic, and mostly relying on having some decent RAM for once. They'd rather wait a little until they can build a system that can put out full 1080p with good performance, so at least they'll cover the nearly 100% of households that own 1080p TVs now.
Why would I spend 6k on my computer system though, for example? Why do people SLI? And why is a 120Hz 2560x1440 monitor beneficial? Well...in FPS's, that's a huge edge. The higher refresh rate lets you track people better. Larger screen + higher resolution means you can spot that pesky sniper way in the back of the map, instead of it being a giant blur. There are tons of advantages. But there simply isn't the processing power to handle it. And I've given up convincing people of the differences in resolution after I had people tell me they can't tell the difference between DVD and Blu-ray. I've given up on humanity in that regard. But...here I am, enjoying my monitor. Beating 1080p'ers with their 60Hz monitors in FPS's. Never been happier. And I could never go back to 60Hz.
And yes, I honestly feel like I have a huge edge and my scores in MW3 reflect it.
That...was long. Ok. I'm done.
P.S. Regarding The Witcher 2: that is a terrible game when it comes to graphics. It's so inefficient, and looks terrible! I mean, I'd show it off as a masterpiece of DX9...but DX11 games blow it out of the water.
Also...the difference between 1080p and 1440p? It's easy to show why it's better. Tell them this. (Obviously click the pics to view them in a larger format, though sadly still not at full 2560x1440; not sure why it shrinks the large one.)
This is what you see:
And this is what I see:
Which one would you rather be looking at if you were playing online right now?