Originally Posted by salokin
Give me a break. Are you seriously arguing resolution is the main impetus for increases in graphics performance? Or that AMD/Nvidia have been slacking in the GPU department because ultra-high-resolution monitors aren't as prolific? I had a WUXGA (1920 x 1200) monitor nearly a decade ago, and now I'm using another WUXGA monitor. I guess graphics performance has stagnated since then, huh? The fact of the matter is, people like good-looking graphics. Whether those graphics are on a 1080p screen or a 720p screen or an 1800p screen is irrelevant. That in and of itself is what drives better graphics performance, as game designers make better-looking **** and we need better cards to play that ****. If you think this resolution will have even a modicum of effect on future GPU evolutions, then I have a bridge to sell you.
What we have here is an unneeded resolution (breaking away from the standard high-end 16:10 resolution of 2560 x 1600) with a way inferior GPU to power said resolution for all but the most mundane of tasks. People are free to disagree, but I see zero point in spending a lot of cash on a screen (which is the main selling point here) that cannot be completely utilized or supported. Apple should have just gone with a much better-supported resolution (1080/1200/1440/1600) and called it a day. But no, that'd be too mainstream.
And I am still not convinced this thing will not overheat/throttle/be loud under load.
Yes, it's obvious that GPU processing power will increase regardless of the hardware it interacts with. What has happened is that, with the market settling on 1080p as the "standard" for so many years, increases in power tended not to go toward driving higher-resolution screens. Thankfully that wasn't entirely the case; with AMD's push for Eyefinity (and Nvidia's Surround support), we started to see products designed to run resolutions well above the norm.
I would argue that an increase in screen resolution is one of the best ways to improve image quality at this point. Lighting, physics, and geometry have progressed by leaps and bounds in the last decade; screen resolution is probably due for an overhaul.
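To put rough numbers on that, here's a back-of-the-envelope sketch (my own illustration; the assumption, which ignores geometry and CPU limits, is that raster/fill cost scales roughly linearly with pixel count):
[code]
# Rough per-frame pixel counts for the panels mentioned in this thread.
# Assumption (mine, not from the thread): fill cost scales roughly
# linearly with pixel count.
panels = {
    "720p (1280x720)":        (1280, 720),
    "1080p (1920x1080)":      (1920, 1080),
    "WUXGA (1920x1200)":      (1920, 1200),
    "WQXGA (2560x1600)":      (2560, 1600),
    "Retina MBP (2880x1800)": (2880, 1800),
}

baseline = 1920 * 1080  # 1080p as the reference point

for name, (w, h) in panels.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, "
          f"{pixels / baseline:.2f}x the pixels of 1080p")
[/code]
By that crude measure, the 2880 x 1800 panel is about 5.2 MP, roughly 2.5x the pixels of 1080p, so the GPU has to push 2.5x the fill work for the same frame rate. That's exactly why the GPU choice matters here.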
I will agree that Apple probably should have stuck a better GPU in this thing; however, it may not actually matter much. My guess is that Apple knows their market, and they know that people will not be buying a new MacBook to play Diablo 3 (I would be very interested to see how well Source engine games run, though).
As for the resolution they chose, it is a direct progression of their established products: 2880 x 1800 is exactly double the old 1440 x 900 panel in each dimension, so existing software can be pixel-doubled cleanly. It's a Mac, made to run in their ecosystem; any other resolution probably would have caused software compatibility issues.
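A minimal sketch of why that exact 2x factor matters (my illustration of the pixel-doubling idea, not Apple's actual API; the function name points_to_pixels is hypothetical): legacy apps lay out against a 1440 x 900 logical grid, and the OS maps each logical coordinate onto device pixels by a whole-number scale.
[code]
# Hypothetical illustration of HiDPI pixel-doubling (names are mine).
LOGICAL = (1440, 900)   # what legacy software sees
NATIVE  = (2880, 1800)  # actual panel
scale = NATIVE[0] / LOGICAL[0]  # exactly 2.0

def points_to_pixels(x_pt, y_pt, scale=scale):
    """Map a logical coordinate to device-pixel coordinates."""
    return (x_pt * scale, y_pt * scale)

# Every integer logical coordinate lands on an integer pixel, so old
# UI assets stay crisp; a non-integer scale (e.g. 1.5) would force
# fractional pixel positions and blurry resampling.
assert points_to_pixels(100, 50) == (200.0, 100.0)
[/code]
Any of the "more supported" resolutions people keep suggesting would break that clean 2x mapping, which is presumably why Apple didn't pick one.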