Yeah, I don't get that attitude. I see people say all too often that GPUs are too slow for 4k. Now make no mistake, you'll need a good GPU for 4k - but at the same time, you'd have to be pretty stupid to sit there at 20 fps with ultra settings and complain. If you lower a setting or two a tad, I promise the image quality hit will be basically invisible, while your framerate can climb to 60 or 80 fairly easily, generally speaking.
I play at 1600p and I've played a ton of games at ultra vs high. The framerate difference is usually insane (20-40 fps) while the image quality is no different. Case in point: Crysis 3. I've compared VH vs H so many times and there is no image quality difference. Really. Seriously. Unless my 20/20 fails me. I'm sure 4k is doable with tweaking - I really don't think you need quad SLI for 4k resolution, a claim a lot of people tend to make. I had the same concerns at 1600p, and 1600p is fine with a single high end GPU. No, you can't run everything at ultra. Drop to FXAA or 2X MSAA (8X MSAA is the stupidest, silliest setting you can use, BTW) and lower 1 or 2 settings. Boom. Done. Most games can gain a ton of framerate this way without a noticeable loss of image quality.
Yet people will still blindly set ultra across the board and whine when their framerate is too low. It's whatever I guess. Even at 1080p there are games that can't sustain 60 fps at ultra settings, so seeing people complain about this sort of thing is mind boggling. It's like they're incapable of using their common sense to lower 2-3 settings. And like I said, it really doesn't hurt image quality. Also, 8X MSAA is the silliest setting anyone can use: it eats a ridiculous amount of VRAM for no good reason, and given the insane performance hit, it's never worth using over 2X MSAA or FXAA.
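To put rough numbers on the VRAM point, here's a quick back-of-envelope sketch. It assumes a simplified model (4 bytes of color plus 4 bytes of depth stored per sample, no driver-side compression - real GPUs compress MSAA buffers, so actual usage is lower), but it shows why the cost scales so badly with sample count:

```python
# Rough MSAA framebuffer size estimate at a given resolution.
# Simplified model: RGBA8 color (4 bytes) + depth/stencil (4 bytes)
# stored per sample. Real drivers compress, so treat this as an upper bound.

def msaa_buffer_mib(width, height, samples, bytes_per_sample=8):
    """Return the uncompressed MSAA color+depth buffer size in MiB."""
    return width * height * samples * bytes_per_sample / (1024 ** 2)

for s in (1, 2, 4, 8):
    print(f"{s}x at 4K (3840x2160): {msaa_buffer_mib(3840, 2160, s):.0f} MiB")
# 8x ends up around half a gig for the framebuffer alone, vs ~127 MiB at 2x -
# 4x the memory for a difference most people can't see at high pixel density.
```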
Edited by xoleras - 4/1/14 at 2:31pm