Originally Posted by 47 Knucklehead
You want a clear and concise statement about all these cards?
GTX 780, GTX 780 Ti, Titan, R9 290X ... it doesn't really matter. Any of them will play any game out there at maximum or near-maximum settings at 55+ FPS at just about any realistic resolution (i.e., 2560x1440 and below). The whole 4K monitor thing is a hoax. Panels cost $3,500 now and will stay over $1,000 for some time to come ... odds are, longer than the useful life of any one of these cards. If you add a second card in SLI/CrossFireX, the pair will still smoke any game coming out in the next 2-3 years, 4K included.
So get whichever brand you like (NVIDIA or AMD) and whatever you're comfortable opening your wallet for. They're all within about 5 FPS of each other. All the other BS on this thread, and on OCN in general, is just a giant ep33n-waving contest over the last 1-2% of speed.
Let me guess, you've never heard of 120 Hz/144 Hz gamers, right? Yeah, for your typical 60 Hz gamer the difference between top-tier graphics cards is negligible and 50+ FPS is "smooth", but for me (and others) anything less than 80 FPS is not good enough, and preferably 100+ FPS.
That's where the differences between graphics cards really start to matter; squeezing out the extra frames is then just a case of mixing med/high/ultra settings (i.e., tweaking).
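To put the 60 Hz vs 144 Hz argument in concrete numbers, here's a quick frame-time calculation (plain arithmetic, not from either post above): a game only feels locked to a refresh rate if every frame finishes inside that rate's time budget, which is just the reciprocal of the rate.

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available to render one frame at a given refresh rate."""
    return 1000.0 / refresh_hz

# Time budget per frame at common refresh rates:
for hz in (60, 120, 144):
    print(f"{hz:>3} Hz -> {frame_budget_ms(hz):.1f} ms per frame")
# 60 Hz leaves 16.7 ms per frame; 144 Hz leaves only 6.9 ms,
# which is why the last 10-20% of GPU speed matters to high-refresh gamers.
```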