Originally Posted by fleetfeather
Single 780 Ti pushes 1440p at 60-70fps. Suddenly a single 880 will push 4K at 60fps.
Dat 150% generational performance improvement
Well, I'm assuming most people on OCN will want to overclock their cards at least somewhat (or just buy a pre-OC'ed card), which can result in a pretty significant performance boost. Factor in a nominal ~20% generational improvement, and the icing on the cake is the bump to 4GB of VRAM on a 512-bit memory interface at ~7-8GHz effective, which should be more than enough to push the pixels required for 4k. Of course games will get harder to run at ultra and whatnot, but it shouldn't be impossible by any stretch of the imagination.
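If anyone wants to sanity-check that memory-interface claim, here's a rough sketch of the peak-bandwidth math. The 880's actual specs are pure speculation at this point, so treat the 512-bit / 7-8GHz figures as hypothetical:

[code]
# Rough peak-bandwidth math for a hypothetical 512-bit card.
# The 7-8 GHz effective GDDR5 rate is speculation, not a confirmed spec.

def peak_bandwidth_gbs(bus_width_bits, effective_ghz):
    # bits/s = bus width x effective transfer rate; divide by 8 for bytes
    return bus_width_bits * effective_ghz / 8

for ghz in (7.0, 8.0):
    print("512-bit @ %.0f GHz effective -> %.0f GB/s"
          % (ghz, peak_bandwidth_gbs(512, ghz)))

# 512-bit @ 7 GHz effective -> 448 GB/s
# 512-bit @ 8 GHz effective -> 512 GB/s
# For reference, a 780 Ti (384-bit @ 7 GHz effective) sits at 336 GB/s.
[/code]

Even the conservative 7GHz case is a big step over the 780 Ti's 336 GB/s, which is the kind of headroom 4k actually needs.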
I'm willing to bet that even if you didn't want to move to the next series of cards, you could get away with 4k via SLI/xfire, and these benchmarks more or less support that claim (at stock) running some recent games at ultra settings, although I think at this point these cards are limited by their memory even in SLI/xfire.
Originally Posted by Nixuz
Lots of people seem to know exactly what an 800 series card will run at acceptable fps...
Never knew there were so many industry insiders.
And "most films" are NOT filmed at 4k 48fps.
That is a flat out lie. Even Bryan Singer isn't filming Days of Future Past in HFR.
The ONLY movies being offered at even 1080p 48fps are the Hobbit movies, with the Avatar sequels to follow.
And there are still a huge number of people who don't even like 48fps for movies, which is SUBJECTIVE.
And 4k content is another matter entirely, on PC or not.
You can run games at that res, but the textures and fill rate for 4k are going to be a generation or two away.
A 780 Ti gets 90ish fps in Crysis 3 @1080p.
Bump the res up to 4x that and see what happens...
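Do the napkin math yourself, assuming fps scales inversely with pixel count. It never quite does, since not everything is pixel-bound, but it's a fair worst-case sketch:

[code]
# Naive fps estimate when scaling resolution, assuming performance is
# purely pixel-bound (a worst case; real scaling is usually a bit better).

def scaled_fps(base_fps, base_res, target_res):
    base_px = base_res[0] * base_res[1]
    target_px = target_res[0] * target_res[1]
    return base_fps * base_px / target_px

px_ratio = (3840 * 2160) / (1920 * 1080)
print("4k has %.0fx the pixels of 1080p" % px_ratio)      # 4x
print("90 fps @ 1080p -> ~%.1f fps @ 4k"
      % scaled_fps(90, (1920, 1080), (3840, 2160)))        # ~22.5 fps
[/code]

Even if real-world scaling is kinder than that, you're nowhere near 60.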
And even if it were to somehow happen, how much do you think a card with 12-16GB GDDR5 plus new arch is going to cost?
$1000 per card on the LOW end.
And thus, I claim a very low adoption for the next 3-5 years for any attempt at 4K.
And even after that, diminishing returns. Nobody needs retina displays just yet. At least not for movies and games.
My girlfriend works at the movies and they are in fact 4k. I'm waiting on her to reply to my text to verify.
Like I said, I don't expect 4k performance out of the box until we're a generation or two down the road; right now, though, we can come close, especially if you overclock and/or do the whole SLI/xfire bit.
As far as 'nobody needs retina displays' goes, that's purely subjective. If I want a 4k display, and I'm willing to pay for it (to an extent), then sure, I'll get it. Whether or not it floats your boat doesn't matter to anyone else, just like me wanting one doesn't matter to you. Having the tech industry shift to 4k can't hurt anyone, since it pushes companies toward more power, which helps all of us. For instance, if AMD and nVidia don't want to compete against each other and lower prices, then their hardware should at least earn its price tag by being able to play games at 4k/ultra or whatever.
Plus there are a whole bunch of reasons to get a 4k display that aren't just about picture quality. Sometimes it's just plain helpful to have that many pixels on one monitor, especially if you need to see a lot of information at once.
Originally Posted by Nixuz
Originally Posted by anticommon
As far as 4k goes, there are companies already offering ultra high resolution content to consumers (i.e. YouTube, and soon Netflix). Also, films are already being produced at resolutions over 1080p, and that's exactly what gets sent to the theatres for their releases. If I'm not mistaken, most of them are already filmed at 4k/48fps. Burning that to Ultra HD Blu-ray should be little to no issue.
Where is this "UltraBluRay" you speak of?
And if not, who the heck has a pipe big enough for 4k streaming outside of KC with Google Fiber?
Messed up, I don't know how to quote properly.
Netflix has said (if I remember correctly) that a proper 4k stream should take up about 12-15 Mbps of connection. That's a far cry from the 1000 Mbps connection from Google Fiber, and I think many broadband connections will be able to cope. As far as ultra Blu-ray goes, well, I don't know what it might be called, but I do know that they are pretty consistently advancing the tech in Blu-ray discs to hold more and more data, and last I heard they could fit around 100GB onto one Blu-ray disc, which should be more than enough for a 4k film.
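For anyone who wants to check my math, the rough arithmetic works out like this (the 15 Mbps bitrate is from memory and the two-hour runtime is an assumption on my part, so take both as ballpark):

[code]
# Sanity check on the streaming and disc numbers above.
# 15 Mbps and a 2-hour runtime are ballpark assumptions, not official specs.

MBPS = 15                # claimed 4k stream bitrate, megabits per second
RUNTIME_S = 2 * 3600     # a two-hour film, in seconds

stream_gb = MBPS * RUNTIME_S / 8.0 / 1000.0   # Mb -> MB -> GB (decimal units)
print("2h stream @ %d Mbps = %.1f GB" % (MBPS, stream_gb))      # ~13.5 GB

disc_gb = 100
disc_mbps = disc_gb * 1000 * 8 / float(RUNTIME_S)
print("A %dGB disc over 2h allows ~%.0f Mbps" % (disc_gb, disc_mbps))  # ~111 Mbps
[/code]

So a 100GB disc has room for a two-hour 4k film at several times the streaming bitrate, which is why I'm not worried about the disc side of things.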
Originally Posted by BeyondCryptic
Most major films are filmed in 4k. 48fps? Probably not.
If they were not filming in 4k, they would slowly watch their footage rot when pixel density goes up in the future.
Back in the film days, it didn't matter what resolution it was shot at, because it was film. With digital, you have to care about it.
Now, in the early 2000s, when digital cameras were starting to get pushed, they filmed at 1080p. What was the resolution on TVs back then again? 480p at most, 720p if you were rich.
There really isn't any reason not to film in 4k. Why not get the best picture quality you can out of your cameras and base footage, and then have some leeway to re-release films at higher resolutions when they become more mainstream?