Originally Posted by tpi2007
When Nvidia releases Kepler on a 28nm process that has since matured, AMD will have had the HD 7970 on the market for six months. AMD will also benefit from that same, more mature process by then and can release the 2304-core GPU that was spotted in a paper by Sapphire.
Originally Posted by naizarak
lol, AMD is playing everyone for fools with these 7970s. They didn't include a whole shader cluster, and they most likely downclocked these "Beastly OC" cards to stay within a specific performance margin: faster than a 580 but slower than the 6990. When Nvidia launches their cards, AMD will release an 8970 with the added shaders and higher clocks, the way it should have been originally. Everyone will say it's a natural evolution of the design, a six-month revision, etc., but in truth they already have it.
Sorry to burst your bubble, but you're both wrong.
AMD officially denied any hidden cores in the Tahiti die. Its full spec is 2048 cores, and the Sapphire paper contained false info.
Originally Posted by OwnedINC
60% is what you call a poor performance jump? Man you must really think Ivy is a flop then...
At least read the whole article before you try to post out-of-context snippets of it.
Are you serious? I mean, have you actually checked ANY reviews or benchmarks?
The 7970 is 30% faster than the HD 6970. Not even close to 60%.
Originally Posted by Arni90
That "review" is wrong on so many levels:
3-way CFX does not eliminate microstutter, not by a long shot. In my experience it's actually worse.
SLI is not smoother at 60 fps than CFX at 60 fps; it all depends on the game played. In my experience, it's just as horrible on either side.
Saying SLI has less microstutter than CFX based on framerate readings is the most idiotic assumption I've ever seen. When they don't even number the graphs, I don't know if I should even be reading the article.
SLI does have significantly less microstutter; it's a fact. But microstutter can't be measured or compared in an FPS reading; those are two totally different things, and that's why this Tom's Hardware review is wrong on every count: they are showing nothing but the actual FPS across a certain time span. They really believe a drop in FPS is microstuttering, but it's not.
Microstuttering is simply the variation in frametimes between the two (or more) GPUs, caused by inefficient communication between them.
For example, GPU 1 renders frame 1 within 12 milliseconds, while the next frame, rendered by GPU 2, takes 30 or more milliseconds, and so on. One frame will always be slower than the other, which is why there will be a small (hence "micro") stutter. It's basically the delta between successive frametimes that causes it. It has absolutely nothing to do with FPS drops or anything like that; it appears everywhere: at 60 fps, at 100 fps, yes, even at 1000 fps. There's no limit.
However, at high framerates the number of frames put out in a single second is so large that the stutter is virtually impossible to notice. The general rule of thumb is that a constant 60 fps is the magic threshold above which microstutter won't be visible anymore. But of course it all comes down to each person's subjective perception, as some people have stated they still notice microstutter above 60 fps.
A single GPU, by contrast, renders frames at a constant pace, so each frame might take, say, 30 ms, which makes it smooth as butter because there's no delta between the frametimes. That's why microstutter on a single GPU is NOT possible (some people believe they are encountering microstutter with their single GPUs, but that's simply FPS drops, a totally different thing).
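The point that an FPS counter can't reveal microstutter is easy to illustrate with a few lines of Python. The frame-time values below are invented for illustration (not measurements from any real card): both traces render 60 frames in roughly one second, so an FPS reading calls them identical, yet only the alternate-frame-rendering trace has a large frame-to-frame delta.

```python
# Hypothetical frame-time traces, in milliseconds per frame.
single_gpu = [16.7] * 60           # constant pacing, no delta between frames
afr_dual_gpu = [12.0, 21.4] * 30   # GPU 1 fast, GPU 2 slow, alternating

def avg_fps(frametimes_ms):
    """Average FPS over the trace: frames rendered / total seconds."""
    return len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)

def max_delta(frametimes_ms):
    """Largest jump between consecutive frame times: the 'micro' stutter."""
    return max(abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:]))

print(f"single GPU: {avg_fps(single_gpu):.0f} fps, worst delta {max_delta(single_gpu):.1f} ms")
print(f"AFR dual GPU: {avg_fps(afr_dual_gpu):.0f} fps, worst delta {max_delta(afr_dual_gpu):.1f} ms")
```

Both traces report about 60 fps, but the dual-GPU trace swings by roughly 9 ms between neighboring frames, which is exactly what the frametime graphs (and not an FPS plot) expose.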
How much the frametimes differ depends heavily on the game, but generally speaking SLI achieves MUCH smoother frametimes in most games. There are games where they perform equally, or where CF even shows less microstutter, but those are a minority.
Here are some examples, including actual frametime graphs with the Y-axis showing the time (ms) in which each frame is rendered: http://www.computerbase.de/artikel/grafikkarten/2011/test-radeon-hd-6900-cf-vs.-geforce-gtx-500-sli/20/
Originally Posted by Neroh
I don't care if you don't take me seriously. There's no microstutter with the 7970s at a detectable level, and most people who actually have the cards will say the same thing.
Whatever, you read an article and believe it to be gospel. That's fine, but don't preach about Crossfire without actually having personal experience with it, unless you want to be called out for being a moron by those who do have experience.
I'd believe someone with the cards over Tom's Hardware any day, lol.
Sorry, but facts and numbers outweigh a person's subjective feeling.
Microstutter with current SLI/CF technology IS there and is unavoidable. However, it is a totally subjective thing: some might notice it more than others, and some won't notice it at all. But saying there is no visible microstutter just because a single person, or even a group, is not able to notice it is beyond naive.
That Tom's Hardware guide gives a totally different impression of what microstutter actually is, though. They should delete that article; too many people have been misled by it.

Edited by toX0rz - 1/25/12 at 4:39am