Originally Posted by STEvil
I see the 980 at almost 1600MHz barely matching the Tri-X at 1440p and 4K, at best. 1080p results are a wash; if one card shows higher numbers at 1440p/4K, that indicates a longer useful lifecycle.
Oh, and memory frequency wasn't listed for the 980, so it's either the default 7000MHz or close to 8000MHz.
If he didn't touch the memory, what's the point of a "max overclock vs max overclock" comparison if it's not actually a max overclock?
Barely matching? Did you even watch the whole video, or did you just see a few slides and jump to a conclusion? The Tri-X gets rocked at 1080p; at 1440p it's even (3 of the 6 games tested go to the 980); and at 4K, as stated previously, the numbers are entirely irrelevant other than showing that you need two cards in Crossfire (or SLI) for a playable experience. I mean really, in Far Cry 4 at 4K the Fury gets 33 FPS and the 980 gets 31. Still unplayable on both. That then brings a whole other argument and variable into the discussion: Crossfire vs SLI, which I already detailed in other posts.
Besides all of that, I'm not going to argue semantics over what defines a "max overclock," even though Jay does overclock his memory as well and has videos discussing it. Nevertheless, we both know that the core clock is the far more important figure to list.