Originally Posted by ChevChelios
4096 Vega wont beat your above average 980Ti by any considerable margin (if at all) either
Wait what? 4096 cores is 78% more than Polaris 10, or whatever the 480 is. And HBM on Vega means bandwidth will be no issue, so we can use the 480-performs-like-a-390X numbers rather than the 480-performs-like-a-290 numbers. Presumably Vega will be clocked about the same as Polaris, around 1100-1200MHz.
Now let's look at a 390X vs a 980 Ti, for example here. That's one of those factory-overclocked-to-a-ridiculous-degree cards, beating a stock 980 Ti by about 20-25%, and a basic 390X does at minimum 66% as well as it. 0.66 * 1.78 = 1.17, meaning Vega ought to do about 15-20% better than even the fastest 980 Tis, probably a bit less (though not significantly) because GPUs don't scale perfectly with core count.
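To make the back-of-the-envelope math explicit, here's a tiny sketch. The core counts and the 66% figure are the post's own numbers, and the linear-scaling assumption is exactly the simplification being argued:

```python
# Back-of-the-envelope Vega estimate, assuming performance scales
# linearly with core count at equal clocks (it won't, quite).
vega_cores = 4096
polaris10_cores = 2304            # RX 480 / Polaris 10
core_ratio = vega_cores / polaris10_cores      # ~1.78

r390x_vs_oc_980ti = 0.66          # a basic 390X does ~66% of a heavily OC'd 980 Ti

estimate = r390x_vs_oc_980ti * core_ratio
print(f"Estimated Vega vs OC'd 980 Ti: ~{estimate:.2f}x")   # ~1.17x
```

So even against a top factory-overclocked 980 Ti, the naive estimate lands around 1.17x; real scaling losses would shave a few points off that.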
TL;DR math says you're wrong and Vega ought to be pretty good.
Originally Posted by DarkIdeals
He specifically says that he COULDN'T get the 1080 low enough to match the 980 Ti's clock speed (further evidence that he doesn't know what he's doing, since you can simply use software capable of underclocking). So he just goes and CUTS THE POWER TARGET IN HALF!!
This does NOT "just" lower it to the same clock speed as the 980 Ti, it STARVES the card! With a 50% power target, the GTX 1080 only has a 90-watt TDP to work with, which will drastically interfere with how the memory and general operations perform. It's outright unfair to cripple the 1080's TDP to 90 watts and then claim it's a FAIR comparison to a 250-watt 980 Ti!
That's not necessarily true, and in fact was one of the reasons the R9 Nano can do as well as it does vs the Fury X (about 90%) while consuming a bit more than half the power. Tests done here.
I believe AMD said that officially as well, but I can't find the quote. The same idea might apply here. Lower frequencies need less voltage, and dynamic power scales roughly with frequency times the square of the voltage. If the average frequency is 90% of stock and the average voltage can be 20% lower, you're looking at under 60% of stock power consumption for 90% of the performance, a dramatic increase in energy efficiency.
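The 90%-performance-at-60%-power claim follows directly from the usual dynamic-power approximation. A minimal sketch, using the same illustrative numbers (the 20% voltage reduction is the assumption here, and static/leakage power is ignored):

```python
# Rough dynamic-power model: P is proportional to f * V^2.
# Ignores static/leakage power, so real savings would be somewhat smaller.
freq_scale = 0.90    # run at 90% of stock frequency
volt_scale = 0.80    # assume 20% lower voltage is stable at that frequency

power_scale = freq_scale * volt_scale ** 2
print(f"~{power_scale:.0%} of stock power at ~{freq_scale:.0%} performance")
# ~58% of stock power
```

That's roughly how a card like the Nano can trade ~10% performance for nearly half the power draw.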
That said, that only applies to the processor. I don't know whether memory follows the same trends (I assume it does), or whether its voltage and frequency are dynamic in a modern GPU, or at least dynamic enough to make a difference. This also assumes Nvidia's power targets are "smart" enough to lower the voltage; if they just cut the frequency, then there might be an issue.