Originally Posted by PureBlackFire
TDP is not a strict measure of power consumption; it's more related to heat (though the two are connected). AMD has for the past three generations labelled its flagship GPU with a 250 W+ TDP, but the fact is the 6970 doesn't consume anywhere close to that. Its power consumption is only a few watts more than Nvidia's GTX 560 Ti, with both well under 200 watts. Nvidia, in the past (before Kepler/Maxwell), was very generous, to say the least, with how it labelled its cards' TDPs, with the higher-end models' actual gaming power consumption exceeding the rated TDP. I said "pretty much" because, in typical fashion, things progress (we can hope) and this is no longer true, but we cannot ignore recent history altogether. The 7970, also with its 250 W TDP, consumes under 200 watts, only around 15 watts more than the GTX 680. Now, the 7970 GHz Edition is another story: a small overclock bumps heat and power consumption by a relatively fat margin compared to the increase in clocks and performance, in my opinion. This is where AMD is stuck right now in the GPU market (GCN): they have a pretty big efficiency problem that, as of now, is just getting worse. The point was that TDP =/= power consumption, and AMD very often estimates high while Nvidia more often estimates low. As MadRabbit pointed out, if you look at the power consumption vs. TDP of the GTX 970, you get an idea. The HD 7950 has a TDP of 200 watts, the HD 7870 was 175, 130 for the 7850, etc. The 970's stock power consumption is around the same as the 7950's, but it has a TDP lower than both the 7870 and the 7950. The 7850 consumes around 86 watts at load with a 130 W TDP. Even with Maxwell and Kepler, it is rare to see that kind of power consumption vs. rated TDP gap on Nvidia GPUs.
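To put a number on the "TDP =/= power consumption" point, here's a quick sketch using the one pair of figures the post actually gives (HD 7850: 130 W rated TDP, roughly 86 W gaming load draw). The exact draw figure varies by review, so treat it as illustrative.

```python
# Compare rated TDP to reported load draw for the HD 7850,
# using the watt figures quoted in the post above.
rated_tdp = 130.0   # watts, HD 7850 rated TDP (from the post)
load_draw = 86.0    # watts, approximate gaming load (from the post)

headroom = rated_tdp - load_draw        # watts of unused "budget"
ratio = load_draw / rated_tdp           # fraction of TDP actually drawn

print(f"draw/TDP ratio: {ratio:.2f}, headroom: {headroom:.0f} W")
```

So the card draws only about two-thirds of its rated TDP, which is exactly the kind of gap being described.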
OK... I think (if I have it straight as you explained it). Right after I posted, I did consider the difference between power draw and heat dissipated, both of which are measured in watts. Now, since the heat dissipated is "wasted energy", then, as you said yourself, if AMD is having an efficiency problem, wouldn't it be entirely reasonable for their TDPs to be higher than Nvidia's?
Originally Posted by Imouto
What about reading the testing methodology?
Originally Posted by TPU
We measure the power consumption of only the graphics card, via the PCI-Express power connector(s) and the PCI-Express bus slot. A Keithley Integra 2700 digital multimeter with 6.5-digit resolution is used for all measurements. Again, the values here reflect only the card's power consumption as measured at its DC inputs, not that of the whole system.
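In other words, card-only power is the sum of what comes in over each DC input: the slot's rails plus any external PCIe power connectors. A minimal sketch of that arithmetic, with made-up voltage/current readings (the rail names and values here are hypothetical examples, not TPU's actual data):

```python
# Card-only power = sum of (volts * amps) across all DC inputs,
# i.e. the PCIe slot rails plus external PCIe power connectors.
# All readings below are invented for illustration.
readings = [
    ("PCIe slot 12V",   12.02, 3.1),   # (rail, volts, amps)
    ("PCIe slot 3.3V",   3.31, 0.4),
    ("8-pin PEG 12V",   12.05, 7.5),
]

card_power = sum(volts * amps for _, volts, amps in readings)
print(f"card-only power: {card_power:.1f} W")
```

This is why card-only measurement is more precise than wall-socket measurement: it excludes the CPU, drives, fans, and PSU conversion losses entirely.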
Is there something wrong with isolating the power draw of just the card? That would be more accurate than measuring the whole system, no?