Originally Posted by Cyro999
Kepler is way more efficient than gcn, no?
No, not really. Just looking at TPU's power charts, you could find evidence to support a claim that Kepler is more efficient, but you can also find the reverse. I've found that GCN puts up the more compelling numbers against GK104. GK110, on the other hand, is the best, but wouldn't you expect that? The 770 and 760 are NOT on a GK114 and are instead GK104, like their predecessors.
Both sides have products that stick out for being very power efficient and some that make a poor showing. The real question is whether the 290X is more power efficient than the Titan: Kepler 2 vs. GCN 2. And really, the 290X should be; it has been in the works a while.
So to summarize: no, Kepler is not WAY more efficient. It's a toss-up.
The 7950 has a better fps/watt ratio than the 760, and the same as the 660 and 660 Ti. I've always read that having a wider bus is worse for power efficiency. Will the 512-bit bus on the 290X cost it in the fps/watt readings? Maybe they can pull it off like they did with the 7950 and the non-GHz 7970.
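Since the whole argument hangs on fps/watt ratios, here is a minimal Python sketch of how such a ranking is computed. The card names and the fps/wattage figures are made-up placeholders, NOT measured review data:

```python
# Sketch of a perf-per-watt comparison. All fps and wattage figures
# below are hypothetical placeholders, NOT real benchmark results.
cards = {
    "Card A": {"avg_fps": 60.0, "board_power_w": 180.0},
    "Card B": {"avg_fps": 55.0, "board_power_w": 150.0},
}

# fps/watt: higher means more frames rendered per watt drawn.
ratios = {name: d["avg_fps"] / d["board_power_w"] for name, d in cards.items()}

# Rank cards from most to least efficient.
for name, r in sorted(ratios.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {r:.3f} fps/W")
```

Note that by this metric a slower card can still "win" on efficiency, which is exactly the 7870-vs-680 situation described below.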
Bus width was always the reason I assigned to the 480/580 being bad with power consumption and fps/watt, while the 460 was decent (but still worse than the 6850 and 6870). If you then look at the 7950 and see that it does pretty well despite the 384-bit bus, and that the 7870 and 7850 put up amazing fps/watt numbers using the same bus width as the GTX 770/680, you may be inclined to think that GCN is the more efficient architecture. However, I know this way of viewing power efficiency is oversimplified. The 680, in a sense, is equivalent to the 7870: they're both fully enabled midrange chips, just like the 6870 vs. the 560 Ti. As it stands, the 7870 has a slightly better fps/watt ratio but worse actual performance. However, the 680 was tuned up to compete against the 7970, while the 7870 was put into the middle of the pack and most likely purposefully tuned to use little power and not compete against the 7950.
This throws a wrench into the bus-width-to-fps/watt correlation hypothesis, because Nvidia could have dropped the clock speed a bit and undervolted GK104 until it had the better fps/watt ratio while still maintaining its pure performance lead. But does it really matter that this would technically make Kepler the more efficient architecture in the closest apples-to-apples comparison I could think of? Or should you only care about the efficiency of the actual products these companies sell as SKUs? The answer is really neither. While it can make a difference in how much you spend yearly, it isn't that much. And we all overclock them and ruin any efficiency anyway. Here is another chart:
Thanks for reading this big stupid post.