Quote:
Originally Posted by **SeanPoe**
The proof is many pages back; I assumed anyone this far into this thread would have (or should have) read the majority of the posts, so I didn't think I needed to specifically re-state something I'd already said in this thread.

Here it is (and I explain exactly how I got to that number), an excerpt:

-- snip --

I point out the difference between the two (28%), and I point out the watt difference (104W under load), so anyone can easily calculate the savings they'd see based on their usage. This also doesn't take into account the 670's dynamic boost overclock, which saves additional power that isn't being calculated here. So, worst case scenario given this data, the difference is 28%, but it might be even more.
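The kind of perf-per-watt comparison described above can be sketched in a few lines. The frame rates and wattages below are hypothetical placeholders, not the actual figures from the thread (only the 104W load delta and the 28% result are stated there), so the printed number won't match the 28% claim; the point is just the mechanics of the calculation.

```python
def perf_per_watt(fps, watts):
    """Frames per second delivered per watt of board power."""
    return fps / watts

# HYPOTHETICAL numbers: two cards at the same frame rate,
# one drawing 104 W more under load (the delta cited above).
card_a = perf_per_watt(fps=60.0, watts=170.0)
card_b = perf_per_watt(fps=60.0, watts=274.0)

# Percentage advantage of card A in performance per watt
advantage = (card_a / card_b - 1) * 100
print(f"{advantage:.0f}%")
```

With equal frame rates the advantage reduces to the ratio of the two power draws, which is why the assumed wattages dominate the result.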

Yes, okay, but ... doesn't it strike you as unlikely that your 28% calculation could be accurate, given it's so WILDLY different from TPU's analysis of the situation, when they used like 15 different games x 4 different resolutions?

Granted these are reference cards, but surely you're not suggesting that OC'ing alone could possibly account for a 14X LARGER difference in Perf per Watt (2% vs 28%)?

What do you think the odds are, statistically speaking, of you 'getting it right' in terms of the 'population' based on a sample size of 1 benchmark, vs the chances that TPU's figure is more correct ... when they used well over a dozen?
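The sample-size point above can be illustrated with a quick sketch. The per-game numbers below are entirely made up (they are not TPU's data): they just show how one benchmark can land far from the average over many titles.

```python
# Hypothetical per-game perf-per-watt advantage (%) of one card over
# another. Invented values for illustration only.
per_game_gap = [1, -3, 5, 28, 0, 2, -1, 4, 3, -2, 6, 1, 0, 2, -4]

# The one game someone happened to test vs the mean over all 15 titles
single_sample = per_game_gap[3]
average = sum(per_game_gap) / len(per_game_gap)

print(f"single benchmark: {single_sample}%")
print(f"average of {len(per_game_gap)} games: {average:.1f}%")
```

A single draw from a spread-out distribution tells you very little about the mean, which is exactly why a 15-game, multi-resolution average is the more trustworthy estimate.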

I hate to tell you ... but it's nigh on impossible.

Edited by brettjv - 5/14/12 at 4:29pm