Originally Posted by twich12
I'm an ATI fanboy (okay, not a fanboy, but I generally look at ATI's offerings before NVIDIA's) and I think it's ludicrous that NVIDIA would do this... we all know that Furmark stresses the GPU further than any real-world application, but as you can see, AVP pushed it pretty close to Furmark levels! Why not just have it auto-downclock when it reaches a certain temperature, no matter what? And just because ATI did it doesn't make it any less wrong.
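What twich12 is suggesting, downclocking on a temperature threshold no matter which application is running, is basically just a hysteresis loop on the thermal sensor. Here's a minimal sketch of the idea; the read_gpu_temp_celsius() and set_core_clock_mhz() calls are hypothetical stand-ins for whatever the driver actually exposes, and the clock/temperature numbers are only for illustration:

[code]
import time

TEMP_LIMIT_C = 97       # throttle above this temperature
TEMP_RESUME_C = 90      # restore clocks once cooled back down (hysteresis gap)
FULL_CLOCK_MHZ = 772    # GTX 580 reference core clock
SAFE_CLOCK_MHZ = 405    # made-up reduced clock for illustration

def read_gpu_temp_celsius() -> float:
    """Hypothetical sensor read; a real driver would poll the on-die diode."""
    raise NotImplementedError

def set_core_clock_mhz(mhz: int) -> None:
    """Hypothetical clock control; stands in for the driver's P-state switch."""
    raise NotImplementedError

def throttle_loop() -> None:
    throttled = False
    while True:
        temp = read_gpu_temp_celsius()
        if not throttled and temp >= TEMP_LIMIT_C:
            # downclock regardless of which app produced the heat
            set_core_clock_mhz(SAFE_CLOCK_MHZ)
            throttled = True
        elif throttled and temp <= TEMP_RESUME_C:
            set_core_clock_mhz(FULL_CLOCK_MHZ)  # back to full speed
            throttled = False
        time.sleep(0.1)  # poll at 10 Hz
[/code]

The point of the two thresholds is to avoid rapid clock flapping right at the limit; the card stays downclocked until it has actually cooled off.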
Apparently NVIDIA (just like ATI) caught on to reviewers "overstating" the power consumption and heat of their previous cards because Furmark is "too intensive," so this time they decided to mask the Furmark results (pointing out that Furmark doesn't accurately represent real-world behavior, which is what most people care about). Personally, I agree this was a pretty poor decision (from both companies). However, I will give NVIDIA credit for at least not hiding the fact that this limitation exists, and in fact pointing it out in their slides. Unfortunately, back when ATI did this, they said nothing about it (not that there was much cause for concern, since the workaround was stupidly easy: just rename the Furmark.exe file, which works for the reason sketched below).
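The rename workaround tells you how crude ATI's detection was: the driver was apparently doing nothing smarter than comparing the launching executable's filename against a blacklist. A rough sketch of that kind of check (this is guesswork at the logic, not the actual driver code, which certainly wasn't Python):

[code]
# Naive filename-based detection, the kind a one-character rename defeats.
BLACKLISTED_EXES = {"furmark.exe"}  # hypothetical blacklist contents

def should_throttle(process_name: str) -> bool:
    # case-insensitive match on the executable's filename only
    return process_name.lower() in BLACKLISTED_EXES

# should_throttle("Furmark.exe") -> True
# should_throttle("fuzzymark.exe") -> False: the exact same workload,
# renamed, sails through unthrottled
[/code]

Since the check keys on the name rather than on what the GPU is actually doing, renaming the binary runs the identical workload without tripping the cap.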