
Premium Member · 5,777 Posts · Discussion Starter · #1
As we all have learned at this point, the GTX 580 has both hardware and software (driver) influenced power monitoring features. When it detects high current draw, it will purposely cut the clocks by 50% to decrease consumption. This is not an issue for gamers, but it has been an absolute headache for reviewers, as most still could not unlock the full power of the card with applications like FurMark and OCCT.



Well now we have the truth:

I have just discovered a review from Fudzilla (surprise, surprise) that uncovers the true maximum temperatures and peak power consumption of the GTX 580 using an older version of FurMark.

See here:

Quote:
Nvidia uses a new technology dubbed Advanced Power Management on the GTX 580. It is used for monitoring power consumption and performing power capping in order to protect the card from excessive power draw.

Dedicated hardware circuitry on the GTX 580 graphics card performs real-time monitoring of current and voltage. The graphics driver monitors the power levels and will dynamically adjust performance in certain stress applications such as FurMark and OCCT if power levels exceed the card's spec.

Power monitoring adjusts performance only if power specs are exceeded and if the application is one of the apps Nvidia has defined in their driver to monitor, such as FurMark and OCCT. This should not significantly affect gaming performance, and Nvidia indeed claims that no game so far has managed to wake this mechanism from its slumber. For now, it is not possible to turn off power capping.

GTX 580's power caps are set close to the PCI Express spec for each 12V rail (6-pin, 8-pin, and PCI Express). Once power capping goes active, chip clocks go down by 50%. It seems that this is the reason why the GTX 580 scores pretty badly compared to the GTX 480 in FurMark.
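To make the mechanism the quote describes concrete, here is a rough sketch of the logic in Python. This is purely illustrative, not NVIDIA's actual driver code; the app list, the polling model, and the use of the card's 244W board power figure as the threshold are my own assumptions.

Code:

# Illustrative sketch only -- not NVIDIA's actual driver code. The app list,
# threshold, and clock behaviour are assumptions based on the quoted description.
MONITORED_APPS = {"furmark.exe", "occt.exe"}  # apps the driver is said to watch
POWER_CAP_WATTS = 244                         # GTX 580's rated board power

def capped_clock(app_name: str, measured_watts: float, core_clock_mhz: int) -> int:
    """Return the core clock the card would run at under the capping scheme."""
    over_spec = measured_watts > POWER_CAP_WATTS
    if over_spec and app_name.lower() in MONITORED_APPS:
        return core_clock_mhz // 2  # quoted behaviour: clocks drop by 50%
    return core_clock_mhz           # games and in-spec loads run untouched

print(capped_clock("FurMark.exe", 320.0, 772))  # 386 -> throttled
print(capped_clock("Crysis.exe", 320.0, 772))   # 772 -> not on the monitored list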


The Test

Quote:

FurMark temperatures didn’t go over 76°C, which isn’t very realistic; in gaming tests we measured up to 85°C, which seems to suggest that Nvidia overdid the preventive measures.



GPU-Z 0.4.7 doesn’t show downclocking during FurMark, but the new version is set to change that. Below you see the GPU temperature graph we captured during Aliens vs. Predator tests.



After overclocking, temperatures were at the same level as before. The fan was a bit louder, but still not too loud.


The Truth Unlocked (GTX580 is Unleashed)

Quote:

The older FurMark, version 1.6, shows that the GPU can hit 90°C.



Consumption is on par with the GTX 480. You’ll find older FurMark test results below, because in new FurMark tests our rig didn’t consume more than 367W. During gaming we measured rig consumption of about 450W.


Not as bad as we thought. Although it logically makes sense that consumption is slightly higher (more cores, higher clocks), it's not even close to what the GF100 512SP test sample drew. And heat-wise, it is definitely an improvement over the GTX 480 either way, especially since the cooler runs quieter and doesn't need as much fan speed to reach that temperature.

I hope this information is useful to you guys. In case you want to experiment on your own, I have provided links to both the older FurMark 1.6.0 and the latest 1.8.2 at the bottom. Enjoy.


Furmark 1.6.0 Download
Furmark 1.8.2 Download

Source
The Anti-Furmark Discovery (More Info)
 

Fortnite Fanatic · 2,906 Posts
Maybe this is an attempt to eliminate Furmark? I doubt AMD will follow suit.

Huge difference in temps reported!
 

Premium Member · 14,039 Posts
Quote:

Originally Posted by Projectil3 View Post
Seems odd to me; why would they purposely bunk a high-end card? The results suffered badly.
Because nobody cares about the FurMark BENCH scores; they aren't looked at by anyone. Reviewers DO, however, use FurMark for comparatively benching temps and power consumption. And everyone knows that the older Fermis run real hot and suck the juice, and that (some) people don't like 'em for this reason.

Like I've said before, NV is basically cheating on Furmark so that this card looks like it's running much cooler and using less power in the review charts vs the 4xx series (and looks better vs. the competition).

Sure, they may have their 'plausible reasons' apart from that, but I don't really buy it. I think this technology is about marketing, pure and simple. Not that ATI wouldn't/hasn't done the same kind of thing themselves.


In fact, these cards can suck up more power than 480s, which means they're generating more heat. They have to be. Now, there are some efficiency improvements, as they don't suck up as much juice as a 512SP 480 would, but they're not nearly as impressive as they appear to be when their 'FurMark cheating mechanism' is allowed to operate.
 

Fortnite Fanatic · 2,906 Posts
Quote:

Originally Posted by brettjv View Post
In fact, these cards can suck up more power than 480s, which means they're generating more heat. They have to be.
I did not realise that until this thread. I just assumed they were throttling FurMark to look better compared to the 6970. I didn't realize they were really 'sucking up the power' 480-style. Hence the effectiveness of the FurMark throttling camouflage.
 

Premium Member · 14,039 Posts
Quote:

Originally Posted by Draygonn View Post
I did not realise that until this thread. I just assumed they were throttling FurMark to look better compared to the 6970. I didn't realize they were really 'sucking up the power' 480-style. Hence the effectiveness of the FurMark throttling camouflage.
Yeah. Well, the fact that it's drawing about the same as the 480 despite 10% higher clocks and 6% more shaders is still reasonably impressive. They DID make efficiency improvements, just not as much as it looked at first, which is entirely what I suspected. You don't make bigger improvements to efficiency than this without much more significant changes to the chip, i.e., a major redesign.
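Those two percentages check out against the commonly cited reference specs (772 MHz and 512 shaders for the 580 versus 700 MHz and 480 shaders for the 480); a quick sanity check in Python:

Code:

# Stock reference specs: GTX 580 = 772 MHz core / 512 shaders, GTX 480 = 700 MHz / 480 shaders.
clock_580, clock_480 = 772, 700
shaders_580, shaders_480 = 512, 480

print(f"clock uplift:  {clock_580 / clock_480 - 1:.1%}")      # ~10.3%
print(f"shader uplift: {shaders_580 / shaders_480 - 1:.1%}")  # ~6.7%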

The lower temps are for the most part due to a better heatsink design, IMHO. It'll be interesting to see the results when someone slaps the 580 cooler on a 480.
 

AMD Overclocker · 9,178 Posts
Yet another fail by nvidia. Why am I not surprised?
 

Registered · 1,489 Posts
Doesn't matter.

If you care about (real-world) power draw and heat generation for testing stable clocks, run Crysis, Metro, Vantage, 3DMark 11 when it comes out, etc.

Furmark is as unrealistic for testing "Real" load conditions on a GPU as Linpack is for the CPU.
 

Registered · 2,560 Posts
Quote:

Originally Posted by brettjv View Post
The lower temps are for the most part due to a better heatsink design, IMHO. It'll be interesting to see the results when someone slaps the 580 cooler on a 480.

Or vice versa??

Also, some of the 480s didn't get this hot.

My 480 with stock cooling and the fan on auto only ever reached a max temp of 82°C.

Yes, it jumped up a lot when I went to OC...
 

AMD Overclocker · 9,178 Posts
Quote:

Originally Posted by Trigunflame View Post
Doesn't matter.

If you care about (real-world) power draw and heat generation for testing stable clocks, run Crysis, Metro, Vantage, 3DMark 11 when it comes out, etc.

Furmark is as unrealistic for testing "Real" load conditions on a GPU as Linpack is for the CPU.
There's no way that is a defensive fanboy response.

No way.

Come on, if I was an nvidia fan I would be disappointed in them for doing this.

Nvidia, you are a schmuck.
 

Registered · 96 Posts
Personally, I don't care about the power consumption, only the end results... if they built a card that drew 600 watts and blew away the competition, I'd buy it... I believe this is the same for any real enthusiast... power draw is not the concern, functionality is.
 

Registered · 310 Posts
I think it's a little foolish to get all bent out of shape at nvidia & each other for something so trivial. Don't you guys have anything better to do?
 

Registered · 1,066 Posts
That's pretty sad, that they have to resort to something like that to kind of hide the real results...
 

Premium Member · 5,777 Posts · Discussion Starter · #14
Quote:

Originally Posted by Draygonn View Post
Maybe this is an attempt to eliminate Furmark? I doubt AMD will follow suit.

Huge difference in temps reported!

ATI actually did this two years ago using Catalyst.

Source
 

Registered · 2,626 Posts
I'm an ATI fanboy (okay, not a fanboy, but I generally look at ATI's offerings before Nvidia's) and I think it's ludicrous that Nvidia would do this... we all know that FurMark stresses the GPU further than any real-world application, but as you can see, AVP pushed it pretty close to the FurMark levels! Why not just have it auto-downclock when it reaches a certain temperature no matter what? And just because ATI did it doesn't make it any less wrong.
 

Premium Member · 5,777 Posts · Discussion Starter · #17
Quote:

Originally Posted by twich12 View Post
I'm an ATI fanboy (okay, not a fanboy, but I generally look at ATI's offerings before Nvidia's) and I think it's ludicrous that Nvidia would do this... we all know that FurMark stresses the GPU further than any real-world application, but as you can see, AVP pushed it pretty close to the FurMark levels! Why not just have it auto-downclock when it reaches a certain temperature no matter what? And just because ATI did it doesn't make it any less wrong.
Apparently NVIDIA (just like ATI) caught on to reviewers "overstating" the power consumption and heat their previous cards produced due to FurMark being "too intensive," and decided to mask the FurMark results this time (pointing out that FurMark doesn't accurately represent real-world results, which is what most people care about). Personally, I agree this was a pretty poor decision (from both companies). However, I will give NVIDIA credit for at least not hiding the fact that this limitation exists and in fact pointing it out in their slides. Unfortunately, back when ATI did this, they really said nothing about it (not that there was too much concern there, though, because the workaround was just stupid easy: simply renaming the Furmark.exe filename).
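For anyone wondering why a simple rename was enough: the old check apparently keyed off nothing more than the executable's filename. Here is a rough Python illustration of that kind of logic (hypothetical, not actual Catalyst or NVIDIA driver code):

Code:

# Hypothetical illustration of filename-based throttling -- not actual driver code.
THROTTLED_NAMES = {"furmark.exe"}

def gets_throttled(process_name: str) -> bool:
    """True if the driver's FurMark 'profile' would kick in for this process."""
    return process_name.lower() in THROTTLED_NAMES

print(gets_throttled("Furmark.exe"))   # True  -> throttled profile applies
print(gets_throttled("Furmark2.exe"))  # False -> the old rename workaround
print(gets_throttled("etqw.exe"))      # False -> but rename a game to FurMark.exe
                                       #          and it gets throttled instead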
 

PC Evangelist · 47,665 Posts
The only good point the GTX 580 had over the GTX 480 (lower power consumption, less heat) is a fake, so from being not very impressive at the start, now it's even less impressive.
 

Premium Member · 6,144 Posts
Quote:

Originally Posted by Open1Your1Eyes0 View Post
ATI actually did this two years ago using Catalyst.

Source

It probably doesn't have a CrossFire profile.
 

Premium Member · 5,777 Posts · Discussion Starter · #20
Quote:

Originally Posted by Kand View Post
It probably doesn't have a CrossFire profile.

That would be true; however, that test was done with a single card.

Also here is another site trying a different test:

Quote:

Expreview did the test with Quake Wars: Enemy Territory, renamed etqw.exe to FurMark.exe, and saw performance drop from 141.3 FPS to 93.7 FPS!



In fact, just the opposite of a normal driver "optimization" happened:

Quote:

In our opinion, when GPU makers "optimize" their drivers, they are just trying to make games or benchmarks run better. But now it seems the ATI driver team has added a "profile" for FurMark that makes it run slower so it doesn't burn the GPU anymore.

Source 1
Source 2
 