
Truth on GTX580 Furmark Temps and Consumption - Page 3

post #21 of 80
So.

Does this phenomenon still occur with later drivers?
post #22 of 80
Thread Starter 
Quote:
Originally Posted by Kand View Post
So.

Does this phenomenon still occur with later drivers?
Unfortunately, I don't happen to have an ATI card around to test this but maybe someone else can do a quick run for us?
post #23 of 80
Good thread. You can never trust what the media tells you.
post #24 of 80
Basically they don't want people to know that it's a power-hungry card, so they disabled Furmark in software.
SBD (18 items)
CPU: i7-3770k | Motherboard: Gigabyte GA-Z77X-UP4 TB | Graphics: EVGA GTX 980 SC | RAM: 32GB G.SKILL Trident X F3-1600C7Q-32GTX
Hard Drive: Samsung 840 Pro 256GB SSD | Hard Drive: Western Digital 2TB RE4 | Optical Drive: Plextor 24x DL Burner | Cooling: Thermaltake Water 2.0 Extreme
OS: Windows 7 Pro 64-bit | Monitor: Acer XB270HU 2560x1440, IPS-type panel, 144Hz, ... | Monitor: EIZO FG2421 1920x1080 VA 120Hz | Keyboard: QuickFire Cherry Blue
Power: Lepa G1000 1kW | Case: Corsair 550D | Mouse: Logitech G400 | Mouse Pad: Roccat Taito
Audio: Xonar Essence ST | Audio: Niles SI-275 Amplifier
post #25 of 80
Quote:
Originally Posted by Mygaffer View Post
Basically they don't want people to know that it's a power-hungry card, so they [neutered] Furmark in software.
"You only get one chance to make a first impression."

Obviously Furmark temps and Vantage scores are not examples of real-world gaming performance and should not be used solely to decide your next card. On other forums I have heard many times how the 580 runs 15-20 degrees cooler and uses less power than a 470, and has 'unlocked' power shown in its Vantage score.

To see that ATI has done Furmark trickery as well shows you how far these companies will stoop to boost perception as well as performance. In a world that increasingly values the artificial over the real, maybe we should not be surprised.
Edited by Draygonn - 11/13/10 at 3:12pm
VR machine (18 items)
CPU: i7 4790k | Motherboard: ASUS Z97I-Plus mITX | Graphics: ASUS GTX 1080 Ti Turbo 8G | RAM: EVGA 1600 C9
Hard Drive: Samsung Spinpoint 1TB | Cooling: Corsair H60 | OS: Windows 8.1 64-bit | Monitor: Sony HW45ES Projector w/ 120" Acoustically Trans...
Monitor: VR - HTC VIVE | Keyboard: Razer Turret | Power: EVGA Supernova 550 G2 | Case: Corsair 250D
Audio: Yamaha RX-V781 | Audio: Emotiva A-700 and Airmotiv Speakers | Audio: Dual SVS SB1000 Subs
post #26 of 80
Quote:
Originally Posted by Open1Your1Eyes0 View Post
Unfortunately, I don't happen to have an ATI card around to test this but maybe someone else can do a quick run for us?
For all we know.
post #27 of 80
I had this figured out a long time ago, brahs.
But the cooler is a bit better than the 480's and is much quieter.
The Liberator (13 items)
CPU: Core i5-750 @ 4GHz (191x21) w/ 1.40V, IMC @ 1.3V 24/7 | Motherboard: ASUS P7P55D-E Pro SATA3 USB3.0, 1002 BIOS | Graphics: EVGA GTX 580 @ 900/1800/2100 with 1.1V | RAM: 4GB G.Skill Trident DDR3 2000 @ 1910 8-8-8-21 1T
Hard Drive: 2x Samsung 1TB F3 in 2 RAID0 arrays | Optical Drive: LG 10X Blu-ray Drive | OS: Windows 7 Ultimate 64 (Fully Tweaked) | Monitor: Samsung T260 26" LCD 1920x1200 5ms
Keyboard: I-ROCKS KR-6820E-BK Back-lit Gaming Keyboard | Power: Corsair AX850 Gold | Case: CoolerMaster ATCS 840 Black + Silverstone CFP51 bay | Mouse: Logitech G5 original
post #29 of 80
Quote:
Originally Posted by Trigunflame View Post
Doesn't matter.

If you care about real-world power draw and heat generation for testing stable clocks, run Crysis, Metro, Vantage, 3DMark 11 when it comes out, etc.

Furmark is as unrealistic for testing "real" load conditions on a GPU as Linpack is for the CPU.
I agree there's no big issue with Furmark; we should've been using real-world situations such as gaming anyway.

But what I'm still trying to understand is whether this is software-based (such as what AMD/ATI has done, where the card downclocks itself to look better on temps). That is, does it only happen when the driver detects OCCT/Furmark, or is this a feature where if ANYTHING (Crysis 2, for example) draws too much power from the 580, it will automatically downclock?

But as it stands now, NVIDIA claims no games are affected as of yet. I was expecting something like Metro to draw big power, since its FPS seems to be the lowest, making it the most demanding game... what can we do to ensure that the 580 is not being auto-gimped?

I am very concerned since I am seriously contemplating a 580 (or a 6970; I'll wait until the 6970 comes out) to replace my current setup, and now we cannot be guaranteed that the GPU is running at its full capacity when under heavy stress (i.e., gaming).

Quote:
Originally Posted by Caleo View Post
I think it's a little foolish to get all bent out of shape at nvidia & each other for something so trivial. Don't you guys have anything better to do?
This should be concerning to all of us, as Open has pointed to AMD/ATI doing the same thing. Basically it's an auto-gimp of the GPU when it senses it has gone over a certain set threshold of temp/power draw... and while they claim there is no effect on gaming right now, how do we know for sure that these new GPUs (ATI and NVIDIA) are performing their best under heavy gaming conditions? Definitely more testing should be done.

Quote:
Originally Posted by Open1Your1Eyes0 View Post
ATI actually did this two years ago using Catalyst.
Damn. Is this still going on? Someone mentioned that Catalyst gimps the cards when it recognizes Furmark/OCCT being run, whereas the NVIDIA cards seem to have another mechanism in place that gimps performance once a certain power/temp threshold is reached. I am seriously contemplating either a 580 or a 6970, but if either or both cards have these features, that seems very concerning.

Quote:
Originally Posted by Open1Your1Eyes0 View Post
Anyone else experiment with this yet? Would love to see more results.
Indeed. I'm surprised this thread hasn't gotten any attention. Does any review site other than Fud know about this?

Quote:
Originally Posted by Open1Your1Eyes0 View Post
Apparently NVIDIA (just like ATI) caught on to reviewers "overstating" the power consumption and heat their previous cards produced due to Furmark being "too intensive", and they decided to masquerade the Furmark results this time (pointing out that Furmark doesn't accurately represent real-world results, which is what most people care about). Personally, I agree this was a pretty poor decision (from both companies). However, I will give NVIDIA credit for at least not hiding the fact that this limitation exists and in fact pointing it out in their slides. Unfortunately, back when ATI did this, they really said nothing about it (not that there was too much concern there, because the workaround was stupidly easy: just renaming the Furmark.exe filename).
Indeed. But it is quite concerning that both companies are doing this. Again, I'm still not sure what the AMD/ATI situation is (only Furmark/OCCT-triggered, or can it also be triggered by a pure temp/power-draw threshold?), and as I understand from your OP, the 580 will auto-gimp based on Furmark/OCCT as well as in high power-draw situations... i.e., maybe during an intense scene in Crysis 2... hence it could lead to a lower minimum FPS?

Quote:
Originally Posted by Open1Your1Eyes0 View Post
Unfortunately, I don't happen to have an ATI card around to test this but maybe someone else can do a quick run for us?
Indeed. When I'm out of this Dallas hotel (need to take care of some school stuff), I'll try to see what I can do to test my little 5850s.

It seems I'll have to plug in a second monitor and watch the clocks while running Furmark/OCCT/games/etc. on the main monitor...
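
Actually, a quick script could just log the clocks to a file instead, so no second monitor is needed. A minimal sketch below; the nvidia-smi query only works on NVIDIA cards, so for my 5850s an ATI-side reporting tool (GPU-Z's log-to-file, for instance) would have to stand in:

Code:
# Minimal clock/power logger -- a sketch, assuming nvidia-smi is on PATH.
# (NVIDIA-only; an ATI card needs a different reporting tool.)
import subprocess
import time

QUERY = ["nvidia-smi", "--query-gpu=clocks.gr,power.draw",
         "--format=csv,noheader"]

with open("gpu_log.csv", "w") as log:
    for _ in range(300):  # ~5 minutes at one sample per second
        reading = subprocess.run(QUERY, capture_output=True,
                                 text=True).stdout.strip()
        log.write(time.strftime("%H:%M:%S") + "," + reading + "\n")
        log.flush()
        time.sleep(1)

Run Furmark (or a game) on the main screen while this ticks away, then compare the logged clocks against the card's rated speeds.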
The Trooper (14 items)
CPU: Core i7 2600K | Motherboard: Gigabyte P67A-UD7-B3 | Graphics: GTX 660 Ti | RAM: Corsair XMS 16GB
Hard Drive: Crucial C300 64GB, WD Caviar Black 1TB, Samsung... | Cooling: Corsair H70 | OS: Windows 7 | Monitor: Hanns·G 28"
Keyboard: Lenovo | Power: Antec HCP-1200 | Case: Thermaltake Armor VA8000 | Mouse: Logitech G500
post #30 of 80
Quote:
Originally Posted by nist7 View Post
Damn. Is this still going on? Someone mentioned that Catalyst gimps the cards when it recognizes Furmark/OCCT being run, whereas the NVIDIA cards seem to have another mechanism in place that gimps performance once a certain power/temp threshold is reached. I am seriously contemplating either a 580 or a 6970, but if either or both cards have these features, that seems very concerning.
Find out with your 5850s. It's as simple as renaming furmark.exe to something else.
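
If you'd rather script it than rename by hand, here's a rough sketch (the install path is just a guess; point it at wherever your FurMark.exe actually lives):

Code:
# Rough sketch of the rename workaround. The driver matches on the
# executable's name, so a copy under any other name slips past the check.
# The source path is an assumption -- adjust to your own install.
import shutil

SRC = r"C:\Program Files\Geeks3D\FurMark\FurMark.exe"
DST = r"C:\Program Files\Geeks3D\FurMark\FurBark.exe"

shutil.copy2(SRC, DST)
print("Run", DST, "and compare clocks/temps against the original exe.")

If the renamed copy pulls noticeably more power and heat, the throttle is name-detection; if both behave the same, the card is limiting on a power/temp threshold instead.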