
1 - 4 of 4 Posts

Registered · 2 Posts · Discussion Starter #1
I have a Radeon HD 7950 overclocked to 1100/1575. Default is 900/1250.
How much more energy (W) does it consume? A percentage is fine.
 

Registered · 2,746 Posts
You'd need to know the voltage, not just the clocks of your card. Even then it can be tricky, since the card won't stay at 100% load (and therefore its highest voltage) all the time. Just pick up a Kill-A-Watt meter, plug it into the wall, and plug the computer into that. Measure at stock and at your OC. That will give you an accurate real-world measurement, which you can then check while gaming, at idle, and during basic use, since your power consumption will vary with the task.
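The arithmetic from two wall readings is simple; a minimal sketch (the wattage numbers below are made-up placeholders, not measurements from any particular system):

```python
# Turn two Kill-A-Watt wall readings into an absolute and percentage delta.
stock_w = 310.0   # whole-system draw at stock clocks (hypothetical)
oc_w    = 365.0   # whole-system draw at the overclock (hypothetical)

delta_w = oc_w - stock_w
pct = 100.0 * delta_w / stock_w
print(f"extra draw: {delta_w:.0f} W ({pct:.1f}% over stock)")
# → extra draw: 55 W (17.7% over stock)
```

Note this measures the whole system at the wall, so the percentage is relative to total system draw, not the GPU alone.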
 

High Clocker · 3,439 Posts
Kill-A-Watt is the only way to tell.

But for reference's sake, here are numbers from my old 8350 with 7970 CF.

8350 @ 5ghz
Both GPUs at stock.
650W from the wall during 3dmark 11 (peak)

Both GPUs at 1300mhz/1500mhz with 1.3V
920W from the wall during 3dmark 11 (peak)

So overclocking the graphics cards added 270W.
135W extra per card.
Normal gaming that would probably be closer to 100W per card.

Your overclock is a lot less, so I'm guessing it'd be more like 50W of extra usage with a slight voltage bump; if you're pumping 1.3V into it, it might be more like 60W.
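That guess lines up with the usual first-order rule that CMOS dynamic power scales as P ∝ f · V². A rough sketch; the voltages and the dynamic-power figure below are assumptions for illustration, not measured values for the HD 7950:

```python
# First-order estimate using the CMOS dynamic-power relation P ∝ f * V^2.
# All numbers are assumptions, not measurements of any particular card.
stock_mhz, stock_v = 900.0, 1.09    # assumed stock core clock / core voltage
oc_mhz, oc_v       = 1100.0, 1.17   # assumed OC clock / slightly bumped voltage

dynamic_w = 150.0   # assumed dynamic (clock/voltage-scaled) share of stock draw

ratio = (oc_mhz / stock_mhz) * (oc_v / stock_v) ** 2
extra_w = dynamic_w * (ratio - 1.0)
print(f"scaling factor: {ratio:.2f}, estimated extra draw: {extra_w:.0f} W")
# → scaling factor: 1.41, estimated extra draw: 61 W
```

Static leakage and the rest of the board don't scale with clocks, which is why only part of the card's draw is fed through the formula; with a bigger voltage bump the V² term dominates quickly.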
 