Overclock.net banner

1 - 14 of 14 Posts

Registered · 1,558 Posts · Discussion Starter · #1
In a group that I'm in online, someone said this:
Quote:
Maybe someone can shed some light for me... Here's my system:

Mobo: Gigabyte UD7 Rev. 3
CPU: FX-9370 @ 1.476V, 4.6GHz
GPU: (2) R9 280X
RAM: Kingston HyperX 2133MHz 4GB
PSU: NZXT 1000W

With the system completely idle, the watt meter shows 205W.

The GPUs at full load pull 770-ish watts.

Now, with full load on both the CPU and GPUs, my total wattage drops to around 650-670W. There are no ripples on the PSU that drop below spec while running tests with it connected to an oscilloscope.
At first glance I was thinking 1000W was overkill, and maybe 750W would've been perfect.
He also said his watt meter is showing values of 770-ish or 650-670W.

How is this possible?

I thought systems don't take up that much power?

Or are the values right?
(Since he has CrossFire, and I'm assuming it's overclocked.)

EDIT: I asked, and he only overclocked the CPU.
 

Registered · 821 Posts
Quote:
Originally Posted by IMKR View Post

In a group that I'm in online, someone said this:
At first glance I was thinking 1000W was overkill, and maybe 750W would've been perfect.
He also said his watt meter is showing values of 770-ish or 650-670W.

How is this possible?

I thought systems don't take up that much power?

Or are the values right?
(Since he has CrossFire, and I'm assuming it's overclocked.)

EDIT: I asked, and he only overclocked the CPU.
Depending on load and overclocking, those GPUs could easily pull 250-270 watts each.



http://www.tomshardware.com/reviews/radeon-r9-280x-r9-270x-r7-260x,3635-18.html

That CPU, when overclocked, can easily pull 250-300 watts or more (or so I have read!). I'm still looking for tests that show it.

So yeah, that power consumption sounds possible.
 

Getting used to 'new' · 5,126 Posts
You need to factor in PSU efficiency. Even if it's 80%, that wall reading is actually only ~600W delivered to the components.
 

Registered · 1,558 Posts · Discussion Starter · #5
Quote:
Originally Posted by AcEsSalvation View Post

You need to factor in PSU efficiency. Even if it's 80%, that wall reading is actually only ~600W delivered to the components.
???

I'm taking multiple meanings out of this. What part are you referring to?
 

Go Again! · 5,871 Posts
Quote:
Originally Posted by IMKR View Post

???

I'm taking multiple meanings out of this. What part are you referring to?
So if the PC is using 480 watts, that means an 80%-efficient PSU is actually pulling 600 watts at the wall (480 / 0.80 = 600).
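The conversion being described can be sketched in a few lines. This is a minimal illustration; the 0.80 efficiency and the wattages are the hypothetical values from this thread, not measurements:

```python
# Convert between wall draw and DC load for a PSU at a given efficiency.
# Dividing by efficiency goes from component load to wall draw;
# multiplying goes the other way.

def wall_draw(dc_load_w: float, efficiency: float) -> float:
    """Power pulled from the outlet when the components draw dc_load_w."""
    return dc_load_w / efficiency

def dc_load(wall_w: float, efficiency: float) -> float:
    """Power actually delivered to the components for a given wall reading."""
    return wall_w * efficiency

print(wall_draw(480, 0.80))  # 600.0 — 480 W of components pull 600 W at the wall
print(dc_load(770, 0.80))    # 616.0 — a 770 W meter reading is ~616 W of load
```

The two directions are easy to mix up, which is exactly the confusion in the posts that follow: the watt meter sits on the wall side, so its reading is the larger number.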
 

Premium Member · 6,021 Posts
Quote:
Originally Posted by IMKR View Post

???

I'm taking multiple meanings out of this. What part are you referring to?
The efficiency factor determines how much power the supply draws from the electrical outlet and how much of it is dissipated as heat. If the power supply in this example has an efficiency of 80%, then 20% of the power taken from the outlet is wasted. So while the PC uses 600 watts, the power meter shows 750 watts (since 750 × 80% = 600).

AcEsSalvation's math was incorrect.

Quote:
Originally Posted by IMKR View Post

Damn, team red isn't really power friendly. -.-
A single Titan draws nearly 500W when OC'd.

With heavy OCs, efficiency goes out the window.
 

Getting used to 'new' · 5,126 Posts
I rounded down, by the way. But the point is that people see figures like this and then tell others that these cards draw an insane amount of power. That gives the cards a bad name and just fuels brand loyalty.
Also, about the Titan: it has the same power draw as any other 8+6-pin card. Always remember PSU efficiency and the connectors.
 

Premium Member · 6,021 Posts
Quote:
Originally Posted by AcEsSalvation View Post

I rounded down, by the way. But the point is that people see figures like this and then tell others that these cards draw an insane amount of power. That gives the cards a bad name and just fuels brand loyalty.
No, you did 800 / 20 = 600. It's 750 × 80% = 600. It's not the same.
Quote:
Also about the Titan, same power draw as any other 8+6 pin card.
Incorrect. Not every GPU with 8+6 pins pulls 500 watts when OC'd. LOL. Titans and just a couple of other single-GPU GK110 cards have voltage unlocked up to 1.600V.
 

Registered · 1,558 Posts · Discussion Starter · #10
So I understand how you got the 600W power draw.

But where did the 750W come from (to get 750W × 80%)?

Or was it just a hypothetical value for an example?

I understand the wasted heat, efficiency, and such,

but I don't understand where you got the value from (from my first post?).

So are you guys saying that the NZXT 1000W still isn't needed?

And that even though the "GPUs at full load are pulling 770-ish watts", it's actually 616W (770 × 80%)?

So a 750W PSU would be good enough?
 

Premium Member · 6,021 Posts
Quote:
Originally Posted by IMKR View Post

So I understand how you got the 600W power draw.

But where did the 750W come from (to get 750W × 80%)?

Or was it just a hypothetical value for an example?

I understand the wasted heat, efficiency, and such,

but I don't understand where you got the value from (from my first post?).

So are you guys saying that the NZXT 1000W still isn't needed?

And that even though the "GPUs at full load are pulling 770-ish watts", it's actually 616W (770 × 80%)?

So a 750W PSU would be good enough?
The 750W was the corrected math for AcEsSalvation's post: 750W at the wall, 600W delivered by the PSU. Yes, a 750-watt PSU would be fine.
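Putting numbers on that answer — a rough sketch, assuming the watt meter reads wall-side power and a flat 80% efficiency. Real efficiency curves vary with load, so this is an estimate, not a spec:

```python
# Rough PSU sizing check using the figures from this thread.
WALL_READING_W = 770   # watt-meter reading with the GPUs at full load
EFFICIENCY = 0.80      # assumed flat efficiency; real curves vary with load

dc_load_w = WALL_READING_W * EFFICIENCY  # power the components actually draw
headroom_w = 750 - dc_load_w             # spare capacity on a 750 W unit

print(f"Components draw about {dc_load_w:.0f} W")            # about 616 W
print(f"A 750 W PSU would leave ~{headroom_w:.0f} W spare")  # about 134 W
```

Since a PSU is rated by the DC power it can deliver, the ~616W component load fits under a 750W rating with margin, which is why the 1000W unit looks like overkill here.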
 

Getting used to 'new' · 5,126 Posts
Quote:
Originally Posted by Swolern View Post

No, you did 800 / 20 = 600. It's 750 × 80% = 600. It's not the same.
770 × 0.8 = 616.
Like I said, I rounded down. How would you know I did 800 / 20 = 600? That isn't even correct...
 