Absolutely. It's actually major overkill.
Check this out:
Guru3D did a review of three GTX 680s in order to test SLi and 3-way SLi. One of the things they did in this review was measure power consumption in three different configurations (link): first with one GTX 680, then with two in SLi, and finally with all three in 3-way SLi. For each test, they measured the power consumption at the wall outlet while running an intense 3D application, so that only the 680(s) were stressed. They won't reveal which 3D application they use, but here's what they said about it:
> Note: there has been much discussion about using FurMark as a stress test to measure power load. FurMark is so hard on the GPU that it does not represent an objective power draw compared to really hefty gaming. If we take a very-harsh-on-the-GPU gaming title, then measure power consumption and then compare the very same with FurMark, the power consumption can be 50 to 100W higher on a high-end graphics card solely because of FurMark.
>
> We decided to move away from FurMark in early 2011 and are now using a game-like application which stresses the GPU 100% yet is much more representative of the power consumption and heat levels coming from the GPU. We are, however, not disclosing which application that is, because we do not want AMD/ATI/NVIDIA to 'optimize & monitor' our stress test whatsoever, for objective reasons of course.
Here's a note about their test system:
> Our test system is based on a power-hungry Core i7 965 / X58 system. This setup is overclocked to 3.75 GHz. Next to that we have energy-saving functions disabled for this motherboard and processor (to ensure consistent benchmark results). On average we are using roughly 50 to 100 Watts more than a standard PC due to higher CPU clock settings, water-cooling, additional cold cathode lights, etc.
First, I'll show the power draw of their system as measured at the wall outlet for all three tests individually. After that, I'll add 200W to account for the maximum possible power draw of everything else in the system, and then I'll multiply each of the three results by 0.85 because their PSU is likely to be about 85% efficient. The wall-outlet figure includes the PSU's conversion losses, so the actual DC load on the PSU is lower than what's measured at the wall.
One GTX 680 in their system under full load resulted in a total system power draw at the wall outlet of 307W.
Two GTX 680s in SLi in their system under full load resulted in a total system power draw at the wall outlet of 473W.
Three GTX 680s in 3-way SLi in their system under full load resulted in a total system power draw at the wall outlet of 649W.
Adding 200W to each:
- 507W (one GTX 680)
- 673W (two GTX 680s in SLi)
- 849W (three GTX 680s in 3-way SLi)
These are the theoretical absolute-maximum power draws of their test system at the wall outlet if absolutely everything in it were completely maxed out: fans, hard drive, optical drive, etc. However, their PSU is not 100% efficient, which means the components would actually be pulling less than 507W, 673W, or 849W from the PSU itself.
Here is what the system would be pulling from the PSU during each test, assuming 85% efficiency:
- 431W (one GTX 680)
- 572W (two GTX 680s in SLi)
- 722W (three GTX 680s in 3-way SLi)
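The arithmetic above can be sketched as a short script. The measured wall-outlet numbers are Guru3D's; the 200W allowance and the 85% efficiency figure are my own assumptions, as explained above:

```python
# Guru3D's measured wall-outlet draw under full GPU load, in watts.
measured_wall_draw = {
    "one GTX 680": 307,
    "two GTX 680s in SLi": 473,
    "three GTX 680s in 3-way SLi": 649,
}

HEADROOM_W = 200        # assumed allowance: everything else in the box maxed out
PSU_EFFICIENCY = 0.85   # assumed: a quality PSU is roughly 85% efficient

for config, wall_w in measured_wall_draw.items():
    worst_case_wall = wall_w + HEADROOM_W
    # The wall figure includes conversion losses, so the DC load the
    # PSU must actually deliver is the wall draw times the efficiency.
    psu_dc_load = round(worst_case_wall * PSU_EFFICIENCY)
    print(f"{config}: {worst_case_wall}W at the wall -> ~{psu_dc_load}W from the PSU")
```

Running this reproduces the 431W / 572W / 722W figures in the list above.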
However, I'm being unrealistically conservative here: adding 200W assumes that everything, down to the little things like hard drives, optical drives, fans, and memory, is completely maxed out, and that the CPU and the GPU(s) hit their limits at the same time, which rarely (if ever) happens.
So yeah, you could use a quality 750-850W power supply.