Originally Posted by Nostrano
But would it really be worth the additional power consumption?
*Current power consumption is 800w.
*Let's say he did a mild overclock on the Q6600s from 2.4GHz to 3GHz.
*We'll say the increased power consumption from each quad is 50w, so across the six quads that means a 300w increase, or 1100w for the entire system.
The cluster was able to render 24 frames in 4k format in 64 min. With the 600MHz increase, it could do the same amount of work in about 52 min (64 × 2.4/3.0 ≈ 51).
Therefore, it was using: 64min/60 * 800w/1000 = 0.85kWh per batch
Overclocked, it would use: 52min/60 * 1100w/1000 = 0.95kWh per batch
Assume electricity costs $0.10 per kWh.
Therefore, he is spending about 1 cent more per batch to save about 12 min.
(I hope my math is right!)
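For anyone who wants to sanity-check the numbers, here's a quick sketch of the same arithmetic; the wattages, render times, and the $0.10/kWh rate are just the assumptions from the post above.

```python
def energy_kwh(minutes, watts):
    """Energy used to render one 24-frame batch, in kilowatt-hours."""
    return (minutes / 60) * (watts / 1000)

stock_kwh = energy_kwh(64, 800)    # stock: 800w system for 64 min
oc_kwh = energy_kwh(52, 1100)      # overclocked: 1100w system for 52 min

rate = 0.10                        # assumed electricity cost, $/kWh
extra_cents = (oc_kwh - stock_kwh) * rate * 100

print(f"Stock: {stock_kwh:.2f} kWh, OC: {oc_kwh:.2f} kWh")
print(f"Extra cost: {extra_cents:.1f} cents to save {64 - 52} minutes")
```

Running it confirms the ~0.1 kWh difference, i.e. about one cent per batch at that rate.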
Originally Posted by Brutuz
If he got two 1000watt PSU's he'd be able to overclock them and add a lot more motherboards to it.
He would have to be wary of the 3.3v and 5v rail draw with all that memory and those HDs. Newer PSUs provide more power to the +12v rail.

Edited by DuckieHo - 6/11/08 at 9:29am