Originally Posted by PhantomGhost
This conversation has gotten very interesting. I think most everyone is right on this topic, it's just some people are talking about different things is all. Sooo I figure I will throw my 2 cents in as well.
Going off what Mega Man said, I think this is where the latest confusion/misunderstanding is. Bond, you keep saying voltage is the key, voltage is the key, when that isn't true, WATTAGE is the key. A CPU under water putting out 125 watts and a CPU on air putting out 125 watts will indeed be putting out the same amount of heat, but like has been stated over and over, the water loop's thermal mass will soak up that energy and release it into the room at a much slower rate. So if the CPU or GPU isn't running at its max continuously, but instead for say, just 3 hours, like Red suggests, then yes, the same amount of heat is put out over time, but the room temperature won't spike as much because the heat trickles out over a longer period, compared to air.
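If it helps, here's a rough back-of-the-envelope in Python. All the masses and the air cooler figure are hypothetical numbers I picked for illustration (same spirit as the made-up numbers later in this thread): it just shows how much more slowly a water loop's coolant warms up than a small aluminum heatsink for the same 125 watts going in, which is why the heat reaches the room gradually instead of all at once.

```python
# Back-of-the-envelope: how fast does the cooling medium itself warm up
# if NO heat escaped to the room? (Hypothetical masses, for illustration only.)

SPECIFIC_HEAT_WATER = 4186   # J/(kg*K), specific heat of water
SPECIFIC_HEAT_ALU = 897      # J/(kg*K), specific heat of aluminum

water_mass_kg = 1.5          # hypothetical loop holding ~1.5 L of coolant
heatsink_mass_kg = 0.5       # hypothetical air cooler's aluminum mass

cpu_power_w = 125            # heat being dumped into the cooler

def temp_rise_per_minute(power_w, mass_kg, specific_heat):
    """Kelvin per minute the medium warms, ignoring heat lost to the room."""
    return power_w * 60 / (mass_kg * specific_heat)

print(round(temp_rise_per_minute(cpu_power_w, water_mass_kg, SPECIFIC_HEAT_WATER), 2))   # 1.19 K/min
print(round(temp_rise_per_minute(cpu_power_w, heatsink_mass_kg, SPECIFIC_HEAT_ALU), 2))  # 16.72 K/min
```

Real coolers shed heat to the air the whole time of course, so neither actually climbs like that, but the ~14x difference in thermal mass is the buffering effect being described.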
What Mega is saying is that a CPU under water, since heat is transferred away from it so much more effectively, draws less amperage at the same speeds, mostly because leakage current in the silicon climbs with temperature. So while it may have the same voltage, it draws less current to be stable, and therefore puts out less wattage (P = V x I). A CPU on air might be at 1.45v pulling 90 amps (130.5 watts) because of that temperature-driven leakage, while a water cooled CPU could be at 1.45v only pulling 80 amps (116 watts) since the heat is being carried away so well. I just made up those numbers, they aren't real numbers pulled from a test or anything, just to illustrate the point.
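And just to sanity-check those made-up numbers with P = V x I:

```python
# P = V * I, using the illustrative (not measured) numbers from the post.
voltage = 1.45
amps_air, amps_water = 90, 80

watts_air = voltage * amps_air
watts_water = voltage * amps_water

print(round(watts_air, 1))                # 130.5 W on air
print(round(watts_water, 1))              # 116.0 W under water
print(round(watts_air - watts_water, 1))  # 14.5 W less heat into the room
```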
This is my take on it anyway, from an electrical engineering perspective.