I think most people agree that there is a temperature difference between the input and output of a radiator. The question is just how big that difference is and how it affects our system. That is why I wrote in my previous post in this thread that there is hardly any difference: my stance on the subject is that the difference is small, in fact so small that it should not be the dictating factor when deciding loop order.
The following link is to a test that, to the best of my understanding, empirically shows that loop order is insignificant:
I have also tried to look for help in the theoretical world of physics (in which I am certainly no expert, so excuse me if I make a mistake in my calculations). I have made some assumptions about a typical system and then tried to calculate the theoretical difference between the input and output temperature of the radiator.
- All water-cooled components are daisy chained, and the radiator is daisy chained as well. The radiator is placed after the components (worst-case scenario)
- The water-cooled components dump 500 W of heat into the water (this might correspond to one serious graphics card, the CPU, motherboard cooling, and the pump's heat dump)
- Flow rate: 2 GPM (a flow rate many aim at or want to exceed)
- The system is fully loaded and the water temperature has reached equilibrium
When the computer is turned on, the water is at room temperature. Since there is no difference between radiator temperature and ambient, no heat is removed by the radiator. However, there is a big difference between CPU temperature and water temperature, so a lot of the CPU's heat is removed (and the CPU runs at its coolest possible). Since heat is added and none removed (only at the very start), the water temperature rises. As soon as it does, the radiator starts to remove heat. At first it removes less heat than the components add, because heat transfer is linear in the difference between radiator temperature and ambient, and that difference is still small. So the water temperature continues to rise.

At some point the radiator removes exactly the same amount of heat as the components add. If this were not the case, the water would eventually start to boil (assuming heat is only removed at the radiator, which I think is a fair and conservative assumption). This is the point I call equilibrium. If the workload changes, the equilibrium changes as well. For this discussion the worst case is the CPU and GPU working flat out, because this yields the biggest heat dump into the water, hence the biggest heat removal by the radiator, which again means the biggest temperature difference between input and output.
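The warm-up behavior described above can be sketched with a toy simulation. Everything here except the 500 W load is an assumption I made up for illustration: a hypothetical 1 L loop volume and a hypothetical radiator whose heat removal is linear in the water-to-ambient difference with a made-up conductance of 50 W/C.

```python
AMBIENT_C = 25.0        # room temperature
HEAT_IN_W = 500.0       # heat dumped by the components (from the post)
K_RAD_W_PER_C = 50.0    # hypothetical radiator conductance, W per degree C
WATER_MASS_G = 1000.0   # hypothetical 1 L of loop water
C_WATER = 4.186         # specific heat of water, J/(g*C)

def simulate(seconds, dt=1.0):
    """Step the loop water temperature forward from a cold start."""
    t_water = AMBIENT_C  # water starts at room temperature
    for _ in range(int(seconds / dt)):
        # radiator removes heat in proportion to (water - ambient)
        removed_w = K_RAD_W_PER_C * (t_water - AMBIENT_C)
        # net heat raises the water temperature via Q = m * c * dT
        t_water += (HEAT_IN_W - removed_w) * dt / (WATER_MASS_G * C_WATER)
    return t_water
```

With these made-up numbers, equilibrium sits where the radiator sheds exactly 500 W, i.e. 500 / 50 = 10 C above ambient, and `simulate(2000)` lands very close to 35 C. Different (real) radiator and loop parameters change the numbers but not the shape of the story.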
The heat flow equation:
Q = m * c * dT
- Q = Heat (measured in Joules. 1 joules = 1 Watt second)
- m = mass (in this case the mass of the water going through the radiator in the timeframe we calculate over)
- c = specific heat, the heat capacity per unit mass of a body. This variable depends on the material (in this case water), its temperature and the pressure. For water it is close to 4 J/(g*C) (more precisely about 4.19) at 15 C and 101.325 kPa. I don't think it changes much at other temperatures and pressures, but this is where my physics knowledge ends.
- dT = Change in temperature.
We can use this formula to calculate the theoretical change in temperature between input and output of the radiator.
What we know:
- 2 gallons of water pass through the radiator each minute. This corresponds to about 7.5 liters of water per minute, which equals roughly 125 ml/second. 125 ml of water weighs approximately 125 g, so m = 125 g in our formula above (per second of operation)
- 500 W must be removed by the radiator (since 500 W is added by the components and we are in equilibrium, 500 W must be removed as well). Over one second, the heat removed is Q = 500 W * 1 s = 500 Ws
- c = (approx.) 4 J/(g*C) = 4 Ws/(g*C), where C = Celsius; remember 1 joule = 1 Ws by definition
- dT = temperature difference we want to compute.
Entering these values into the heat flow equation yields:
500 Ws = 125 g * 4 Ws/(g*C) * dT
500 Ws = 500 Ws/C * dT
dT = 1.0 C
So theoretically there should be a 1 degree difference in temperature between input and output of the radiator. If 1000W of energy is added the difference would be 2C and so forth.
From the above (and assuming I made no mistakes in my application of the formula) I personally conclude:
- There is a difference. However, it is small.
- The difference in input/output is not enough for me (even for a big system) to dictate loop order. I would personally prefer tidiness, simplicity and reduced tubing (easier to bleed, fill, drain...).
- Also look at what happens if you spread the radiators out between the blocks (typically one before the CPU, one after). The radiators then share the work of removing the heat, and the difference in input/output temperature of EACH radiator will be smaller.
- A good question now is: what does a 1 C improvement in water temperature do to the CPU temperature? My guess is very little, a guess backed up by the people who say they tried different loop orders without any change in CPU temperature.
- If one insists on the importance of loop order, the above suggests you should put all radiators in series and place them right in front of the CPU to get the biggest temperature drop before the water hits the CPU block.
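The radiator-sharing point can be checked with the same Q = m * c * dT arithmetic. A minimal sketch, assuming n identical radiators in series that (at equilibrium) each remove an equal share of the heat while seeing the full flow; the function name is my own:

```python
C_WATER = 4.186  # specific heat of water, J/(g*C)

def per_radiator_delta_t(power_w, flow_gpm, n_radiators):
    """In/out temperature difference across EACH of n radiators in series."""
    # full loop flow passes through every radiator
    grams_per_second = flow_gpm * 3.785 * 1000.0 / 60.0
    # each radiator only removes its share of the total heat
    return (power_w / n_radiators) / (grams_per_second * C_WATER)
```

So with two radiators sharing the 500 W load at 2 GPM, each sees roughly half the ~1 C difference of a single radiator, which is why splitting them further evens out the water temperature around the loop.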
Gah, this took quite a while to figure out and write. I hope it will be useful reading. This is why water cooling is fun...
/Nordar