The thing with dew point is that it doesn’t change with air temperature. It depends only on the mass of water vapor in the air. How close the air temperature is to the dew point is what determines relative humidity. The only ways to prevent condensation are to keep water temperatures above the room’s dew point at all times, or to remove moisture from the room air so the dew point itself drops.
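To make that concrete, here is a minimal sketch using the Magnus approximation for dew point (constants a = 17.62, b = 243.12 °C are one common choice, valid roughly 0–60 °C). It shows that if you warm the room without adding or removing moisture, relative humidity drops but the dew point stays essentially the same — which is why raising room temperature doesn’t fix a condensation problem:

```python
import math

A, B = 17.62, 243.12  # Magnus coefficients, deg C

def sat_vapor_pressure_hpa(temp_c):
    """Saturation vapor pressure over water (Magnus form), in hPa."""
    return 6.112 * math.exp(A * temp_c / (B + temp_c))

def dew_point_c(temp_c, rh_percent):
    """Dew point from air temperature and relative humidity."""
    gamma = A * temp_c / (B + temp_c) + math.log(rh_percent / 100.0)
    return B * gamma / (A - gamma)

# Room at 25 C and 50% RH: dew point is about 13.9 C.
dp_before = dew_point_c(25.0, 50.0)

# Warm the same air to 30 C with no moisture change:
# actual vapor pressure is unchanged, so RH falls...
e = 0.50 * sat_vapor_pressure_hpa(25.0)
rh_after = 100.0 * e / sat_vapor_pressure_hpa(30.0)  # ~37% RH

# ...but the dew point is (to within the approximation) the same.
dp_after = dew_point_c(30.0, rh_after)

print(f"dew point before: {dp_before:.1f} C")
print(f"RH after warming: {rh_after:.0f} %")
print(f"dew point after:  {dp_after:.1f} C")
```

So a warmer room only feels safer; any surface at or below ~14 °C in that air will still condense.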
Raising the room air temperature may seem to help, but only because you are increasing the temperature differential between the outer surfaces of your loop and the actual water temperature, thanks to the thermal resistance of the materials containing the water. That effect is highly sensitive to localized air convection around the components and to nearby radiant heat sources. Your safest bet is simply to keep the water temperature above the dew point. You could install a thermostatic 3-way valve that blends radiator supply water with recirculated computer-loop water, put the sensing probe on the supply line headed to the computer, and set the valve's control temperature to something safely above the average ambient dew point. That way you get a consistent cooling supply temperature no matter how cold the radiator gets. You could probably do the same thing with an Arduino microcontroller and a temperature sensor driving some kind of servo valve, and in that case you could also incorporate an ambient humidity sensor for a dynamic water-temperature setpoint that rises and falls with changes in ambient dew point.
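The dynamic-setpoint idea above can be sketched as a few lines of setpoint logic. This is a hedged illustration only: the sensor and valve functions (`read_ambient()`, `set_valve_setpoint()`) are hypothetical stand-ins for whatever hardware interface you actually use, and the 2 °C safety margin and 15 °C floor are assumed values you would tune for your setup:

```python
import math

A, B = 17.62, 243.12  # Magnus coefficients, deg C

def dew_point_c(temp_c, rh_percent):
    """Dew point via the Magnus approximation."""
    gamma = A * temp_c / (B + temp_c) + math.log(rh_percent / 100.0)
    return B * gamma / (A - gamma)

def water_setpoint_c(ambient_c, rh_percent, margin_c=2.0, floor_c=15.0):
    """Target loop-water temperature: dew point plus a safety margin,
    never below a fixed floor (so the loop still cools usefully)."""
    return max(dew_point_c(ambient_c, rh_percent) + margin_c, floor_c)

def read_ambient():
    """Hypothetical sensor read; replace with your DHT22/SHT31/etc. driver."""
    return 25.0, 50.0  # deg C, % RH

def set_valve_setpoint(temp_c):
    """Hypothetical valve/servo command; replace with your actuator code."""
    print(f"valve setpoint -> {temp_c:.1f} C")

# One pass of the control loop (on an Arduino this would run periodically):
ambient_c, rh = read_ambient()
set_valve_setpoint(water_setpoint_c(ambient_c, rh))
```

The key design choice is that the setpoint tracks the dew point rather than a fixed number, so on a humid day the controller automatically warms the loop water instead of letting it sweat.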