Is it worth cooling your GPUs, or should you run them hot? In this session, we discuss how operating temperature affects the computational performance of GPUs. Temperature-dependent leakage currents contribute significantly to power dissipation in nanometer-scale circuits; in GPUs this manifests as reduced performance per watt. We use the CUDA-based xGPU code for radio astronomy to benchmark Fermi and Kepler GPUs while controlling the GPU die temperature, voltage, and clock speed. We report on the trends we observe and relate these measurements to physical leakage current mechanisms.
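To give a feel for the effect the abstract describes, here is a minimal sketch of the temperature dependence of subthreshold leakage, using the standard simplified model I ∝ V_T² · exp(−V_th / (n·V_T)) with thermal voltage V_T = kT/q. The threshold voltage, slope factor, and temperatures below are illustrative placeholders, not values from the talk, and the model ignores the (also temperature-dependent) shift in V_th itself.

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K, so K_B * T gives volts directly


def subthreshold_leakage(temp_k, v_th=0.3, n=1.5, i0=1.0):
    """Relative subthreshold leakage current at absolute temperature temp_k.

    Simplified model: I = i0 * V_T^2 * exp(-v_th / (n * V_T)), V_T = kT/q.
    v_th (threshold voltage, V), n (subthreshold slope factor), and i0 are
    illustrative constants, not measurements from the benchmarks above.
    """
    v_t = K_B * temp_k  # thermal voltage in volts
    return i0 * v_t**2 * math.exp(-v_th / (n * v_t))


cool = subthreshold_leakage(330.0)  # ~57 C die temperature
hot = subthreshold_leakage(360.0)   # ~87 C die temperature
print(f"leakage grows by about {hot / cool:.1f}x over a 30 K rise")
```

Even this toy model shows leakage current growing super-linearly with temperature, which is why a hotter die burns more watts for the same computation and drags down performance per watt.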
Danny Price (Postdoctoral Fellow, Harvard-Smithsonian Center for Astrophysics)