
GPU aftermarket cooler

post #1 of 4
Thread Starter 
OK, so my dad has been complaining about how the GTX 570 he has is heating up the room, so I'm going to get him an aftermarket cooler of some kind. I need some suggestions, as I'm kind of new to aftermarket coolers.
post #2 of 4
It's going to cost you about $100 for a new cooler.
You would be better off selling the card and getting something newer.
post #3 of 4
Wow wow wow...concept alarm:

Heat vs. temperature:
Heat = the energy released by your card.
Temperature = the operating temperature of the card, the room, or whatever.

The heat released by your card is the same no matter what; whatever cooler you get only helps transfer it into the room's air.

A "better" cooler will take those say 200Ws of heat and dump them in the room keeping the card @ 60oC.
The "stock" cooler will take those 200Ws and won't be able to dump them in the room before the card reaches 85oC.

Either way, you will be heating up the room with 200Ws. Nothing will change other than the cards temperature. It will be running "cooler", but it will be producing and dumping in the room the same heat.
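If it helps to see it with numbers, here's a rough sketch. The thermal-resistance figures are just illustrative guesses picked to line up with the 60 °C / 85 °C example above, not specs for any real cooler:

Code:
# Rough model: card temp = room temp + heat output * thermal resistance of the cooler.
# The cooler only changes the thermal resistance (theta); the 200 W never changes.

POWER_W = 200.0   # heat the GPU dumps into the room either way
ROOM_C = 25.0     # ambient room temperature (assumed)

coolers = {
    "stock cooler": 0.300,        # °C per watt -- illustrative guess
    "aftermarket cooler": 0.175,  # °C per watt -- illustrative guess
}

for name, theta in coolers.items():
    card_temp = ROOM_C + POWER_W * theta
    print(f"{name}: card sits around {card_temp:.0f} °C, "
          f"still dumping {POWER_W:.0f} W into the room")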

As a result, the room will heat up just the same, and just as fast (under identical workloads). The room's temperature rise will be the same, because the added load (200 W) is the same.
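Back-of-the-envelope, assuming a small sealed room (roughly 4 m x 4 m x 2.5 m of air) and ignoring heat leaking out through the walls and windows (a big simplification, so treat it as an upper bound), 200 W warms the air at the same rate with either cooler:

Code:
# How fast does a constant 200 W load warm the air in a sealed room?
# Assumes no heat escapes -- real rooms leak, so this overestimates the rise.

POWER_W = 200.0
ROOM_VOLUME_M3 = 4 * 4 * 2.5   # ~40 m^3 of air (assumed room size)
AIR_DENSITY = 1.2              # kg/m^3
AIR_SPECIFIC_HEAT = 1005.0     # J/(kg*K)

air_mass_kg = ROOM_VOLUME_M3 * AIR_DENSITY
temp_rise_per_hour = (POWER_W * 3600) / (air_mass_kg * AIR_SPECIFIC_HEAT)

print(f"~{temp_rise_per_hour:.0f} °C per hour, no matter which cooler is on the card")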

Is your dad gaming a lot? What is he using the card for?
Get him a 650 or something with lower wattage and OK performance; otherwise there is no way to release less heat into the room. Fight the source (it's the card, not the cooler).
Edited by pcfoo - 8/23/13 at 10:15am
post #4 of 4
Quote:
Originally Posted by pcfoo View Post

Either way, you will be heating the room with the same 200 W... Fight the source (it's the card, not the cooler).
This.

You would have to duct the warm air from the case to the outside if you want the computer's heat output to stop warming the room. Either that, or crank up the AC, which essentially removes the heat from the house for you. The only other way around it is a more efficient CPU/GPU that doesn't draw as much power and therefore produces less waste heat. That depends on newer, more efficient hardware, though, so it's generally easier said than done.
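For a rough sense of what swapping cards buys you (going off NVIDIA's published board TDPs, which are worst-case power limits rather than typical in-game draw):

Code:
# Compare the heat budget of the current card vs. a lower-wattage replacement.
# TDPs are NVIDIA's published board power figures, not measured gaming draw.

cards_tdp_w = {
    "GTX 570": 219,
    "GTX 650": 64,
}

savings = cards_tdp_w["GTX 570"] - cards_tdp_w["GTX 650"]
print(f"Up to ~{savings} W less heat dumped into the room, worst case")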