Originally Posted by NorKris
So guys, I have a question for you... my brother claims that if his GPU (or ANY GPU) hits 50-60c while running a game, he will notice a performance drop, and that this is normal... what do you guys think of that?
Wrong forum ... but it's an interesting question. Also hard to answer with your lack of details, i.e. GPU model? Air or water cooling? Max overclock or stock? How much and/or what kind of a performance drop? etc.... A drop at 50-60c I find arguable unless he's running max clocks on water ...
Here's what I know ... with my reference 570 and all the previous Nvidia GPUs I've owned (built), I did see a "measurable" performance drop in the OCCT GPU error tests when finding my max stable clocks, once temps rose above 65c (sometimes 70c), and I always set my fan profiles to try to prevent this. The newer GPUs are made to run hot, 85c all day long, and stay stable, but I also saw a slight (1-2 FPS) drop in the Crysis benchmarks when the temps got that high, and a very slight drop in Unigine Heaven as well. Additionally, at max clocks I could see minor artifacting in Furmark when the temps got to 80-85c, as opposed to 65c ...
So depending on his setup ... I think someone who really understands his components and overclocking "could" notice some performance degradation, but it's nothing I'd worry about at those temps if you're just measuring FPS.
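If he wants to actually check for a drop instead of eyeballing it, he could log temperature and FPS during a benchmark run and compare the averages below and above a threshold. A minimal sketch in Python (the `fps_drop` helper and the sample numbers are made up for illustration, loosely matching the 1-2 FPS dip above 65c I saw):

```python
# Hypothetical sketch: given logged (gpu_temp_c, fps) samples from a benchmark
# run, compare average FPS below vs. above a temperature threshold.

def fps_drop(samples, temp_threshold=65.0):
    """Return (avg_fps_cool, avg_fps_hot) for samples at or below / above the threshold."""
    cool = [fps for temp, fps in samples if temp <= temp_threshold]
    hot = [fps for temp, fps in samples if temp > temp_threshold]
    avg = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return avg(cool), avg(hot)

# Made-up log showing a small dip once the card passes 65c
log = [(60, 61.0), (63, 60.8), (66, 59.5), (70, 59.2), (72, 59.0)]
cool, hot = fps_drop(log)
print(f"cool avg: {cool:.1f} FPS, hot avg: {hot:.1f} FPS, drop: {cool - hot:.1f}")
```

Whether a drop that size matters in actual gameplay is another question, but at least the numbers would settle the argument.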
EDIT: ... oops, walked away from the screen and didn't see all the answers ...
which I "obviously" agree with. NICE LINK
Justin, I'm smarter than I thought I was ... HeeHee
Edited by TomcatV - 5/18/13 at 11:36am