
Registered · 1,194 Posts · Discussion Starter · #1
Well, I've been stuck at this for months now, and I just can't seem to get it right. People have gotten the 285 to OC a ton on the stock cooler, and I can't even get mine one increment above the stock OC without something artifacting.
So can somebody walk me through it completely: how do you overclock a GTX 285? Because I'm all confused. I've got the shader straps down, and I use FurMark to test for artifacts.
What procedure should I follow? I've been using RivaTuner.
 

Registered · 1,202 Posts
Quote: Originally Posted by grahamcrackuh
Go to EVGA's website and download EVGA Precision and the EVGA GPU Voltage Tuner! These are fairly simple interfaces that should let you easily OC it. Test for artifacts using ATITool.

This. I couldn't have said it better.
 

Premium Member · 8,051 Posts
I don't think the 285 is compatible with the Voltage Tuner, but get Precision for sure.
 

Registered · 1,194 Posts · Discussion Starter · #9
I know what software to use, but how exactly do I overclock? What are the steps?
 

Iconoclast · 32,307 Posts
I had to underclock my card (BFG 285 OC) with the stock cooler to get it fully stable (long runs of FurMark). However, it OCs fairly well with my new cooler.

ATITool is getting pretty dated. It doesn't heat up cards well, and it often won't detect artifacts at obviously unstable speeds.
 

Registered · 1,194 Posts · Discussion Starter · #13
But there's a shader clock too.
What exactly am I supposed to do? Is there a comprehensive guide on this?
 

Registered · 55 Posts
It's fairly simple.

Core clock --------()--------
Shader --------()--------
Memory ---------()----------

This is roughly what your EVGA Precision sliders will look like, except your core clock and shader will be linked. Always keep your core and shader clocks linked unless you know what you're doing. Move your core clock and shader to the RIGHT by 10-20 MHz and push APPLY. So if your stock clocks are, say, 650 / 1650 / 900, then your OC'd clocks will look like 670 / 1670 / 920.
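
In pseudocode, the raise-and-test loop looks something like this (a minimal Python sketch; set_clocks() and run_stress_test() are hypothetical stand-ins for the steps you'd do by hand in EVGA Precision and FurMark, not real APIs):

Code:

# Minimal sketch of the raise-and-test loop described above.
# set_clocks() and run_stress_test() stand in for the manual steps:
# applying clocks in EVGA Precision and running a FurMark-style test.

STEP_MHZ = 20  # raise by 10-20 MHz per pass, per the advice above

def set_clocks(clocks):
    # Hypothetical: in reality you drag the sliders and hit APPLY.
    print("apply", clocks)

def run_stress_test(minutes=15):
    # Hypothetical: in reality you watch FurMark for artifacts/crashes.
    return input(f"Stable after {minutes} min? (y/n) ") == "y"

def find_stable_clocks(stock):
    clocks = dict(stock)  # e.g. {"core": 650, "shader": 1650, "memory": 900}
    while True:
        candidate = {name: mhz + STEP_MHZ for name, mhz in clocks.items()}
        set_clocks(candidate)
        if not run_stress_test():
            set_clocks(clocks)  # artifacts or a crash: revert to last good
            return clocks
        clocks = candidate      # stable, so keep stepping up

The 650 / 1650 / 900 to 670 / 1670 / 920 example is exactly one pass of this loop with a 20 MHz step.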

Also, delete RivaTuner if you haven't already (no offense, RT guys). RivaTuner is for a more "power user" crowd. If you want a nice-looking, simple interface, use EVGA Precision. It's my favorite anyway.
 

Registered · 1,194 Posts · Discussion Starter · #16
I use RivaTuner for almost everything GPU-related: seeing my GPU stats in real time on my G15 LCD, controlling fan speed with profiles, forcing drivers to comply with it in driver-emulation mode, showing available video memory, etc.
I know how to overclock; I just don't know what the first steps should be or how to take them.
People also say artifacts are like a tiny dot or small rectangles here and there, but my card just ****ing crashes and displays a blank screen if I go too far.
 

Registered · 1,194 Posts · Discussion Starter · #18
Quote: Originally Posted by Angmaar
Just download GPU Tool. Then press "find max core/memory".

It sorta worked for a few minutes. I think it passed 666 MHz and 678 MHz, and then a big white screen came up and the thing froze.
 

Registered · 55 Posts
Quote: Originally Posted by hackm0d
I use RivaTuner for almost everything GPU-related: seeing my GPU stats in real time on my G15 LCD, controlling fan speed with profiles, forcing drivers to comply with it in driver-emulation mode, showing available video memory, etc.
I know how to overclock; I just don't know what the first steps should be or how to take them.
People also say artifacts are like a tiny dot or small rectangles here and there, but my card just ****ing crashes and displays a blank screen if I go too far.

You can do that with EVGA Precision too, just FYI. As stated above, back your card off about 10 MHz or so from what GPU Tool finds and you should be fine, assuming temperatures and artifacting aren't an issue. Good luck.
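
The "back it off" advice is just subtracting a safety margin from whatever maximum the scan reports. A hypothetical Python sketch, plugging in the 666/678 MHz numbers from the post above:

Code:

# The "back your card off about 10 MHz" advice above, in Python.
# The maxima come from GPU Tool's "find max core/memory" scan; they sit
# right at the edge of stability, so step back before calling it final.

MARGIN_MHZ = 10

def safe_clock(found_max_mhz):
    return found_max_mhz - MARGIN_MHZ

print(safe_clock(666), safe_clock(678))  # -> 656 668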
 

Registered · 600 Posts
I've had quite an interesting experience overclocking my BFG GTX 275. I picked it up the other day, uninstalled and driver-cleaned my ATI drivers in safe mode, then installed the 186.18 WHQL driver as well as EVGA Precision, FurMark, TechPowerUp's GPU Tool, and OCCT 3.1. I raised the clocks, tested, raised, tested until I was at 735/1584/1291 (648/1440/1152 stock) and found it perfectly stable in FurMark for over 30 minutes straight (~90C with auto fan settings).

Next, I tested it on the other two stress testers, and the clocks functioned fine in both. Then I gleefully launched TF2, joined a server and crashed. So I tried it again, and crashed again. For a while I was a little put off by this fact... how could TF2 stress the card worse than TORTURE tests? So I tried a few other games. Crysis crashed, Company of Heroes crashed, CoD4 crashed, everything crashed at clocks that were perfectly stable in torture testing.

So I dropped them back to stock to check and found it was perfectly stable, so it wasn't a driver issue or a bad card or anything. Next I decided to use CoD4 and Mirror's Edge as my stress-testing games. I kept finding setups that would function in Crysis and TF2, but for some reason not in those two. Finally I figured out that my shader clock was the big limiter, as I couldn't bring it higher than 1506 in EVGA Precision without crashes. I settled on 717/1506/1248.

Now, I still can't for the life of me understand why clocks that are stable in far more taxing tests than any game on the market are not stable in games as trivial as TF2 and CoD4. Anybody have any ideas? The power supply doesn't make sense, and neither does a bad card; I tried multiple driver versions and different OC utilities, and nothing worked besides dropping the clocks back.
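
The takeaway from that experience: treat "stable" as passing every workload you care about, games included, not just the synthetic torture tests. A hypothetical Python sketch of that testing discipline (run_test() stands in for launching each program and watching for a crash):

Code:

# Sketch of the lesson above: an overclock only counts as stable if
# every workload passes, synthetic torture tests AND real games.

WORKLOADS = ["FurMark", "OCCT 3.1", "GPU Tool",   # synthetic testers
             "TF2", "CoD4", "Mirror's Edge"]      # real games

def run_test(workload, clocks):
    # Hypothetical: in reality you launch the program and play/watch.
    return input(f"Did {workload} survive at {clocks}? (y/n) ") == "y"

def clocks_are_stable(clocks):
    # One crashing workload disqualifies the whole overclock, even if
    # the torture tests all pass for 30+ minutes.
    return all(run_test(w, clocks) for w in WORKLOADS)

print(clocks_are_stable("717/1506/1248"))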
 