
Need help on forcing 3D clocks on my GTX 460 (to fix stuttering issue) - Page 5  

post #41 of 60
Thread Starter 
Well, I have somewhat good news.

Basically, Killing Floor's problem is the game's own, not the drivers'. I've tried everything: forcing Vsync, changing performance mode, got it to run at like 60-130FPS maxed out, but when I turn on the flashlight on my gun in the game, the 460 freaks out and downclocks for some reason, hence why the framerate hits unbearably low levels (this, as I've stated numerous times before, has absolutely nothing to do with my CPU). Turns out this game hates some Nvidia GPUs or something, as I've read a lot of people with 200-series Nvidia GPUs also had performance problems, and since this game is built on the UT2004 engine, it inherits some unfixed problems the original game had. Screw this game; I can wait till I get a new PC, probably with an AMD card, to max it either late this year or early next year.

As for L4D2, it also seems to be the game's problem. First off, the most obvious sign is the notice I get at the start from Valve (that my card and drivers are unsupported). Then I played through 2 single player campaigns offline and never got any stuttering whatsoever (despite spending like 2-3hrs on this), yet played a game of Scavenge soon afterwards and ran into stuttering 10 minutes in. Did the netstat thing, and it looks like the packet count spikes around the time the stuttering happens (possibly some netcode problem or something Valve screwed up in their updates). I will look into this some more today and tomorrow and post back here. Problem is, the stutter is NOT in any way related to the load on screen, and therefore not in any way related to my processor. My CPU can be at like 80% load and, even with no zombies on the screen, bam, out of nowhere I get stutter, which seems to last no more than 10 seconds every time before going back to normal. When it's normal, I can spam pipe bombs in a finale, and although I hit a lower than average framerate due to the game's CPU-based physics (the lowest I've seen it go was 19FPS; my average is like 57FPS with 16XQ CSAA and everything but CPU effects maxed), it never hits unplayable levels like it does with the stuttering (where I hit 5FPS or less for the same set amount of time every 20-30 minutes or so).
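For anyone who wants to reproduce the "netstat thing" more systematically, here's a minimal sketch in Python that logs per-second packet rates so spikes can be matched against stutter timestamps. This is an illustration, not the exact method used above (that was plain netstat), and it assumes the psutil package is available:

Code:
# netlog.py - log per-second network packet rates so spikes can be matched
# against stutter timestamps. Sketch only; assumes the 'psutil' package.
import time
import psutil

prev = psutil.net_io_counters()
while True:
    time.sleep(1.0)
    cur = psutil.net_io_counters()
    rx = cur.packets_recv - prev.packets_recv   # packets received this second
    tx = cur.packets_sent - prev.packets_sent   # packets sent this second
    print(f"{time.strftime('%H:%M:%S')}  rx={rx} pkt/s  tx={tx} pkt/s")
    prev = cur

Leave it running in the background during a Scavenge match and note the time when the stutter hits; a sustained spike in rx/tx around those timestamps would support the netcode theory.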

Quote:
Originally Posted by brettjv View Post
There's a reason people keep bringing this bottlenecking point up ... and that's because what you said here is incorrect ... at least in the general sense (not talking about this game specifically, just in general).

The reason the card downclocks to low-power 3d is because not enough stress is being placed on it (i.e. gpu usage drops below a certain point ... in my experience that is usually around 30%). A severe cpu BN, in the sense that it will essentially impose a framerate limiter, can prevent the GPU from running at its full potential. Basically, the 'load' being placed on the card becomes 'too easy', so the card downclocks.

Like I mentioned above, this happens with mine all the time since my cards are way overpowered for a lot of games, esp. console ports. So when my usage drops down to like 25%, the card(s) will drop to 405MHz, and then the gpu usage jumps up to around 50%. But it's totally seamless on my rig, no stutters, no FPS drops, no problems at all. The only way I even know about it is because I'm watching the OSD in Afterburner.

Have you noticed there's a pattern to when the card downclocks, i.e. does it only happen when the GPU usage is low (use AB OSD to check this, if you aren't already)? If it's happening with the load prior to the downclock at >50% (I say 'prior' to the downclock because the load will jump up when the card downclocks, for obvious reasons), then there is something wrong with the hardware, because that shouldn't happen.

Bottom-line, while the BN explanation may seem to 'not make sense' to you in regards to this particular game, people's 'logic' (in general) here makes total sense.



Just to be clear, I'm not going on record saying 'this problem IS being caused by a CPU BN'. However, this particular evidence against that possibility is ... not compelling. Those cards you speak of are all considerably less powerful than your current card. Less powerful cards are less likely to be bottlenecked by a less powerful CPU.



I understand your frustration, this shouldn't be happening. In nV's defense, I'm pretty sure that there's something about XP that makes it impossible for them to implement the 'switch' to full power mode that you're asking for. I don't think it's 'laziness', IOW. XP can't support a bunch of other stuff, too, like quad-SLI for example.

And I also understand that you're just looking for a way to 'turn it (power-saving) off', you aren't interested in things like the 'underlying problem'. But you are coming off a little rude-sounding toward people who're trying to help you. They are not 'wrong' in suspecting CPU BN being the issue here, because like I said, a CPU BN CAN cause this exact issue.

V-Sync is for sure not the path to a solution, in fact, as you discovered, it will work in the exact opposite direction.

Also ... neither the 8800GT nor the 8600GTS had a 'low power 3d' mode. In fact I don't think they even had a 2d mode, if memory serves. So you are misremembering what your buddy's 'issue' was. Sorry

My best advice remains what I suggested above: a bios mod to replace the values for the low-power 3d clocks with the values used for full-power 3d clocks so the clocks won't change on low load. You can always flash it back to the original bios when you move the card to your HTPC ... flashing back takes like 1 minute.

That, or update your OS ... because ultimately I think the problem comes down to XP lacking the ability to do what you're asking for.
1. I didn't say it was clocking down to a low-power 3D mode, but to the 2D mode, since that was all those cards had (3D or 2D). I don't know if this was introduced only with G92 and later 8000-series cards, or with certain revisions, but it happened. The 8600GTS was OEM anyway; I don't know about the 8800GT I used, since it was my friend's card.
2. Vsync uses more GPU memory, which could be why I get worse performance (so it could be running into a framebuffer problem, i.e. not having enough VRAM), but other people with more powerful cards and systems than mine run into very similar problems when enabling Vsync on Valve's "unsupported cards". Blame Valve's incompetence for this.
3. No way is the GPU clocking down normal, even on your PC, but at least you have the option to force 3D clocks via drivers, which I don't.
4. As I said before, I wish the problem was that easy (the CPU bottleneck everyone keeps bringing up) but the stuttering I'm getting seems to only be when I'm playing L4D2 online.
5. Sorry, but I'm gonna call BS on the "new OS" comment about running my card at full clocks. The card is entirely dependent on the manufacturer's drivers (as opposed to some generic Windows drivers), and it's already carrying out power mode switching right now. If it didn't have the ability to switch power modes, it would just run at full clocks; in fact, if I run this GPU in my PC without drivers installed, it runs at full clocks. Any GPU-related power saving feature Vista/7 have, XP can have. It would work differently, sure, but it is still possible, and it is down to Nvidia to make it work. The drivers dictate what power mode to run at, and I blame Nvidia entirely for this, since they could very well add the option to XP if they wanted to.

Thanks for the thought-out reply, nevertheless.



Quote:
Originally Posted by [T]yphoon View Post
Now that I think about it, regarding the 3D profile:
I can set my 2D profile the same as the 3D profile, and that forces the 2D profile to run at max everything, without power saving.
The program is called NvidiaInspector 1.9.5.5 (it's in the overclock settings).
One thing: you can't overclock the voltage, so that one stays at the 2D profile value (mine is at a minimum of 0.913V, and in 3D I can change it from 0.913 to 1.10V).
Thanks for this suggestion, but unfortunately the card seems to ignore any OC or profile switch I make in this program (the same problem I had in Afterburner). The Fermis seem to have a mind of their own with regard to clocks.

Quote:
Originally Posted by nukefission View Post
+1 to bottleneck
I had a 9800GT on a 3500+ @ 3GHz and Prototype ran at 3-15 FPS.
The same 9800GT on a Q8300 gave me 50+ FPS at the same res (1440x900).
That game's requirements well exceed those of my PC, and of the one you were using (at least CPU-wise). And the game is a console port, so that's not surprising. But like I said, if I had gotten poor performance in games all around, then I wouldn't be complaining, as that would indicate a bottleneck. Like when I tried to run Darksiders on my PC: it's playable, but it runs in slow-mo most of the time and is laggy during intense boss fights and such. That I can understand. But the 2 games I originally mentioned ran perfectly before on the exact same rig with older graphics cards, which indicates that this has nothing to do with my CPU (I've already logged it during stuttering, and the only indicator of something wrong is the card clocking down). And it wouldn't explain why I can max the Mass Effect 2 demo flawlessly, same with Crysis on all High settings (DX9) with 4XAA, and Call of Juarez: Bound in Blood.

I'm gonna leave some feedback on their drivers to get this fixed in the meantime; I'm not sure I'm ready to risk bricking my working card over a driver problem Nvidia could fix.
Edited by Am* - 4/23/11 at 9:49am
    
CPU: Intel Celeron G530 | Motherboard: Haven't decided yet | Graphics: GTX 460 768MB | RAM: 4GB Ripjaws
Hard Drive: Samsung F3 1TB (SSD coming later) | OS: XP/Windows 7 | Monitor: Same old... | Power: Yet to decide
Case: Fractal Design Define R3
post #42 of 60
Brettjv's suggestion to modify the BIOS worked perfectly for me. 2D clocks stay at 51MHz, and anything 3D stays at 715MHz, where I want it.

Used NiBiTor and Gigabyte's BIOS flasher; it was super easy, although I did have to go in and manually modify the version tag inside the file, or the flasher refused to flash since it thought it was the same BIOS I already had. If you're using nvflash, it won't be an issue.
post #43 of 60
Thread Starter 
Quote:
Originally Posted by ratbuddy View Post
Brettjv's suggestion to modify the BIOS worked perfectly for me. 2D clocks stay at 51MHz, and anything 3D stays at 715MHz, where I want it.

Used NiBiTor and Gigabyte's BIOS flasher; it was super easy, although I did have to go in and manually modify the version tag inside the file, or the flasher refused to flash since it thought it was the same BIOS I already had. If you're using nvflash, it won't be an issue.
Can you please tell me how you did this? NiBiTor says it can't find clock rates for my card; do I need to boot this thing?
    
post #44 of 60
I pulled my BIOS with GPU-Z and NiBiTor showed it as corrupt, so I didn't use that one. I just used the F3 BIOS I downloaded from Gigabyte as a starting point. For your card, you'll probably want an MSI BIOS, http://www.techpowerup.com/vgabios/7...68.100622.html if I'm not mistaken.

Open that in NiBiTor and make the clock entries for performance levels 1 and 2 the same as level 3. You can safely leave level 0 alone unless the card is actually going into 2D mode during gaming.

You know what, screw it. I'll just do it real quick. The BIOS attached is factory settings for a GTX 460 Cyclone 768MB, except levels 1 and 2 are now full clocks all the time. 2D clocks are unchanged. Voltages for levels 1 and 2 now match level 3 as well, so you needn't worry about that. Nothing else is changed. Use at your own risk, of course.
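If you'd rather verify a modified ROM before flashing it, a quick byte-level diff against the stock dump confirms that only a handful of bytes (the performance-level clock/voltage entries) changed. A minimal Python sketch; the filenames are hypothetical:

Code:
# romdiff.py - byte-level diff of two video BIOS dumps, to sanity-check that
# a modified ROM differs from stock in only a few places before flashing.
import sys

stock = open(sys.argv[1], "rb").read()    # e.g. stock.rom
modded = open(sys.argv[2], "rb").read()   # e.g. modded.rom
if len(stock) != len(modded):
    print(f"size differs: {len(stock)} vs {len(modded)} bytes")
diffs = [(i, a, b) for i, (a, b) in enumerate(zip(stock, modded)) if a != b]
print(f"{len(diffs)} byte(s) differ")
for offset, a, b in diffs[:32]:           # cap the listing at 32 entries
    print(f"  0x{offset:06X}: {a:02X} -> {b:02X}")

Run it as `python romdiff.py stock.rom modded.rom`; more than a few dozen differing bytes would be a red flag that something other than the clock/voltage tables was touched.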
post #45 of 60
Thread Starter 
Thanks very much man. If this works, I'm gonna rep ya.
Edited by Am* - 4/23/11 at 12:14pm
    
post #46 of 60
Quote:
Originally Posted by Am* View Post
2. Vsync uses more GPU memory, which could be why I get worse performance (so it could be running into a framebuffer problem, i.e. not having enough VRAM), but other people with more powerful cards and systems than mine run into very similar problems when enabling Vsync on Valve's "unsupported cards". Blame Valve's incompetence for this.
The issue with v-sync as it relates to your symptoms is that it's a frame-rate limiter. Hence using it will lower gpu usage vs. having it off (presuming that the game would normally run at >60 fps). The mechanism used by the card to reduce the clocks to low-power 3d mode is driven by the GPU usage. Thus, using v-sync increases the probability of the card downclocking by making it more likely that the GPU usage will be reduced below the 'trigger' point for downclocking.

And v-sync's impact on VRAM usage is actually very minimal. Triple-buffering, I think, can use more memory, though, if you use it.

Quote:
3. No way is the GPU clocking down normal, even on your PC, but at least you have the option to force 3D clocks via drivers, which I don't.
Au contraire. This downclocking, as it happens on my PC, is absolutely normal (and even desirable), because I use v-sync. If it happens without v-sync (or a framerate limiter of some kind), then it is not 'normal'. Without a framerate cap, the only time it 'should' happen is if you're massively CPU bottlenecked.

Like I say, my cards will perform this switch completely seamlessly ... no stutter, no FPS loss, no problems.

Quote:
5. Sorry, but I'm gonna call BS on the "new OS" comment about running my card at full clocks. The card is entirely dependent on the manufacturer's drivers (as opposed to some generic Windows drivers), and it's already carrying out power mode switching right now. If it didn't have the ability to switch power modes, it would just run at full clocks; in fact, if I run this GPU in my PC without drivers installed, it runs at full clocks. Any GPU-related power saving feature Vista/7 have, XP can have. It would work differently, sure, but it is still possible, and it is down to Nvidia to make it work. The drivers dictate what power mode to run at, and I blame Nvidia entirely for this, since they could very well add the option to XP if they wanted to.

Thanks for the thought-out reply, nevertheless.
Just curious ... what app do you run to positively determine the GPU clocks you're at without any driver installed at all? Every app I've seen that reports GPU clocks in real-time requires the driver to be present.

IOW I'm curious as to how you've determined that the card runs at 100% (i.e. full-power 3d) if you uninstall the driver ... if that is in fact what you were saying?

In truth I'm about 99% certain that the card itself is programmed to sense the GPU load, and do the three different power modes at the hardware level. IOW, I'm pretty sure that it's not the driver that 'creates' the switching mechanism (although apparently the driver can override it, at least in Vista/7).

I'm not going to claim that I'm 100% positive about the NVCP situation with the power mode, but I've READ that the XP OS for one reason or another is not capable of supporting the 'full power' switch in the NVCP. Maybe it's baloney, but that's what I've read.

Quote:
But like I said, if I had gotten poor performance in games all around then I wouldn't be complaining as that would indicate a bottleneck.
I'm afraid that this is ... not logical. CPU bottlenecks are *entirely* situational. Nobody has one all the time, in all the games they play. And they will vary significantly in severity when you do get them.

Whether or not your CPU will act as the limiting factor to performance (i.e. bottleneck) depends on what test (which game/bench) you run, and what settings (AA/Resolution, etc) you run that test at.

In fact, let's say you play a game that's a good combination of CPU and GPU demands like, say, Crysis. And you play it at settings that cause the game to run around 40fps? You can actually switch back and forth between being CPU and GPU bottlenecked thousands of times over the course of a playing session.

The idea that you either 'have' or 'don't have' a CPU bottleneck based entirely on your hardware ... is a (common, but) total misunderstanding of the situation. Every frame EVER rendered by your system is a new opportunity for either the CPU or GPU to act as the limiting factor to performance.

Quote:
But the 2 games I originally mentioned I ran perfectly before on the exact same rig with older graphics cards, which indicates that this has nothing to do with my CPU (since I've already logged it during stuttering, and the only indicator of something wrong is the card clocking down).
Got it. But unless I'm reading your post wrong, the problem w/the current GPU is NOT that you're getting lower FPS than what you got with the old cards, but rather that you get stutters because the card downclocks sometimes, right? The *average* performance in terms of FPS is actually better on the new card, is it not?

Now ... those other cards don't downclock, but even if they did, since they're much less powerful cards, they would be much less likely to downclock with a less powerful CPU than your new card would be. This being because it'd be much less likely that a CPU BN would act as a de-facto framerate limiter.

All I'm trying to explain to you is that the suspicion/explanation of a CPU bottleneck still actually fits perfectly based on the scenario you've described. Again, I'm not saying 'that's what it is', I'm saying 'the symptoms fit'. Driver/game 'incompatibility' problems can also cause these symptoms though.

To sum it up again:
1) GPU downclocking to low-power 3d is 'by design', and is triggered by the GPU usage dipping below a certain threshold. It *should* be seamless when this happens. FPS should stay the same, it shouldn't stutter, etc. All that's supposed to happen is that the card downclocks, and the GPU load % goes up in response.
2) Any type of frame-rate limiter which prevents the card from running 'full-out' will cause the GPU % usage to be lower. Common examples of framerate limiters are: v-sync, in-game framerate caps, driver compatibility issues with the game, and CPU bottlenecking.

Ergo, downclocking absolutely *can* be caused by a CPU bottleneck.
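To make the arithmetic behind points 1) and 2) concrete, here's a toy model in Python. This is NOT the actual driver logic, and the ~30% trigger is just the figure observed above, but it shows how any frame-rate limiter drags GPU usage down toward the downclock threshold:

Code:
# Toy model of the downclock trigger described above. Not the real driver
# algorithm; the 0.30 threshold is an observed figure, not a documented one.

def gpu_usage(gpu_ms, cpu_ms, cap_fps=None):
    frame_ms = max(gpu_ms, cpu_ms)            # the slower side sets the pace
    if cap_fps:                               # v-sync or any framerate cap
        frame_ms = max(frame_ms, 1000.0 / cap_fps)
    return gpu_ms / frame_ms                  # fraction of time the GPU works

THRESHOLD = 0.30                              # assumed downclock trigger point

for label, cpu_ms, cap in [("GPU-bound, uncapped", 3.0, None),
                           ("CPU-bottlenecked   ", 25.0, None),
                           ("v-synced at 60Hz   ", 3.0, 60)]:
    u = gpu_usage(gpu_ms=4.0, cpu_ms=cpu_ms, cap_fps=cap)
    print(f"{label} usage={u:4.0%} ->",
          "downclocks" if u < THRESHOLD else "stays at full 3D clocks")

The same card that runs flat-out when uncapped drops below the trigger the moment a cap (v-sync or a CPU bottleneck) holds the framerate down, which is exactly the pattern described above.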
Edited by brettjv - 4/23/11 at 3:48pm
    
CPU: Xeon X5675 6-core @ 4.1GHz (1.29v, 20x205 +HT) | Motherboard: Rampage III Extreme | Graphics: MSI RX 470 Gaming X (the $159 budget king) | RAM: 3x2GB Corsair XMS3 PC12800 (9-9-9-24-1T @ 1600MHz)
Hard Drive: Hynix 250GB SSD (boot), 2TB Deskstar (apps), 1TB... | Optical Drive: Plextor PX-712SA (still the best optical drive...) | Cooling: Corsair H80 v2 AIO | OS: W10 Home
Monitor: Asus VW266H 25.5" (1920x1200) | Power: ABS SL (Enermax Revolution), single 70A rail, 850W | Case: SilverStone RV-03 | Audio: X-Fi Titanium
post #47 of 60
Thread Starter 
Quote:
Originally Posted by brettjv View Post
The issue with v-sync as it relates to your symptoms is that it's a frame-rate limiter. Hence using it will lower gpu usage vs. having it off (presuming that the game would normally run at >60 fps). The mechanism used by the card to reduce the clocks to low-power 3d mode is driven by the GPU usage. Thus, using v-sync increases the probability of the card downclocking by making it more likely that the GPU usage will be reduced below the 'trigger' point for downclocking.

And v-sync's impact on VRAM usage is actually very minimal. Triple-buffering, I think, can use more memory, though, if you use it.

Au contraire. This downclocking, as it happens on my PC, is absolutely normal (and even desirable), because I use v-sync. If it happens without v-sync (or a framerate limiter of some kind), then it is not 'normal'. Without a framerate cap, the only time it 'should' happen is if you're massively CPU bottlenecked.

Like I say, my cards will perform this switch completely seamlessly ... no stutter, no FPS loss, no problems.

Just curious ... what app do you run to positively determine the GPU clocks you're at without any driver installed at all? Every app I've seen that reports GPU clocks in real-time requires the driver to be present.

IOW I'm curious as to how you've determined that the card runs at 100% (i.e. full-power 3d) if you uninstall the driver ... if that is in fact what you were saying?

In truth I'm about 99% certain that the card itself is programmed to sense the GPU load, and do the three different power modes at the hardware level. IOW, I'm pretty sure that it's not the driver that 'creates' the switching mechanism (although apparently the driver can override it, at least in Vista/7).

I'm not going to claim that I'm 100% positive about the NVCP situation with the power mode, but I've READ that the XP OS for one reason or another is not capable of supporting the 'full power' switch in the NVCP. Maybe it's baloney, but that's what I've read.

I'm afraid that this is ... not logical. CPU bottlenecks are *entirely* situational. Nobody has one all the time, in all the games they play. And they will vary significantly in severity when you do get them.

Whether or not your CPU will act as the limiting factor to performance (i.e. bottleneck) depends on what test (which game/bench) you run, and what settings (AA/Resolution, etc) you run that test at.

In fact, let's say you play a game that's a good combination of CPU and GPU demands like, say, Crysis. And you play it at settings that cause the game to run around 40fps? You can actually switch back and forth between being CPU and GPU bottlenecked thousands of times over the course of a playing session.

The idea that you either 'have' or 'don't have' a CPU bottleneck based entirely on your hardware ... is a (common, but) total misunderstanding of the situation. Every frame EVER rendered by your system is a new opportunity for either the CPU or GPU to act as the limiting factor to performance.

Got it. But unless I'm reading your post wrong, the problem w/the current GPU is NOT that you're getting lower FPS than what you got with the old cards, but rather that you get stutters because the card downclocks sometimes, right? The *average* performance in terms of FPS is actually better on the new card, is it not?

Now ... those other cards don't downclock, but even if they did, since they're much less powerful cards, they would be much less likely to downclock with a less powerful CPU than your new card would be. This being because it'd be much less likely that a CPU BN would act as a de-facto framerate limiter.

All I'm trying to explain to you is that the suspicion/explanation of a CPU bottleneck still actually fits perfectly based on the scenario you've described. Again, I'm not saying 'that's what it is', I'm saying 'the symptoms fit'. Driver/game 'incompatibility' problems can also cause these symptoms though.

To sum it up again:
1) GPU downclocking to low-power 3d is 'by design', and is triggered by the GPU usage dipping below a certain threshold. It *should* be seamless when this happens. FPS should stay the same, it shouldn't stutter, etc. All that's supposed to happen is that the card downclocks, and the GPU load % goes up in response.
2) Any type of frame-rate limiter which prevents the card from running 'full-out' will cause the GPU % usage to be lower. Common examples of framerate limiters are: v-sync, in-game framerate caps, driver compatibility issues with the game, and CPU bottlenecking.

Ergo, downclocking absolutely *can* be caused by a CPU bottleneck.
The average FPS, and especially the maximum, I'm getting is well above what I used to get with my 2900XT/8800GT. If I followed your theory that it's a bottleneck, it wouldn't make any sense that for 99% of the time I play, say, L4D2, I get no lag or stutter whatsoever (the frame rate is between 45-90FPS like 90% of the time).

GPU usage shouldn't be a problem; I run all my games at the highest graphical settings with the highest AA/AF levels, with some exceptions (Crysis, for example). Before, with my 8800GT, I could never go past 4XAA in a lot of games; now I can run 16XAA with driver-forced effects on top without the card even breaking a sweat.

You asked what GPU program I ran to check the clocks of a card without drivers, and it was just GPU-Z (some old version, don't remember which revision, sorry). It just stated the clock rate and memory; everything else was either blank or unknown, from what I remember.

As for how powerful this card is compared to my older GPUs, that is still irrelevant. If the card is not being maxed out enough, I add more AA/AF and other effects, or force some in the drivers.

Since you said it doesn't affect you when you play your games, maybe it's because your GPU doesn't undervolt when it goes into low-power 3D mode (the GPU seems to decide this by itself as well). I logged it during a game of Killing Floor, and the worst part is when the GPU undervolts itself, then seems to realise the load is too much for low-power mode and volts/clocks itself back up, and this repeats, rendering the game pretty much unplayable.
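If you have an Afterburner monitoring log from one of those sessions, counting the up/down transitions makes this flapping easy to quantify. A Python sketch, assuming the log has been exported to CSV; the column names 'time' and 'core_mhz' and the filename are hypothetical, so adapt them to whatever your log actually contains:

Code:
# clockflap.py - count rapid clock transitions ("flapping") in a monitoring
# log. Column names 'time' and 'core_mhz' are hypothetical; adapt as needed.
import csv

flaps, prev = 0, None
with open("afterburner_log.csv", newline="") as f:   # hypothetical filename
    for row in csv.DictReader(f):
        state = "low" if float(row["core_mhz"]) < 500 else "full"
        if prev and state != prev:
            flaps += 1
            print(f"{row['time']}: {prev} -> {state}")
        prev = state
print(f"{flaps} clock transitions total")

Dozens of transitions in a few minutes would back up the "undervolt, panic, clock back up, repeat" behaviour described above.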

I get what you're saying, and had I been talking about this issue in a different game I would consider it to be applicable. Here, in this case, I don't see how it is relevant.

EDIT: as for XP not being able to run GPUs at full power, that's incorrect. Windows 7 uses different system files and can use GPUs with loads of shaders MORE EFFICIENTLY than XP, due to using a different GPU hardware acceleration technique (which is why some newer games perform better on 7 than on XP with the same hardware specs; see Win7 vs XP gaming comparisons for this), but that has nothing to do with setting clock rates. If Nvidia or Microsoft claimed somewhere that XP can't enforce default clock rates, then that's complete BS they want people to believe, as an excuse to stop supporting that operating system and save themselves some money. Whether it's XP, Vista or 7, what clocks the card runs at is entirely dependent on what mode the drivers tell the BIOS on the card to switch to. Nvidia are already dropping most XP support for OEM notebooks; I bet they're itching to do the same for their retail products.
Edited by Am* - 4/23/11 at 5:26pm
    
post #48 of 60
Thread Starter 
Quote:
Originally Posted by ratbuddy View Post
I pulled my BIOS with GPU-Z and NiBiTor showed it as corrupt, so I didn't use that one. I just used the F3 BIOS I downloaded from Gigabyte as a starting point. For your card, you'll probably want an MSI BIOS, http://www.techpowerup.com/vgabios/7...68.100622.html if I'm not mistaken.

Open that in NiBiTor and make the clock entries for performance levels 1 and 2 the same as level 3. You can safely leave level 0 alone unless the card is actually going into 2D mode during gaming.

You know what, screw it. I'll just do it real quick. The BIOS attached is factory settings for a GTX 460 Cyclone 768MB, except levels 1 and 2 are now full clocks all the time. 2D clocks are unchanged. Voltages for levels 1 and 2 now match level 3 as well, so you needn't worry about that. Nothing else is changed. Use at your own risk, of course.
Hmm, I've got a bit of a problem. I tried loading my card's BIOS into NiBiTor to edit the clocks, and it says it can't detect clock rates for my card. Even if I use the .bin file I pull off GPU-Z or the .rom file from NiBiTor itself, and even if the integrity check is green (all good), the only settings I can see are fan speeds. I can't trust a program that could brick my card when it's not even reading the BIOS settings properly, sorry. I think I'm gonna have to wait till their next revision.
Edited by Am* - 4/24/11 at 3:33am
    
post #49 of 60
Quote:
Originally Posted by Am* View Post
Hmm, I've got a bit of a problem. I tried loading my card's BIOS into NiBiTor to edit the clocks, and it says it can't detect clock rates for my card. Even if I use the .bin file I pull off GPU-Z or the .rom file from NiBiTor itself, and even if the integrity check is green (all good), the only settings I can see are fan speeds. I can't trust a program that could brick my card when it's not even reading the BIOS settings properly, sorry. I think I'm gonna have to wait till their next revision.
He posted a modified BIOS for you; you don't even need to modify anything. Just use his BIOS and nvflash and you're good to go.
    
CPU: Core i7 920 D0 @ 4.3GHz, 1.36v 24/7, HT on | Motherboard: Foxconn Blood Rage | Graphics: EVGA GTX 590 Classified HC | RAM: 6GB Mushkin DDR3 @ 8-8-8-19 1640
Hard Drive: 64GB Kingston V100 SSD / 500GB Seagate + 500GB WD R... | OS: Windows 7 Ultimate x64 | Monitor: NEC EA232WMi + 2x eMachines, 5760x1080 surround | Power: Cooler Master 850W
Case: Corsair 800D
post #50 of 60
There's an option in the menus for Fermi clocks and volts. Don't change anything you don't fully understand.