
[OFFICIAL] GTX 670 Overclocking - Page 39

post #381 of 1562
There are only two reasons performance will drop as the overclock goes up: either you pushed your card past a temperature threshold (70C and 80C) that made it throttle itself down, or your overclock is unstable. The easiest way to tell which it is: check your max temperature after an entire bench run. If it's over 70C, increase your fan speed just enough that it never goes over 70C, then rerun. If performance goes up after that, you know temperature was causing the issue; if it stays the same, your overclock is unstable. Usually only an unstable memory overclock will actually reduce performance. An unstable core overclock will usually just BSOD, crash your display driver, or cause artifacting.
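If you'd rather log it than eyeball it, here's a minimal sketch of that check in Python, assuming nvidia-smi is on your PATH and your driver supports its --query-gpu interface (an assumption; not every driver era exposes it):

Code:
import subprocess
import time

# Poll the GPU temperature once a second for the length of a bench
# run and report the peak, so you can tell whether you crossed the
# 70C throttle threshold.
def peak_temp(seconds=120):
    peak = 0
    for _ in range(seconds):
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader,nounits"],
            text=True)
        peak = max(peak, int(out.strip().splitlines()[0]))
        time.sleep(1)
    return peak

if __name__ == "__main__":
    print(f"Peak GPU temp: {peak_temp()}C")

Run it alongside your bench pass; if the peak comes back over 70C, bump the fan speed and rerun before blaming the overclock itself.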
post #382 of 1562
Quote:
Originally Posted by Exolaris View Post

Can anyone explain to me where everyone is getting this 6GHz memory speed from? My GTX 670 Windforce OC is listing at like 1700MHz memory clock in Precision and GPU-Z, and that's with a +425 OC on it. Also, my graphics score in 3DMark 11 was 9847, but my physics score was only 7767. I have an i5 2500k OC'd to 4.5GHz; should my physics score really be that low?

1700MHz (the number in GPU-Z) is the actual memory frequency. But GDDR5 is quad-data-rate memory; in other words, four bits are transferred per pin per clock cycle rather than one. So the effective memory frequency could be said to be 4 x 1700MHz (actual) = 6800MHz.

Afterburner and Precision are still calibrated to GDDR3, which is double-data-rate memory; that's why your 1700MHz appears as 3400MHz in those tools. These three possible multipliers (even though the one AB/Precision uses is technically invalid here, because these cards don't carry GDDR3) are why the numbers you see quoted in all the Kepler (or Fermi, for that matter) threads can seem all over the place as far as memory clocks go.
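To make the three readings concrete, here's the arithmetic as a quick Python sketch. The 1502MHz stock figure is the reference GTX 670 memory clock; applying the offset to the double-rate number is an inference from the +425 offset showing ~1700MHz above, so treat that as an assumption:

Code:
# Reference GTX 670 memory clock (actual/command clock), in MHz.
STOCK_ACTUAL = 1502

def clocks_from_offset(offset_mhz):
    # Assumed: Precision/Afterburner apply the memory offset to the
    # double-rate (GDDR3-style) number, so halve it for the actual clock.
    actual = STOCK_ACTUAL + offset_mhz / 2
    return {
        "actual (GPU-Z)": actual,            # 1x command clock
        "AB/Precision reading": actual * 2,  # 2x, GDDR3-style
        "effective (GDDR5)": actual * 4,     # 4x data rate
    }

for label, mhz in clocks_from_offset(425).items():
    print(f"{label}: {mhz:.0f}MHz")
# actual ~1714MHz, tool reading ~3429MHz, effective ~6858MHz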

As far as the other question goes, I believe HT (Hyper-Threading) might make a big difference to the Physics score, and your score sounds right for a non-HT chip.
    
post #383 of 1562
Should the boost clock always kick in when you play a game? I'm now having issues with BF3 where it only runs at my overclock settings and never kicks in the boost clock.


Edit: I was testing this while streaming, and it looks like my CPU bottlenecks my overclock or something while I'm livestreaming. That's what caused the missing boost clock in BF3.

Although Arma 2 still not pushing toward the boost clock when I'm not streaming is puzzling.


Edit: Now it looks fine, so confused D:
Edited by Lirik - 5/30/12 at 6:40pm
    
post #384 of 1562
Quote:
Originally Posted by Lirik View Post

Should the boost clock always kick in when you play a game? I'm now having issues with BF3 where it only runs at my overclock settings and never kicks in the boost clock.
Edit: I was testing this while streaming, and it looks like my CPU bottlenecks my overclock or something while I'm livestreaming. That's what caused the missing boost clock in BF3.
Although Arma 2 still not pushing toward the boost clock when I'm not streaming is puzzling.
Edit: Now it looks fine, so confused D:

A 2500k should not bottleneck BF3 plus streaming. My 670 maxes out its clocks even in graphically trivial games like League of Legends. Your boost clock may not kick in for one of two reasons: the overclock is not stable (at the current power target), or temperatures are above 70 degrees. If you have already maxed your power target, lower your core clock a little and try again. Use the full performance monitoring graphs in Precision (double-click the small graphs) to see how your temperatures behave. For initial stability testing, just manually set the fans to full blast.

Edit: I haven't seen anyone else report this, but I found instability when maxing out the voltage setting in Precision, so I just left it at the default. Also, when I maxed out the power target (122% for a stock EVGA card) I saw crashes in games (not in Heaven). I have now been running stable with the same clocks at a 120% power target for a week. So I'd recommend that once you find stable clocks, you lower your power target until you see Heaven scores drop.
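If you want a log instead of watching Precision's graphs, a sketch like this records core clock and temperature together, again assuming a driver whose nvidia-smi exposes the clocks.gr query (that may not hold on older drivers):

Code:
import subprocess
import time

# Log core clock and temperature once a second while a game runs,
# so you can see whether boost engages, holds, or throttles away.
while True:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=clocks.gr,temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True)
    clock_mhz, temp_c = (v.strip() for v in out.split(","))
    print(f"{time.strftime('%H:%M:%S')}  core {clock_mhz}MHz  {temp_c}C")
    time.sleep(1)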
Edited by zerocraft - 5/30/12 at 8:44pm
post #385 of 1562
Quote:
Originally Posted by zerocraft View Post

Quote:
Originally Posted by Lirik View Post

Should the boost clock always kick in when you play a game? I'm now having issues with BF3 where it only runs at my overclock settings and never kicks in the boost clock.
Edit: I was testing this while streaming, and it looks like my CPU bottlenecks my overclock or something while I'm livestreaming. That's what caused the missing boost clock in BF3.
Although Arma 2 still not pushing toward the boost clock when I'm not streaming is puzzling.
Edit: Now it looks fine, so confused D:

A 2500k should not bottleneck BF3 plus streaming. My 670 maxes out its clocks even in graphically trivial games like League of Legends. Your boost clock may not kick in for one of two reasons: the overclock is not stable (at the current power target), or temperatures are above 70 degrees. If you have already maxed your power target, lower your core clock a little and try again. Use the full performance monitoring graphs in Precision (double-click the small graphs) to see how your temperatures behave. For initial stability testing, just manually set the fans to full blast.

Edit: I haven't seen anyone else report this, but I found instability when maxing out the voltage setting in Precision, so I just left it at the default. Also, when I maxed out the power target (122% for a stock EVGA card) I saw crashes in games (not in Heaven). I have now been running stable with the same clocks at a 120% power target for a week. So I'd recommend that once you find stable clocks, you lower your power target until you see Heaven scores drop.
I've definitely seen low boost clocks in less demanding games. Shoot, in MW3 I was at like 900MHz.
post #386 of 1562
Nice! I think I got stable at about a +480 offset on the memory, but it started to sketch out a little after that. Weirdly, it was dying on the combined physics test, although I'm 100% sure my bench clock is stable. Still, +480 on the mem ain't bad at all. I think I topped out at around 10638 in 3DMark 11. I guess that 10700 isn't going to be reached until they stop being cheap and unlock the damn voltages. Also, I never saw a temp higher than 62C with the fan speed on auto (it never went above 45%). I just reapplied the thermal paste with some MX-4; it seems to be doing the job nicely.

I might have a tweak today; I'm stupidly ill and stuck at home.
post #387 of 1562
Quote:
Originally Posted by brettjv View Post

As far as the other question goes, I believe HT (Hyper-Threading) might make a big difference to the Physics score, and your score sounds right for a non-HT chip.

I appreciate the response, but I'm a little confused about this part. Unless I'm making a false assumption somewhere in my line of reasoning: if a 2600k or another HT CPU gives a significant boost to the 3DMark Physics score, doesn't that imply a hyperthreaded CPU would provide some tangible benefit for gaming? I ask because I was under the impression that most people recommended getting a 2600k over a 2500k only if you were going to be doing CPU-intensive work like video editing, encoding, or folding. Why would that be the recommendation if HT provides significant physics processing gains in games?
post #388 of 1562
Quote:
Originally Posted by Exolaris View Post

I appreciate the response, but I'm a little confused about this part. Unless I'm making a false assumption somewhere in my line of reasoning: if a 2600k or another HT CPU gives a significant boost to the 3DMark Physics score, doesn't that imply a hyperthreaded CPU would provide some tangible benefit for gaming? I ask because I was under the impression that most people recommended getting a 2600k over a 2500k only if you were going to be doing CPU-intensive work like video editing, encoding, or folding. Why would that be the recommendation if HT provides significant physics processing gains in games?

The 3DMark physics test is synthetic. Few games take full advantage of four cores, and when Hyper-Threading is used, each individual thread runs slower. Hyper-Threading has a negative effect on fps in most games, albeit a marginal one.
post #389 of 1562
Quote:
Originally Posted by lightsout View Post

I've definitely seen low boost clocks in less demanding games. Shoot, in MW3 I was at like 900MHz.

Are you running vsync? My boost clock is pretty much maxed out in MW3 for my 120fps target (I am using Virtu for anti-tearing, so vsync is off everywhere).
post #390 of 1562
Quote:
Originally Posted by whybother View Post

The 3DMark physics test is synthetic. Few games take full advantage of four cores, and when Hyper-Threading is used, each individual thread runs slower. Hyper-Threading has a negative effect on fps in most games, albeit a marginal one.

There's this ^^^ and, to be a little more specific about what he means by 'synthetic', there's also the design of that physics test. It is specifically built to place the bottleneck on the CPU rather than the GPU; that's why it looks so janky: all blocky, no AA, no shadows, no pixel shading, a single static light source, etc.

The express purpose of that test is to measure your multi-threaded CPU performance. It's basically not reflective of how an actual game would stress your system; there would never be such a high ratio of physics calcs (CPU) to rendering calcs (GPU) in a real game. Plus, like WB said above, the vast majority (like 99.9%) of games aren't coded to use more than 4 threads ... most can only use 1 or 2, in fact.
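As a toy illustration (nothing like 3DMark's actual physics code, just a CPU-bound stand-in) of why a test built to scale across every available thread rewards more logical cores while a 1-2 thread game wouldn't, you can time the same fixed workload split across different worker counts:

Code:
import multiprocessing as mp
import time

def spin(n):
    # Busy arithmetic loop standing in for CPU-bound physics work.
    s = 0
    for i in range(n):
        s += i * i
    return s

def bench(workers, total=20_000_000):
    # Split a fixed amount of work across `workers` processes.
    chunk = total // workers
    start = time.perf_counter()
    with mp.Pool(workers) as pool:
        pool.map(spin, [chunk] * workers)
    return time.perf_counter() - start

if __name__ == "__main__":
    for w in (1, 2, 4, 8):
        print(f"{w} workers: {bench(w):.2f}s")

The scaling flattens once you run out of cores that can do independent work, which is roughly what the physics score probes and what most games never ask for.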
Edited by brettjv - 5/31/12 at 9:38am
    