
Galaxy 9600GT OC - some questions

post #1 of 9
Thread Starter 
My brand new Galaxy KFA2 9600GT OC arrived from NCIX today! I'm so happy. I'm running it on ForceWare 174.16 by Xtreme-G and have just set up RivaTuner 2.07 to monitor clocks, fan speed, temperatures, and so on. But I'm a bit confused by what I see:


  1. The specs on the product page say a 1674MHz shader clock, but Riva's hardware monitoring says 1600MHz. When I go to the overclocking tab, it says it's set at 1625MHz. What gives?
  2. I was about to ask why the memory clock says 1000MHz when the specs say 2000MHz, but then I read the specs more closely and answered my own question: 512MB 256-bit 1ns GDDR3 Memory (1000MHz clock - 2000MHz effective)
post #2 of 9
1. Regarding this one, I'm not quite sure. It MAY have something to do with your PCI-E bus speed, but I don't know for sure, so I'll let someone else answer...

2. DDR stands for Double Data Rate. To keep things very simple: whatever bus speed the memory runs at is effectively doubled with DDR, making the 1000MHz bus on your video memory 2000MHz effective. This works because DDR memory transfers data on both the rising and falling edges of the clock signal, whereas RAM prior to DDR only transferred on one edge. It seems you figured this one out yourself; I'm just giving you the reasoning as to why it works this way.
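
If you want to put numbers on it, the arithmetic is simple. Here's a minimal sketch in Python (just for illustration; the 256-bit bus width comes from the specs you quoted):

```python
# Effective DDR rate and peak theoretical bandwidth for this card's
# memory interface (1000 MHz real clock, 256-bit bus, per the specs above).
real_clock_mhz = 1000
bus_width_bits = 256

effective_mhz = real_clock_mhz * 2                    # two transfers per cycle
bandwidth_gb_s = effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(f"Effective clock: {effective_mhz} MHz")        # 2000 MHz
print(f"Peak bandwidth:  {bandwidth_gb_s:.1f} GB/s")  # 64.0 GB/s
```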
post #3 of 9
Thread Starter 
Okay, so I think I'm getting the hang of this overclocking business. I'm using ATITool 0.27 to run the artifact scan while I use RivaTuner 2.07 to ramp up my clocks slowly. First I did the core clock: the artifact scan started showing yellow artifacts once I got up to 780MHz, but looked stable at 775MHz. I then ran 3DMark06, but it crashed on the 2nd test (Firefly Forest). So I bumped the core down to 771MHz and it passed; I reconfirmed by testing 775MHz again, and sure enough it crashed.

So then I set the core back to the default 675MHz and am now starting on the shader clock. This one seems interesting: I noticed it was set at 1625MHz but the hardware monitoring was only showing 1600MHz. I increased the shader in steps but found that the monitoring graph only changes in increments of 50MHz. Is this normal? Right now I'm at 1850MHz with no problems in the artifact scan. Oh, and I am keeping a close eye on my temps; the core is reading 58°C right now (I've set up Riva to push the fan to max upon reaching 50°C).

I'm relatively new to overclocking NVIDIA cards, but would you guys say I have the right idea? And after I find all 3 max clocks, I should be able to combine them and, tada, that's my max stable overclock, right?
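
In pseudocode, what I'm doing by hand is roughly the loop below (just a sketch; is_stable is a hypothetical stand-in for "apply the clock in RivaTuner, run ATITool's artifact scan, then confirm with 3DMark06"):

```python
# Rough sketch of the manual process above. is_stable() is a hypothetical
# stand-in for: apply the clock in RivaTuner, run ATITool's artifact scan,
# then confirm with a 3DMark06 pass -- all of which I'm doing by hand.

def find_max_stable(start_mhz, step_mhz, is_stable):
    clock = start_mhz
    while is_stable(clock):
        clock += step_mhz      # ramp up slowly, one step at a time
    return clock - step_mhz    # the last clock that passed everything

# For my core clock this landed at 771MHz (775MHz crashed 3DMark06).
```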
post #4 of 9
Yes, it's normal. In fact, ALL of your clocks will go up in 'chunks'; it's just more noticeable on the shaders because the jumps are so much bigger.

And yes, you have the right idea. OC ON, my friend!
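
If it helps to picture why the monitored clock snaps like that, here's a minimal sketch of the idea (the 50MHz shader step is just what you observed on your card; other clock domains and other cards use different, usually smaller, steps):

```python
# Sketch: the card can only generate certain discrete clocks, so a
# requested value gets snapped to an achievable step. The 50 MHz shader
# step is taken from the OP's observations, not a universal constant.

def snap_down(requested_mhz, step_mhz):
    """Snap to the highest achievable step at or below the request."""
    return (requested_mhz // step_mhz) * step_mhz

print(snap_down(1625, 50))  # 1600 -- why the monitor read 1600 at a 1625 setting
print(snap_down(1850, 50))  # 1850 -- already on a step, so it sticks
```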
post #5 of 9
I can get my shader clock to 1850 stable with the stock 675 core, and I can get the core to 775 stable with the stock 1700 shader. But when I run the shader AND the core at those overclocked speeds together, I get crashes and blue screens. It seems the shader clock doesn't like to go above 1750 when my core is at 750MHz or higher, so I just unlinked the shader and core clocks to set them separately. Right now I'm running 750 core, 1725 shader and 2000 mem.
post #6 of 9
Thread Starter 
Yes, I found I had to unlink the two from the very start. I get quite different overclocks than you, though; right now I'm running 760 core, 1800 shader and 1080 mem (2160 effective).

The memory number is the only one I can't compare directly: you were stating yours in DDR (effective) values, right?
post #7 of 9
I get a 2000MHz effective mem speed.
post #8 of 9
Quote:
Originally Posted by amdcpu4life View Post
I can get my shader clock to 1850 stable with the stock 675 core, and I can get the core to 775 stable with the stock 1700 shader. But when I run the shader AND the core at those overclocked speeds together, I get crashes and blue screens. It seems the shader clock doesn't like to go above 1750 when my core is at 750MHz or higher, so I just unlinked the shader and core clocks to set them separately. Right now I'm running 750 core, 1725 shader and 2000 mem.
Actually I believe that would be 756 core, 1728 shader and 1998 memory

Here's a little 'article' on another forum concerning this subject...

http://www.thetechrepository.com/showthread.php?t=133
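
Incidentally, all three of those corrected figures are multiples of 27MHz, consistent with the clocks being synthesized in fixed steps from a reference crystal. A quick sketch of the rounding (the 27MHz step is my inference from those three numbers, not something verified against his card; as noted later in the thread, step sizes vary from card to card):

```python
# Sketch: round a requested clock to the nearest multiple of a fixed step.
# The 27 MHz step is inferred from 756/1728/1998 all being multiples of 27;
# actual step sizes vary by card and clock domain.

def nearest_step(requested_mhz, step_mhz=27):
    return round(requested_mhz / step_mhz) * step_mhz

print(nearest_step(750))       # 756  (core)
print(nearest_step(1725))      # 1728 (shader)
print(nearest_step(1000) * 2)  # 1998 (memory: real clock rounded, quoted as DDR)
```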
post #9 of 9
Thread Starter 
Quote:
Originally Posted by brettjv View Post
Actually I believe that would be 756 core, 1728 shader and 1998 memory

Here's a little 'article' on another forum concerning this subject...

http://www.thetechrepository.com/showthread.php?t=133
Looking over that article, I can see it's based on an 8800GTX, and from my experience the 9600GT has different "zones". For example, his shader goes up in 23MHz steps whereas mine goes in 50MHz steps. I suspect every card is different, so I don't think you can use his guide to say definitively what another person's clocks "mean".