Overclock.net › Forums › Graphics Cards › Graphics Cards - General › What controls a GFX card clock?

What controls a GFX card clock?

post #1 of 4
Thread Starter 
In a CPU it's the multiplier × FSB, but how do GPUs achieve their clock speeds? And why is it so easy to change them, compared to a CPU, which effectively requires going into the BIOS to change them?

Also, on an unrelated matter: how come the FSB on CPUs is never exact? It's always 249.46 or 251.67, why not exactly 250?
post #2 of 4
On the first I have no answer, but the second comes down to the clock generator circuit. Usually only the really good ones are that precise, or extremely close to it; mine usually sits within 100 kHz of what I set it to. The lower the quality of the board and its clock generator, the less precise control you have over it.
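To put a number on how good "within 100 kHz" actually is, here's a quick sketch expressing that deviation in parts per million (the 250 MHz setpoint is just an example value from this thread):

```python
# Express a clock-generator deviation in parts per million (ppm).
# A 100 kHz miss at a 250 MHz setpoint, as described above.

def deviation_ppm(target_hz: float, measured_hz: float) -> float:
    """Relative frequency error in parts per million."""
    return (measured_hz - target_hz) / target_hz * 1e6

target = 250e6              # 250 MHz setpoint
measured = target + 100e3   # reading 100 kHz high

print(f"{deviation_ppm(target, measured):.0f} ppm")  # 400 ppm
```

So a 100 kHz miss is only about 400 ppm of the target, which is why it counts as precise.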

At least, that's how I understand it.
post #3 of 4
For GPUs it's roughly the same idea as for CPUs, but without the ancient FSB-style plumbing; there's no need for a chipset on a graphics card, since everything is integrated. I remember an older Nvidia card I had, the 8600GT: it could only be overclocked in multiples of 12 or 16 MHz, so it was a bit like changing a multiplier against a 12 MHz base clock. Of course 12 MHz isn't directly comparable to a 1333 MHz FSB, since the underlying technology is completely different, but the principle is roughly the same. GPUs are easier to overclock mainly because a) manufacturers allow it, and GPUs are built from simpler clocking circuits, and b) overclocking errors on a GPU tend to show up as tolerable random pixel corruption, while the same errors on a CPU would probably crash the whole computer.
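That "multiples of 12 MHz" behaviour can be sketched as a rounding step: the driver snaps whatever clock you request to the nearest achievable multiple. The 12 MHz step comes from the 8600GT anecdote above; the requested values are just illustrative:

```python
# Sketch of step-quantized GPU clocks: the PLL can only produce
# multiples of some base step, so a requested clock gets rounded
# to the nearest achievable one.

def achievable_clock(requested_mhz: float, step_mhz: float = 12.0) -> float:
    """Round a requested core clock to the nearest multiple of the PLL step."""
    return round(requested_mhz / step_mhz) * step_mhz

for req in (540, 550, 560):
    print(req, "->", achievable_clock(req))
# 540 -> 540.0
# 550 -> 552.0
# 560 -> 564.0
```

So asking for 550 MHz on such a card actually lands you on 552 MHz, the nearest multiple of 12.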

The numbers are never exact because any clock is derived from an oscillating circuit that is only more or less stable; its frequency drifts a tiny bit with temperature and a bit with voltage spikes. So although the board may aim for 250, it may come out around 251.49, but it will always stay close to the desired value.
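For scale, the 251.49 vs. 250 example above works out to a fraction of a percent:

```python
# How far is an observed 251.49 MHz from a 250 MHz target, as a percentage?
target = 250.0
observed = 251.49
error_pct = (observed - target) / target * 100
print(f"{error_pct:.2f}%")  # 0.60%
```

So even a reading that looks "off" is within about 0.6% of what was asked for.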

...got a bit carried away explaining; hope it's what you were looking for.
post #4 of 4
Thread Starter 
Quote:
Originally Posted by dragosmp View Post
For GPUs it's roughly the same idea as for CPUs, but without the ancient FSB-style plumbing; there's no need for a chipset on a graphics card, since everything is integrated. I remember an older Nvidia card I had, the 8600GT: it could only be overclocked in multiples of 12 or 16 MHz, so it was a bit like changing a multiplier against a 12 MHz base clock. Of course 12 MHz isn't directly comparable to a 1333 MHz FSB, since the underlying technology is completely different, but the principle is roughly the same. GPUs are easier to overclock mainly because a) manufacturers allow it, and GPUs are built from simpler clocking circuits, and b) overclocking errors on a GPU tend to show up as tolerable random pixel corruption, while the same errors on a CPU would probably crash the whole computer.

The numbers are never exact because any clock is derived from an oscillating circuit that is only more or less stable; its frequency drifts a tiny bit with temperature and a bit with voltage spikes. So although the board may aim for 250, it may come out around 251.49, but it will always stay close to the desired value.

...got a bit carried away explaining; hope it's what you were looking for.
Exactly what I wanted, +rep