Hello everyone,
I was curious (read: bored) about the effect of overclocking the GPU and CPU in a benchmarking application. This is my first time performing a test like this, so please bear with me if I have made any mistakes or skipped over some important aspect.
There are many applications out there, and each one stresses the CPU and GPU differently, but I decided to use Unigine Heaven as the benchmark.
I tested three clocks on my Intel i5 2500K CPU: 3.3, 4.0, and 4.5 GHz. For the GPU I tested just one overclock, since I am currently in a very hot room and don't want to push my two PNY GTX 560 Ti cards (in SLI) to extreme levels: from stock settings of 822/1645/2004 MHz (core/shader/memory) to 900/1800/2103 MHz. That works out to roughly a 9.5% core, 9.4% shader, and 4.9% memory increase.
I only ran the test once with each setting, so it is not a very scientific method, but looking at the results it doesn't seem to have mattered much.
Here are the settings for the test:
And here are the results in all six combinations of my CPU and GPU overclocks:
Even though I knew that Unigine Heaven is a very GPU-intensive benchmark, I thought a higher CPU clock might help scrape out a few extra FPS.
It turns out that the effect of the CPU OC is essentially zero. There are some minor fluctuations, but as I mentioned before, I only ran each test once; if I had run each test 3 or 5 times, the averaged results would probably have been almost identical across every CPU setting.
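As a quick improvement on the one-run-per-setting method, here is a minimal sketch (in Python, with placeholder numbers rather than my actual scores) of how the FPS from several Heaven runs per configuration could be logged and checked, to see whether the differences between CPU clocks are bigger than the run-to-run noise:

from statistics import mean, stdev

# FPS recorded by hand from repeated Unigine Heaven runs; the values
# below are made-up placeholders, not real results.
runs = {
    "3.3 GHz CPU / stock GPU": [51.2, 50.8, 51.5],
    "4.5 GHz CPU / stock GPU": [51.0, 51.4, 50.9],
}

for config, fps in runs.items():
    print(f"{config}: mean {mean(fps):.1f} FPS, stdev {stdev(fps):.2f}")

# If two configurations' means differ by less than a couple of standard
# deviations, the difference is probably noise rather than a real OC gain.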
On the other hand, overclocking the GPU gives an increase of 6-7 FPS on average, which is pretty sweet considering the temperatures were only slightly higher than at stock settings.
In conclusion, this mini test supports the theory that as long as your CPU is not bottlenecking the GPUs, there will be no substantial change in performance in very GPU-intensive applications such as Unigine Heaven.
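To make the bottleneck idea concrete, here is a toy model (my own simplification, not anything measured or published by Unigine): each frame takes as long as the slower of the CPU's and the GPU's work for that frame, so FPS only responds to whichever component is currently the limit.

def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    # Frame time is set by whichever component takes longer per frame.
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# GPU-bound case, like Heaven: GPU needs 20 ms per frame, CPU only 8 ms.
print(fps(8.0, 20.0))   # 50.0 FPS
# A ~35% CPU overclock (8 ms -> 5.9 ms) changes nothing while GPU-bound:
print(fps(5.9, 20.0))   # still 50.0 FPS
# A ~10% GPU overclock (20 ms -> 18.2 ms) is what actually moves FPS:
print(fps(8.0, 18.2))   # ~54.9 FPS

All the millisecond numbers are made up for illustration, but they show why the CPU clock only starts to matter once the CPU's per-frame work becomes the longer of the two.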
Keep in mind that this doesn't mean other applications/games such as Bad Company 2 or Metro 2033 won't benefit from a higher CPU clock.
Please comment, and if you have any suggestions on how to improve an amateur test like this, or want me to look into something I haven't covered or analyzed here, I would be more than glad to do it!
Cheers!