
[Review] Effect of CPU overclock on GPU performance

#1 ·
This is still a work in progress.

General

This has been a bit of a debate: how much does the CPU actually affect the scaling of our nice GPUs?
This is not about how SLI scales, but about how the CPU itself affects how the GPU runs.

Some say (and I am among them) that GPUs, especially in multi-GPU setups, are majorly affected by CPU scaling.

Meaning that running your CPU at stock speed vs. a high overclock will affect your FPS in a pretty major way.

This is also the reason some of us (myself included) frown when we see a GPU benchmark test where the CPU is barely overclocked, or even at stock.

So for all of us, here is a review, and maybe another way to see whether we are right or wrong.

Method

General:

So, what I did was spend the time running several DX9, DX10 and DX11 benchmarks of games that I own, plus "artificial" benchmarks like 3DMark and Heaven.

I wanted to test first with the 980X and later with a 920, but I don't currently have enough time; each set of benchmarks takes almost half a day to run several times for more accurate results.

So instead, I decided to use my everyday overclock of 4.3GHz as the baseline, and from there just reduce the multiplier by 2.

This gave me CPU "levels" ranging from a decent overclock down to a very low downclock, plus a no-HT test.

This way, I got 4.3GHz, 4GHz, 3.6GHz, 3GHz and 2.5GHz (the last one was also tested without HT).
Memory and Uncore speeds were left completely unchanged.
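
For reference, here is a minimal sketch of the clock arithmetic (assuming the 980X's stock 133MHz BCLK; the exact BCLK and multipliers used in this test aren't stated, so the numbers below are only illustrative):

Code:
# Minimal sketch: core clock = BCLK x multiplier.
# Assumes the 980X's stock 133 MHz BCLK; the actual BCLK and
# multipliers used in this test are not stated.
BCLK_MHZ = 133

for target_ghz in (4.3, 4.0, 3.6, 3.0, 2.5):
    mult = round(target_ghz * 1000 / BCLK_MHZ)
    print(f"target ~{target_ghz} GHz -> multiplier {mult} "
          f"({mult * BCLK_MHZ / 1000:.2f} GHz actual)")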

Here is a picture of the test bed:
[Image: Test bed]


About the GPUs:

I'm going to run all the benchmarks with the GPUs at a decent OC (core 920MHz, memory 4400MHz).
The reason I'm doing this (rather than going overboard with the OC or staying at stock) is to make the GPUs strong enough to be starved for CPU time, especially in multi-GPU setups, so the effect shows up in the data.
That way, if CPU speed is affecting the score, then the lower the CPU clock, the more the GPUs are held back and the lower the score will be.
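
To make that read-out concrete, here is a minimal sketch of how the numbers can be interpreted (the scores below are made up for illustration; this is just a reading aid, not part of the test procedure):

Code:
# Hypothetical FPS scores per CPU clock (GHz) for one benchmark run.
# If the score keeps climbing with CPU clock, the GPUs were CPU-starved.
scores = {2.5: 48.0, 3.0: 55.0, 3.6: 60.0, 4.0: 61.0, 4.3: 61.5}

clocks = sorted(scores)
for lo, hi in zip(clocks, clocks[1:]):
    gain = (scores[hi] - scores[lo]) / scores[lo] * 100
    print(f"{lo} -> {hi} GHz: {gain:+.1f}% FPS")
# Big gains at the low end that flatten out near the top suggest the CPU
# stops being the bottleneck around the flattening point.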

About resolution:

As my sig suggests, I own 3 monitors, but I'm not going to test anything in Surround.
I'm going to stay as close as possible to the "common" gamer setup, which is either 1080p or 1200p.
So I'm going to run all the benchmarks at 1920x1200, which is slightly above 1080p and will stress the GPUs enough to show a difference (if any).

About benchmarks settings:

I'm going to run texture and quality settings as high as I can without causing 1 FPS issues on a single GPU.
I'm not going to push AA to the highest; I will most likely settle between 2x and 8x, depending on the game.
The reason is that I plan to run the games on a single GPU as well, and I don't want a single-frame feast when a benchmark runs on one card; I also want the tests to be as similar as possible.
Also, I don't want to reach a point where GPU VRAM limits the benchmark and produces false results.
But I'm also not going to run without AA and hit 200 FPS every time.

The benchmarks

So, what am I going to test?
I wanted several games, from DX9 to DX11.
Sadly, I don't own that many games with built-in benchmarks, nor do I have the time to play each game, test with FRAPS and so on.
So I'm just using games with known benchmarks, plus some "artificial" benchmarks to test capability.

In the future I might add a few games if there are enough requests, and if I can get the game cheap.
The games already included cover several types of engines, so no worries there.

DX9 Games

Crysis 2:
  • Adrenaline benchmark
  • Extreme
  • 4xAA
  • Map: Default
Mafia 2:
  • Built in benchmark
  • Highest settings
  • AA on
  • Physx off
DX10 Games

Crysis Warhead:
  • FB benchmark
  • Enthusiast settings
  • 4xAA
  • Map: ambush - time of day 10
DX11 Games

Dirt 2:
  • Internal benchmark
  • Ultra settings
  • 8xMSAA
Metro 2033:
  • Built-in benchmark
  • Very high settings
  • 4xMSAA
  • Physx off, Tessellation on, DoF off.
Artificial benchmarks

3DMark 11: Performance run, GPU score only
3DMark Vantage: Performance run, GPU score only
Heaven 2.5: Tessellation Extreme, 2xAA, 16xAF.

Results

Note again that this is a work-in-progress thread. Benchmarks take time to run, so I will update it as I go; some tests are missing due to inconsistent results that I want to retest later.

And now, after all this talk and explaining, here are the test results (*drum roll*):

[Chart: 3DMark 11]
[Chart: 3DMark Vantage]
[Chart: Heaven 2.5]
[Chart: Crysis 2]
[Chart: Crysis Warhead]
[Chart: Dirt 2]
[Chart: Mafia 2]
[Chart: Metro 2033]

Conclusion

Well, as you can see, the results are pretty plain and simple:
the CPU does affect FPS. This is clear.

But

For a single-card rig, CPU speed has almost no effect in the tests done here.

And of course, this is majorly affected by the game engine.

In some games with quite new engines, like Metro 2033 or Crysis 2, FPS isn't affected much by CPU speed when it comes to dual-GPU.
Others are affected, but above 3.6GHz the difference is not very big.
As said, it's really game dependent. You can see that Crysis Warhead shows major differences at almost all CPU speeds. Same with Mafia 2, though not as big once you pass 3.6GHz.

Tri-SLI and quad-SLI show the biggest differences.
You can see up to a 20-30 FPS difference, which will greatly affect any type of Surround or large-monitor setup.
(Scaling between tri and quad is another matter; I plan a review on that later.)

So overall, the difference is noticeable.
And this is the reason why, when doing a review, especially a multi-GPU one, you should push the CPU as far as possible to make sure the GPUs are in no way starved for CPU time.
Even a 980X can easily starve a GPU, let alone a 920 or 965 with a mild overclock.

I'm not sure about the current top Sandy Bridge CPUs like the 2600K and 2500K, as both can be overclocked a lot higher even on air. But you will have to push them that far to keep them from starving your GPUs.

For those with two dual-GPU cards, the scaling effect of the CPU is massive.
Especially from the PCIe and from the CPU.

About HT off

What Hyper-Threading does is split each physical core into two logical threads, so the OS has the option to run more threads at the same time.
But it has its drawbacks.
Heavy single threads suffer, because each thread gets less of the core's power when it has to be shared.

This is why at 2.5GHz with HT off, we get 6 strong cores instead of 12 weak threads.
So the threads handling the drivers are stronger and faster at moving data to the GPUs than they would be as weaker logical threads.
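
As a toy model of that trade-off (the ~25% combined SMT throughput gain below is an assumed ballpark figure, not something measured in this test):

Code:
# Toy model only: assumes two HT threads sharing a core deliver ~1.25x
# the throughput of one thread owning the whole core (assumed ballpark).
CORES = 6
SMT_GAIN = 1.25

per_core_thread = 1.0           # HT off: each thread owns a full core
per_ht_thread = SMT_GAIN / 2    # HT on: two threads share each core

print(f"HT off: {CORES} threads at {per_core_thread:.3f}x core speed each")
print(f"HT on:  {CORES * 2} threads at {per_ht_thread:.3f}x core speed each")
# A single heavy thread (e.g. a driver thread feeding the GPUs) runs about
# 37% slower per thread with HT on, even though total throughput is higher.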

This result is fine.
What it says is that on weak or stock-clocked CPUs, it's better to run with HT off, as games do not actually benefit from HT.

I plan to add an HT-off run at 4.3GHz to see the 6 vs. 12 thread results there as well.

I know I did no tests at 2560x1600 or in Surround, but since the tests were about the effect of CPU overclock rather than GPU scaling, I don't plan to do any anytime soon.

I hope you enjoyed this little review.
If there are any errors, you are welcome to let me know.
If any tests remain, I will complete them in the next few days as I get more time on my hands.

Cheers.
 
#5 ·
Quote:
Originally Posted by pursuinginsanity;13268668
So.. why does Metro perform best at 2.5GHz with no HT? (even with 4 GPUs)

Your conclusion is tainted by this one anomalous result.
HT has a negative effect on some/most games?? I know I have read that more than once but can't confirm it myself.
Also, Metro is almost completely reliant on the GPU. Another thread shows CPU speeds from 2.5 to 4.0 with the same GPU and no increase. It wasn't a 980X though!!
 
#6 ·
Quote:
Originally Posted by pursuinginsanity;13268668
So.. why does Metro perform best at 2.5GHz with no HT? (even with 4 GPUs)

Your conclusion is tainted by this one anomalous result.
Not really.

HT "splits" the power of the cores into two.
So at 2.5, with 12 threads you have very little power per thread, so disabling HT gives you 6 beefy cores instead of 12 light headed threads.

Its normal, and the results are fine in that regard.
I will add that to the conclusions (I forgot).
 
#8 ·
Quote:
Originally Posted by pursuinginsanity;13268668
So.. why does Metro perform best at 2.5GHz with no HT? (even with 4 GPUs)

Your conclusion is tainted by this one anomalous result.
This is pretty interesting - is the effect of HT so bad that even increasing your speed by 2GHz will get you fewer FPS?

Other than that thanks a ton man, this is a good read.
 
#10 ·
Quote:
Originally Posted by OptimusCaik;13268787
So overclocking the CPU if one has a single GPU is completely redundant?
Especially if you run a cake CPU.
It's too awesome to be touched by the unholy hands of the overclocking god.

And yes, but it depends on the CPU. Extremely weak CPUs will not be able to keep up with the math done on the CPU before the data is sent to the GPU.
 
#11 ·
Quote:
Originally Posted by Defoler;13268522
In some games with quite new engines, like Metro 2033 or Crysis 2, FPS isn't affected much by CPU speed when it comes to dual-GPU.
A game engine's age has no effect on its CPU vs. GPU dependency; it's merely how it's coded. StarCraft 2 is newer than Metro and very CPU heavy.
Quote:
Originally Posted by Defoler;13268522
Tri-SLI and quad-SLI show the biggest differences.
You can see up to a 20-30 FPS difference, which will greatly affect any type of Surround or large-monitor setup.
When you increase monitor resolution (either with Surround or a bigger monitor) you increase GPU dependency and decrease CPU dependency. You can't conclude how CPU-bound a game will be at a higher resolution by looking at these results.
Quote:
Originally Posted by Defoler;13268522
Even a 980X can easily starve a GPU, let alone a 920 or 965 with a mild overclock.
This made little sense in that context; you basically first said that reviewers have to push their CPUs as far as possible, and then you said that even a mildly OC'd i7 920 is enough.
Quote:
Originally Posted by Defoler;13268522
For those with two dual-GPU cards, the scaling effect of the CPU is massive.
Especially from the PCIe and from the CPU.
I have no idea what you meant with this. Rephrase, please?
Quote:
Originally Posted by Defoler;13268522
I know I did no tests at 2560x1600 or in Surround, but since the tests were about the effect of CPU overclock rather than GPU scaling, I don't plan to do any anytime soon.
2560x1600 would be just as relevant as CPU scaling will change at higher resolutions. Your rig is perfect for testing how much CPU power is needed for tri- or quad-SLI at 2560x1600.

Quote:
Originally Posted by OptimusCaik;13268787
So overclocking the CPU if one has a single GPU is completely redundant?
That's not what he concluded. It depends on the speed of the CPU. A Phenom II X2 powering a GTX 580 will benefit HUGELY from overclocking.
 
#12 ·
Quote:
Originally Posted by pursuinginsanity;13268668
So.. why does Metro perform best at 2.5GHz with no HT? (even with 4 GPUs)

Your conclusion is tainted by this one anomalous result.
Not really ... I think it just shows Metro suffers a small perf hit with HT on.

Nice work Defoler. Funny thing is, aside from Crysis Warhead, these tests aren't really the most CPU-dependent apps out there. Differences would be even greater with tests of, say, BFBC2, Far Cry 2, and Supreme Commander ... all of which require a good amount of CPU work per frame rendered.

One thing that I think is worth noting here though is that none of the games appear to become 'unplayable' with the proc at 2.5GHz. Yes, the bottlenecking effect is there, it's definitely 'measurable' ... but it does not look like the kind of differences one would likely physically 'notice' while gaming w/o an FPS meter running.

Of course, an i7 at 2.5 is still a very powerful processor sitting on a very robust platform. If you were to throw on results with something like a q6600 at stock (2.4GHz) ... then you'd see some *serious* bottlenecking going on, esp. with tri and quad SLI.

And lastly ... not that we didn't already suspect this, but boy is quad-SLI ever overkill for gaming at 1920 resolution! The scaling vs 3 GPUs is practically non-existent at this res, outside of the toughest DX11 synthetics here (3DMark11 and Heaven). Not even Metro scales very well, and it's obviously not a very CPU-dependent title, based on our handy charts.
 
#13 ·
Quote:
Originally Posted by B!0HaZard;13268911

This made little sense in that context; you basically first said that reviewers have to push their CPUs as far as possible, and then you said that even a mildly OC'd i7 920 is enough.
You read it wrong.
It states that if the 980X can starve the GPUs, the 920 and 965 will do so even more.
Quote:
Originally Posted by B!0HaZard;13268911
2560x1600 would be just as relevant as CPU scaling will change at higher resolutions. Your rig is perfect for testing how much CPU power is needed for tri- or quad-SLI at 2560x1600.
You are welcome to provide me with a 2560x1600 monitor and I will do a review with that too.
Surround is not equal to a single monitor, and this review is not about GPU power but about how the CPU affects it; not GPU scaling or how well GPUs handle high resolutions.
You missed the whole point.
Quote:
Originally Posted by brettjv;13268940
Nice work Defoler. Funny thing is, aside from Crysis Warhead, these tests aren't really the most CPU-dependent apps out there. Differences would be even greater with tests of, say, BFBC2, Far Cry 2, and Supreme Commander ... all of which require a good amount of CPU work per frame rendered.
Maybe that is what you missed.

I don't want to see GPU scaling or game scaling.
I want to see the effect of the CPU on GPU performance.
Not game vs. CPU, or CPU vs. GPU, or resolution scaling.

I just wanted to answer one simple question, and I stated that several times:
Does CPU overclock affect GPU performance?

I added Crysis Warhead more as a check, and I needed a DX10 game as well.
And of course, I need games I own. And FC2 or BFBC2 aren't that great for a test like this.
 
#14 ·
My only real problem with this review is the lack of minimum framerates. The averages might be close while the minimum framerates vary hugely. Imagine that the average at 2.5 GHz is 55 FPS and the average at 4 GHz is 60 FPS. These numbers seem very close, but minimum framerates might be 20 FPS for 2.5 GHz and 40 FPS for 4 GHz if they're only that low for a short time. A CPU bottleneck will often happen at certain points in the benchmark (e.g. if there's a lot of physics calculations after an explosion) and even if it's a HUGE bottleneck at times, we won't see much difference in average framerates.
Quote:
Originally Posted by Defoler;13269041
You read it wrong.
It states that if the 980X can starve the GPUs, the 920 and 965 will do so even more.
Yeah, seems I did.
Quote:
Originally Posted by Defoler;13269041
You are welcome to provide me with a 2560x1600 monitor and I will do a review with that too.
Surround is not equal to a single monitor, and this review is not about GPU power but about how the CPU affects it; not GPU scaling or how well GPUs handle high resolutions.
You missed the whole point.
Surround is exactly like a single monitor in that to the GPU, it's just a single monitor with a very high and very wide resolution. I assumed you had a 2560x1600 monitor since you mentioned it, but Surround would be just as good for testing.

I didn't say it was about GPU power. CPU dependency will vary at different resolutions, so at higher resolutions (e.g. while using Surround), you'll see less CPU bottleneck. Therefore you'll see that the CPU affects the FPS less at high resolutions.

I didn't miss the whole point, I actually think you don't know what's causing these results. I don't think you understand what variables there are and what affects performance and how it affects it.
Quote:
Originally Posted by Defoler;13269110
Maybe that is what you missed.

I don't want to see GPU scaling or game scaling.
I want to see the effect of the CPU on GPU performance.
Not game vs. CPU, or CPU vs. GPU, or resolution scaling.

I just wanted to answer one simple question, and I stated that several times:
Does CPU overclock affect GPU performance?

I added Crysis Warhead more as a check, and I needed a DX10 game as well.
And of course, I need games I own. And FC2 or BFBC2 aren't that great for a test like this.
Um, the games he mentioned are MORE CPU dependent. They'd be affected MORE by CPU speed and they're therefore JUST AS relevant for this review as your GPU dependent games.

To answer your question: CPU overclock does affect GPU performance to some degree, especially in CPU-dependent games like Far Cry 2 and BC2 and at low resolutions.
 
#15 ·
Quote:
Originally Posted by B!0HaZard;13269124
My only real problem with this review is the lack of minimum framerates. The averages might be close while the minimum framerates vary hugely. Imagine that the average at 2.5 GHz is 55 FPS and the average at 4 GHz is 60 FPS. These numbers seem very close, but minimum framerates might be 20 FPS for 2.5 GHz and 40 FPS for 4 GHz if they're only that low for a short time.
I can guarantee you that minimum FPS is not affected by the CPU.

I actually got a 14.58 FPS minimum in Metro 2033 at 2.5GHz with HT off, and 14.07 at 4.3GHz.
Dirt 2, for example, shows a similar reduction in FPS across the board, usually 30 FPS below the average score.
So this is not the issue.

Quote:
Originally Posted by B!0HaZard;13269124
Surround is exactly like a single monitor in that to the GPU, it's just a single monitor with a very high and very wide resolution. I assumed you had a 2560x1600 monitor since you mentioned it, but Surround would be just as good for testing.

I didn't say it was about GPU power. CPU dependency will vary at different resolutions, so at higher resolutions (e.g. while using Surround), you'll see less CPU bottleneck. Therefore you'll see that the CPU affects the FPS less at high resolutions.
I disagree with you completely on everything you wrote.
Surround is not a single monitor, and it's not what I was going for at all.

Quote:
Originally Posted by B!0HaZard;13269124
Um, the games he mentioned are MORE CPU dependent. They'd be affected MORE by CPU speed and they're therefore JUST AS relevant for this review as your GPU dependent games.

To answer your question: CPU overclock does affect GPU performance to some degree, especially in CPU-dependent games like Far Cry 2 and BC2 and at low resolutions.
Again, I disagree completely.
A high-CPU-usage game like FC2, for example, will show me false results.
If you increase the CPU speed, how do you know whether the GPU performs better or the game engine performs better?
That would make the whole review useless.

Don't turn this review into something you want, rather than what it is.
It seems you did not understand the review at all.
 
#16 ·
Quote:
Originally Posted by Defoler;13269110
Maybe that is what you missed.

I don't want to see GPU scaling or Game scaling.
I want to see the CPU on GPU performance.
Not Game vs CPU or CPU vs GPU or resolution scaling.

I just wanted to answer one simple question, and I stated that several times:
Does CPU overclock affect the GPUs performance?

I added Crysis warhead as more of a check. And I needed a game with DX10 as well
tongue.gif

And of course, I need a game I have. And FC2 or BFBC2 aren't that great for a test like that.
I didn't 'miss' anything DF. I fully understand 'the point' of the article/experiment.


All I'm telling you is that had you chosen to include results from some of the more CPU-dependent titles in this review, such as BFBC2 or Far Cry 2, you would have discovered that the CPU bottlenecking effect can be even more extreme than what is illustrated by the particular suite of games/benches that you did choose to test ... the large majority of which do NOT have a particularly high CPU demand per frame rendered ... aside from Warhead.

I'm not being critical, and I understand needing to work with what you have available. I'm just saying ... by and large these are actually more the 'best-case' scenarios as opposed to 'worst-case' scenarios in terms of the extent of the bottlenecking effect.
 
#17 ·
Quote:
Originally Posted by brettjv;13269262
I didn't 'miss' anything DF. I fully understand 'the point' of the article/experiment.

All I'm telling you is that had you chosen to include results from some of the more CPU-dependent titles in this review, such as BFBC2 or Far Cry 2, you would have discovered that the CPU bottlenecking effect can be even more extreme than what is illustrated by the particular suite of games/benches that you did choose to test ... the large majority of which do NOT have a particularly high CPU demand per frame rendered ... aside from Warhead.
Understood.


But that is what I was going for.
I wanted to make sure that, with the games I can test, the game engine would not affect performance much (except Warhead).

I didn't want to wonder whether an increase in FPS came from the game running better or from the GPUs having more room to breathe.

This is why the single card is mainly included. Games that scale very little on a single card or in 2-way SLI will show the real effect of the CPU on the GPU/drivers.
It's also the reason the FPS results are pretty high:
I didn't want to make the GPUs the bottleneck.

It's like taking 4 kids to a playground and seeing how they react and play in a 1-square-foot, a 10-square-foot or a 100-square-foot playground.
Not strapping a 40kg rock to each kid and seeing how they play in a 100-square-foot playground.
Quote:
Originally Posted by B!0HaZard;13269282
Are you ******ed?
No, but I think you have unresolved parental issues.

If a person disagrees with you and that makes him "retarded", then you are what you call others.

If you have a problem, you are welcome to keep it to yourself. Keep the thread clean.
Thanks.
 
#18 ·
Quote:
Originally Posted by Defoler;13269355
No, but I think you have unresolved parental issues.

If a person disagrees with you and that makes him "retarded", then you are what you call others.

If you have a problem, you are welcome to keep it to yourself. Keep the thread clean.
Thanks.
You just said that minimal FPS isn't affected by CPU. But aren't you CPU bottlenecked in most of the tri- and quad-SLI tests? So you're saying that because you're CPU bottlenecked, only your GPUs determine performance? The CPU isn't limiting the GPUs and isn't causing low minimum framerates?

The Metro 2033 benchmark is known to not give reliable minimum framerates. It's a faulty benchmark.

So first you said that the CPU doesn't affect minimum framerates and then you say it does in Dirt 2? If the FPS is constantly 30 FPS below average then it's logically not the same at different clock speeds. With an avg FPS of 60, you'd get 30 minimum and with an avg of 100, you'd get 70 minimum. You just said that it DOES in fact depend on the CPU.

You just said that Surround pixels =/= single monitor pixels. Whether or not you're using multiple displays isn't relevant to the test. All the GPUs are seeing is a single monitor, albeit at 5760x1080. The GPUs don't know that the pixels are divided between 3 displays. A pixel is a pixel. Surround at 5760x1080 puts a higher load on the GPUs than a single monitor at 2560x1600, therefore being less dependent on the CPU, and therefore being relevant for the test because the CPU will need to be OC'd less to achieve the max possible FPS with the GPUs in question.

You just said that using FC2 (a game) would give you a false result not related to what you're doing (testing performance in games) because it uses a different engine than the other games (which all have different engines). Besides that, you said that including more results would ruin everything (which it would not as the rest of the results are still there).

I apologize for my earlier post, but I've tried to reason with you and you obviously don't know how game performance works, how CPU and GPU work together or how to read your own results.
 
#19 ·
Quote:
Originally Posted by B!0HaZard;13269665
You just said that minimal FPS isn't affected by CPU. But aren't you CPU bottlenecked in most of the tri- and quad-SLI tests? So you're saying that because you're CPU bottlenecked, only your GPUs determine performance? The CPU isn't limiting the GPUs and isn't causing low minimum framerates?
That is a completely jumbled up question.
Make it more understandable if you want an answer.
Quote:
Originally Posted by B!0HaZard;13269665
The Metro 2033 benchmark is known to not give reliable minimum framerates. It's a faulty benchmark.
You are welcome to prove that.

Quote:
Originally Posted by B!0HaZard;13269665
So first you said that the CPU doesn't affect minimum framerates and then you say it does in Dirt 2? If the FPS is constantly 30 FPS below average then it's logically not the same at different clock speeds. With an avg FPS of 60, you'd get 30 minimum and with an avg of 100, you'd get 70 minimum. You just said that it DOES in fact depend on the CPU.
Again, you do not understand.
If the relation between average and minimum FPS stays the same, the CPU bottleneck is not specifically hurting the lows; it affects the general FPS of the game.
You stated that the minimum is what proves or disproves it, which is not true.

I like how you keep jumping between different ideas in the same line of thought without keeping it straight.
Quote:
Originally Posted by B!0HaZard;13269665
You just said that Surround pixels =/= single monitor pixels. Whether or not you're using multiple displays isn't relevant to the test. All the GPUs are seeing is a single monitor, albeit at 5760x1080. The GPUs don't know that the pixels are divided between 3 displays. A pixel is a pixel. Surround at 5760x1080 puts a higher load on the GPUs than a single monitor at 2560x1600, therefore being less dependent on the CPU, and therefore being relevant for the test because the CPU will need to be OC'd less to achieve the max possible FPS with the GPUs in question.
I will say it again.
Surround is not a single monitor.
Drivers work differently. Yes, the cards see just one monitor when rendering the image, but the drivers don't, and it affects the results. If you don't understand that, you don't understand Surround at all.
Also, Surround is almost 50% more pixels than a single 2560x1600 monitor (5760x1080 is about 6.2 million pixels vs. 4.1 million). So how can you even compare?

The stress on the GPUs would make them the bottleneck instead of the CPU.
It's like putting a 50kg brick on a kid and seeing how he plays. Completely useless for this review.
The GPUs being dependent on the CPU is EXACTLY what this review is about.
Not something you are trying to throw into it. You're completely missing the point.
Quote:
Originally Posted by B!0HaZard;13269665
You just said that using FC2 (a game) would give you a false result not related to what you're doing (testing performance in games) because it uses a different engine than the other games (which all have different engines). Besides that, you said that including more results would ruin everything (which it would not as the rest of the results are still there).
This is not what I said.
I don't know what you read, but clearly it's from another dimension.

Quote:
Originally Posted by B!0HaZard;13269665
I apologize for my earlier post, but I've tried to reason with you and you obviously don't know how game performance works, how CPU and GPU work together or how to read your own results.
I don't really accept your apology.
Calling people out like that is in bad taste; it's stupid and shows very little character.
And from all that you wrote, you clearly have more to learn about CPUs and GPUs than I do. So I apologize if I don't take you as a teacher of such things.
 
#20 ·
Quote:
Originally Posted by Defoler;13269881
That is a completely jumbled up question.
Make it more understandable if you want an answer.
I honestly don't care, they aren't important.
Quote:
Originally Posted by Defoler;13269881
You are welcome to prove that.
I can possibly do that tomorrow. I'd recommend that you go here and look at the graphs. You'll notice that they have a lot of very sudden, very short dips. These are stutters that seem to give very low minimum framerate despite the fact that the "actual" minimum framerate (e.g. the lowest framerate caused by CPU/GPU bottlenecks) is pretty high.
Quote:
Originally Posted by Defoler;13269881
Again, you do not understand.
If the relation between average and minimum FPS stays the same, the CPU bottleneck is not specifically hurting the lows; it affects the general FPS of the game.
You stated that the minimum is what proves or disproves it, which is not true.
But there's no such thing as a "general FPS". The CPU bottleneck gives you low minimums as well as lower averages, but the minimums may vary far more than the averages as can be seen in my "Will your aging dual core bottleneck your graphics card?"-review.
Quote:
Originally Posted by Defoler;13269881
I like how you keep jumping between different ideas in the same line of thought without keeping it straight.
There are many variables in these things. I'm actually addressing the issues one at a time, but your confusion doesn't surprise me.
Quote:
Originally Posted by Defoler;13269881
I will say it again.
Surround is not a single monitor.
Drivers work differently. Yes, the cards see just one monitor when rendering the image, but the drivers don't, and it affects the results. If you don't understand that, you don't understand Surround at all.
Also, Surround is almost 50% more pixels than a single 2560x1600 monitor (5760x1080 is about 6.2 million pixels vs. 4.1 million). So how can you even compare?

The stress on the GPUs would make them the bottleneck instead of the CPU.
It's like putting a 50kg brick on a kid and seeing how he plays. Completely useless for this review.
The GPUs being dependent on the CPU is EXACTLY what this review is about.
Not something you are trying to throw into it. You're completely missing the point.
I didn't say Surround is a single monitor.
Drivers should make little difference in performance, it's still just pixels. There can be problems in some programs though.
I said that 5760x1080 is a higher resolution than 2560x1600 and that it's therefore even better for what I'm trying to show you, yet you seem to have skipped that part.
Quote:
Originally Posted by Defoler;13269881
This is not what I said.
I don't know what you read, but clearly it's from another dimension.
You're welcome to prove that.
Quote:
Originally Posted by Defoler;13269881
I don't really accept your apology.
Calling people out like that is in bad taste; it's stupid and shows very little character.
Agreed, but I got mad.
Quote:
Originally Posted by Defoler;13269881
And from all that you wrote, you clearly have more to learn about CPUs and GPUs than I do. So I apologize if I don't take you as a teacher of such things.
Are you a troll?
 
#21 ·
Quote:
Originally Posted by Defoler;13269335
Understood.

But that is what I was going for.
I wanted to make sure that, with the games I can test, the game engine would not affect performance much (except Warhead).

I didn't want to wonder whether an increase in FPS came from the game running better or from the GPUs having more room to breathe.

This is why the single card is mainly included. Games that scale very little on a single card or in 2-way SLI will show the real effect of the CPU on the GPU/drivers.
It's also the reason the FPS results are pretty high:
I didn't want to make the GPUs the bottleneck.
I'm not really getting where you're coming from on all this, I'm afraid.

Esp. not the part about making the GPUs not the bottleneck. If that's what you truly wanted, you should've run everything with settings all on Low at 800x600.

But that's okay ... as long as you get what I'm saying, cool.


I don't really wanna get involved (for once) in this little debate between you two, but I will say this: I'm almost positive that the way the Metro bench calculates min and max FPS is based on individual frames, not on entire seconds' worth of frames.

IOW, the tool looks at the frametimes and calculates a MIN fps based on THE SINGLE FRAME that took the longest to render over the course of the test.

Whether or not this makes it more or less 'reliable' I suppose is a point that could be debated ... it's certainly different from how most benchmarks operate ... but like I say ... I'm nearly positive that's what the bench is doing.

Similarly, it calculates a MIN and MAX fps for each individual second based on the frametimes captured during the course of each second ... that's what all the extra lines are on the graph extending above and below the main line that's charting average FPS per second.
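
For the curious, here is a minimal sketch of that per-frame calculation versus a plain average (the frametimes are made up for illustration; this is not the Metro tool's actual code):

Code:
# Hypothetical frametimes in milliseconds, with one 60 ms stutter frame.
frametimes_ms = [16.0, 17.5, 15.8, 60.0, 16.2, 16.9, 18.1, 16.4]

# Per-frame method (what the Metro bench appears to do): min FPS comes
# from the single slowest frame of the entire run.
min_fps_single_frame = 1000.0 / max(frametimes_ms)

# Average over the same span, which dilutes that one stutter.
avg_fps = len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)

print(f"min FPS from slowest single frame: {min_fps_single_frame:.1f}")  # ~16.7
print(f"average FPS over the span: {avg_fps:.1f}")                       # ~45.2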
 
#22 ·
Quote:
Originally Posted by B!0HaZard

I didn't say Surround is a single monitor.
Drivers should make little difference in performance, it's still just pixels. There can be problems in some programs though.
I said that 5760x1080 is a higher resolution than 2560x1600 and that it's therefore even better for what I'm trying to show you, yet you seem to have skipped that part.

No, you actually said that Surround and a single monitor are exactly the same, and now you are trying to cover yourself by saying you didn't.
Odd how it goes, isn't it?

Quote:
Originally Posted by B!0HaZard

Are you a troll?

Are you asking yourself that question?
You seem more interested in writing stuff than in listening.
 
#26 ·
Very nice thread! I thought Crysis (1) was all about GPUs. Now that I see the CPU makes a huge difference, I'll run it at my 3.5GHz (turbo) instead of my stock 2.8GHz (turbo).
Added rep.
 