
which games are generally GPU intensive and which are CPU intensive?

61K views · 23 replies · 15 participants · last post by imran27
#1 ·
It's common knowledge that some games are especially CPU intensive (for example GTA IV or MS Flight Simulator X) and some are more GPU intensive (for example Metro 2033 or Dirt 3). What makes a game CPU intensive? Is it mostly NPCs and complex physics (GTA IV being a good example of both), which are mainly processed by the CPU? Which operations exactly are processed by the GPU?
 
#3 ·
Most games are GPU dependent at the moment.
The main exceptions that lean on the CPU: Starcraft 2, World of Warcraft, Civ 5, Battlefield 3, GTA 4, flight simulators, Bad Company 2, and basically most modern RPG, RTS, and free-roaming games.
 
#4 ·
OK, but the reality at the moment is that much of the physics computation, for example, is performed by the CPU, even though some modern graphics cards could support it thanks to PhysX (a physics SDK) and the GPGPU capabilities of modern GPUs. In addition, PhysX currently seems to be supported only by NVIDIA GPUs, which leaves a system with only AMD GPUs two options: turn off the game engine's support for GPU physics, or have a very fast CPU.
 
#5 ·
Quote:
Originally Posted by hazarada View Post

Generally games made by lazy devs and games with DX9 support are more CPU intensive. Nowadays GPU shaders are so programmable that nearly all number crunching can be offloaded to them with speed benefits; when the game industry will catch up is another question.
It's not laziness; not all number crunching is appropriate for GPUs.
 
#7 ·
A large part of the equation when it comes to this subject depends on the settings you run the game at and what your hardware actually is. Any game can be made to be 'CPU dependent' by lowering settings enough that the FPS gets high enough that the CPU has a hard time keeping up. Obviously a lower-powered CPU coupled with a top-end SLI setup will make more games seem CPU dependent, since FPS will be higher.

Conversely, cranking up your settings and running at really large resolutions with an underpowered GPU and a crackin' great CPU will make more games appear GPU dependent.

All due respect to Aslan, but the idea that Source games are generally CPU dependent is not accurate IMHO. Four-year-old dual cores can run most Source games at 200+ FPS. The only reason they might seem CPU dependent is because a great CPU will run them at 400 FPS instead of 200 FPS. That's not what I call CPU dependent. When a good CPU can only keep you at 30 FPS and a great CPU gets you to 60 FPS, THAT is a CPU-dependent game. Crysis and GTA IV are about the most extreme I can think of, which is funny, because when Crysis came out it was the most GPU-dependent game out there.
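
To put rough numbers on it (purely hypothetical figures, just to show the shape of the problem): whichever stage is slower sets your frame time, and shrinking the GPU's share of the work is what exposes the CPU.

Code:
#include <algorithm>
#include <cstdio>
#include <initializer_list>

int main() {
    // Hypothetical per-frame costs in milliseconds (made-up numbers).
    const double cpu_ms = 8.0;        // game logic, draw-call submission, etc.
    const double gpu_ms_full = 20.0;  // rendering cost at full resolution/settings

    // GPU cost scales roughly with pixel count and settings; CPU cost mostly doesn't.
    for (double scale : {1.0, 0.5, 0.25}) {
        double gpu_ms = gpu_ms_full * scale;
        double frame_ms = std::max(cpu_ms, gpu_ms);  // the slower stage sets the frame time
        std::printf("gpu=%4.1f ms  cpu=%4.1f ms  ->  %3.0f FPS (%s-bound)\n",
                    gpu_ms, cpu_ms, 1000.0 / frame_ms,
                    gpu_ms > cpu_ms ? "GPU" : "CPU");
    }
    return 0;
}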

Examples like this are part of why I say it depends a lot on your gear.
 
#8 ·
Quote:
Originally Posted by XAslanX View Post

Agreed, most Source games are CPU intensive.
Yup. Source games and Alan Wake (is that Source?) will have my 460 at ~65-75%. Dead Space, too.

BF3, same CPU clocks: 80-99%.
 
#9 ·
Quote:
Originally Posted by hazarada View Post

Generally games made by lazy devs and games with DX9 support are more CPU intensive. Nowadays GPU shaders are so programmable that nearly all number crunching can be offloaded to them with speed benefits; when the game industry will catch up is another question.
The games that are CPU intensive (with the exception of Crysis 1 and a few others) are games that have been sloppily ported to PC from console.

Quote:
Originally Posted by SectorNine50 View Post

It's not laziness; not all number crunching is appropriate for GPUs.
No. The GPU can compute data MUCH faster than the CPU.
 
#10 ·
Quote:
Originally Posted by Xaero330 View Post

The games that are CPU intensive (with the exception of Crysis 1 and a few others) are games that have been sloppily ported to PC from console.
No. The GPU can compute data MUCH faster than the CPU.
What the GPU can calculate is the difference.

Otherwise all of our CPUs would be GPUs. Think about it.
 
#12 ·
Quote:
Originally Posted by SectorNine50 View Post

It's not laziness; not all number crunching is appropriate for GPUs.
Like what? Other than fetching files and barking orders at the GPU, what serial computation do you encounter in games? Traditionally, bone matrices and hierarchies have been difficult to fit on the GPU efficiently since the DX9 pipeline starts further down the road, but with SM4+ this is no longer a problem. Physics has been moving towards GPUs for a while now, but tbh I don't understand why they use crap tech like PhysX or Havok when the fully programmable shader is right friggin' there. AI, again, used to be difficult to implement on GPUs due to the lack of scatter/gather-type memory access, but with the advent of SM4 that's no longer a problem.

Six years, guys; that's how long the DX10/11-style shader pipeline has been available, and they are still making games for DX9, with DX11 maybe as a little fluff on top, even though the core features of the two are radically different and allow for much more.
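
For what it's worth, the kind of number crunching being talked about looks like this; a minimal C++ sketch (plain CPU code, not an actual compute shader, and the numbers are made up) where every particle is updated independently, which is exactly why it maps so well onto thousands of shader cores.

Code:
#include <cstdio>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// Data-parallel update: each particle depends only on its own state, so every
// iteration could run at the same time -- one GPU thread per particle.
void integrate(std::vector<Particle>& particles, float dt) {
    const float gravity = -9.81f;
    for (auto& p : particles) {
        p.vz += gravity * dt;
        p.x  += p.vx * dt;
        p.y  += p.vy * dt;
        p.z  += p.vz * dt;
    }
}

int main() {
    std::vector<Particle> particles(100000, Particle{0, 0, 10, 1, 0, 0});
    for (int frame = 0; frame < 60; ++frame)
        integrate(particles, 1.0f / 60.0f);   // one second of simulation at 60 steps
    std::printf("particle 0 after 1 s: x=%.2f z=%.2f\n", particles[0].x, particles[0].z);
    return 0;
}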
 
#13 ·
Quote:
Originally Posted by hazarada View Post

Like what? Other than fetching files and barking orders at the GPU, what serial computation do you encounter in games? Traditionally, bone matrices and hierarchies have been difficult to fit on the GPU efficiently since the DX9 pipeline starts further down the road, but with SM4+ this is no longer a problem. Physics has been moving towards GPUs for a while now, but tbh I don't understand why they use crap tech like PhysX or Havok when the fully programmable shader is right friggin' there. AI, again, used to be difficult to implement on GPUs due to the lack of scatter/gather-type memory access, but with the advent of SM4 that's no longer a problem.
Six years, guys; that's how long the DX10/11-style shader pipeline has been available, and they are still making games for DX9, with DX11 maybe as a little fluff on top, even though the core features of the two are radically different and allow for much more.
The only time a GPU is more effective at math than a CPU is when the work is extremely parallel.

Many GPU-accelerated operations are effective because massive amounts of data need to be calculated that don't rely upon another thread's "answer."

Physics would make sense on a GPU because each particle has its own path that needs to be calculated. AI is hit and miss: if you have lots of AI objects that don't rely upon each other, a GPU is appropriate; however, if you have a single string of logic that needs to be executed, then a CPU is going to be better.

Plus, think about all the background processes required to bring all of this together. CPU/GPU usage really does vary greatly with the application.
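
Here's a contrived sketch of that "single string of logic" case (hypothetical code, just to show the dependency): each step needs the previous step's answer, so thousands of slow GPU threads don't help; you want one fast core.

Code:
#include <cstdio>

// Serially dependent chain: every step needs the previous step's answer, so the
// work cannot be spread across thousands of GPU threads -- one fast core wins.
int main() {
    unsigned int state = 12345u;                  // hypothetical AI / game-logic state
    for (int step = 0; step < 1000; ++step)
        state = state * 1664525u + 1013904223u;   // next state depends on the current one
    std::printf("final state: %u\n", state);
    return 0;
}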
 
#14 ·
Quote:
Originally Posted by SectorNine50 View Post

The only time a GPU is more effective at math than a CPU is when the work is extremely parallel.
As of DX11, with multiple UAVs available, they don't even need to be parallel anymore; one shader core could be calculating AI while another does global illumination or whatever. And it's not like you only have one AI instance in a game at any given time, or one of anything; they all come in bulk these days.
 
#15 ·
Quote:
Originally Posted by hazarada View Post

As of DX11, with multiple UAVs available, they don't even need to be parallel anymore; one shader core could be calculating AI while another does global illumination or whatever. And it's not like you only have one AI instance in a game at any given time, or one of anything; they all come in bulk these days.
I understand what you can do, but it's about what is faster for a given scenario.

That single thread calculating AI on a GPU would run much, much slower than a single thread calculating AI on a CPU.

Like I said, GPUs are faster at calculating massively parallel data (because of the number of processors), while CPUs are better for linear work (because of how fast each core is).

It varies hugely with the application. You can't assume that all calculations pertaining to AI or physics should be done on the GPU.

Then you also have to do some balancing. How much GPU usage do you really want to assign to non-graphical tasks? Nowadays, CPUs are so powerful you'd be silly not to use them in coordination with the GPU.
 
#17 ·
Quote:
Originally Posted by SectorNine50 View Post

How much GPU usage do you really want to assign to non-graphical tasks? Nowadays, CPUs are so powerful you'd be silly not to use them in coordination with the GPU.
One could argue that since the main function of a game engine is to deliver a picture of the game world to the screen, all of it is a graphical task.

I don't think it's fair to look at GPUs as just something to calculate graphics-oriented things with; over the last half a decade they have gone far beyond that. Anything that can be efficiently broken into independent chunks is well suited to the GPU. GPUs might be slow on long chains of calculations, but as far as games are concerned it doesn't matter how slow it is as long as it's done in 16 ms (60 FPS); if there is a task the GPU just can't finish in that time, then it's time to hand it over to the CPU. Also, let's not forget that GPUs currently outmatch CPUs in raw float performance about 20 to 1. IMO the GPU should serve as the main computational resource, with niche tasks like time-sensitive, difficult threads left to the CPU.
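
The 16 ms figure is just 1000 ms / 60 frames ≈ 16.7 ms. Here's a toy sketch of the "hand it to the CPU if the GPU can't fit it in the frame" idea, with made-up task costs rather than anything from a real engine.

Code:
#include <cstdio>

struct Task { const char* name; double gpu_ms; };   // estimated GPU cost per frame

int main() {
    const double frame_budget_ms = 1000.0 / 60.0;   // ~16.7 ms per frame at 60 FPS
    double gpu_used_ms = 10.0;                      // say rendering already takes 10 ms

    // Hypothetical non-graphics tasks with made-up cost estimates.
    const Task tasks[] = { {"particle physics", 2.0}, {"pathfinding batch", 9.0} };

    for (const Task& t : tasks) {
        // Offload to the GPU only while it still fits inside the frame budget.
        if (gpu_used_ms + t.gpu_ms <= frame_budget_ms) {
            gpu_used_ms += t.gpu_ms;
            std::printf("%s -> GPU (%.1f of %.1f ms used)\n", t.name, gpu_used_ms, frame_budget_ms);
        } else {
            std::printf("%s -> CPU\n", t.name);
        }
    }
    return 0;
}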
 
#18 ·
I have noticed that the CPU-intensive games mentioned (Starcraft II and RPGs) have great framerates with just a lowly Core i3. I am still using a G92 (9800 GT) for graphics.
 
#19 ·
I know I'm a bit late, but I wouldn't call GTA IV CPU intensive. I've got an AMD Phenom II X4 @ 4 GHz, and GTA IV runs at about 25 FPS (15 when recording) with my AMD Radeon 6670, whereas on my 7870 I'm getting 50-60 FPS, and 40 recording. Dirt 3 is more likely CPU intensive, since it runs at a constant 30 FPS whether or not I'm recording. (Haven't tried Dirt 3 with my 7870 yet.)
 
#20 ·
Quote:
Originally Posted by guitar42697 View Post

I know I'm a bit late, but I wouldn't call GTA IV CPU intensive. I've got an AMD Phenom II X4 @ 4 GHz, and GTA IV runs at about 25 FPS (15 when recording) with my AMD Radeon 6670, whereas on my 7870 I'm getting 50-60 FPS, and 40 recording. Dirt 3 is more likely CPU intensive, since it runs at a constant 30 FPS whether or not I'm recording. (Haven't tried Dirt 3 with my 7870 yet.)
Just because a better GPU gave you a better frame rate doesn't mean it's not CPU intensive. You were/are both GPU and CPU limited in your case (with the Phenom II). GTA IV is extremely CPU intensive due to lazy porting over from the consoles. If you were to throw a better CPU than yours at it, you would see your FPS skyrocket even more. This game and its flaws have been discussed a million times over the net. It's unfortunate it suffers from this, because it's a decent game that can't be fully enjoyed on PC as a result. It has garbage visuals that shouldn't require the power it does, but once again thank lazy Rockstar, who wanted to make a quick buck.
 
#21 ·
There are too many variables and every game will be different. As a general trend, though, multiplayer and open-world games will be more CPU intensive compared to a CoD single-player campaign or a racing game like GRID 2 that is offline or doesn't have to load lots of things at once.
 
#23 ·
Quote:
Originally Posted by SectorNine50 View Post

What the GPU can calculate is the difference.

Otherwise all of our CPUs would be GPUs. Think about it.
True, IMO. CPUs are designed more for general handling/processing of data, unlike a GPU, which is designed to churn through huge chunks of data for intense calculations like multidimensional vectors, matrices, etc.
CPUs and GPUs are too complicated for us to make a fair comparison. GPUs are basically highly parallel, very efficient DSPs (digital signal processors), which are different from CPUs.

Quote:
Originally Posted by _02 View Post

It depends on how the developer implements the system to use the resources.

If your game (like in Frostbite) uses a software engine to handle audio, the CPU is taking that load.
I think it depends upon the software. But in general, yes, software rendering is CPU dependent, and it's where even your RAM plays a major role.

Quote:
Originally Posted by SectorNine50 View Post

The only time a GPU is more effective at math than a CPU is when the work is extremely parallel.

Many GPU-accelerated operations are effective because massive amounts of data need to be calculated that don't rely upon another thread's "answer."

Physics would make sense on a GPU because each particle has its own path that needs to be calculated. AI is hit and miss: if you have lots of AI objects that don't rely upon each other, a GPU is appropriate; however, if you have a single string of logic that needs to be executed, then a CPU is going to be better.

Plus, think about all the background processes required to bring all of this together. CPU/GPU usage really does vary greatly with the application.
Why can't a GPU handle serially dependent calculations, and if you say that's more CPU oriented, then why? Even CPUs are parallel nowadays; the one I have runs 8 threads in parallel (though those 8 threads run on CPU cores, not shaders).
Quote:
Originally Posted by hazarada View Post

As of DX11, with multiple UAVs available, they don't even need to be parallel anymore; one shader core could be calculating AI while another does global illumination or whatever. And it's not like you only have one AI instance in a game at any given time, or one of anything; they all come in bulk these days.
Agreed.
 
#24 ·
A game is a combined effort of CPU + GPU. Logically speaking, as someone up here also pointed out, the slower part will make you feel that the game depends on it.

Maybe it just depends on what bottlenecks your system, the CPU or the GPU.

It's like, for example, a Formula 1 car (GPU) and an SUV (CPU) both running at top speed, with the fuel (data) for the Formula 1 car supplied by the SUV. Imagine what happens: the Formula 1 car burns its fuel at top speed and then waits for the SUV; the SUV catches up and resupplies the fuel. If the SUV is a Hummer, they're fast; replace the SUV with a Ferrari and they'd be even faster.

I hope it's a good example.