
[gamegpu] Plants vs. Zombies Garden Warfare GPU test

#1 ·
[Benchmark charts at source: AMD DX11, Nvidia DX11, AMD Mantle]
Quote:
Plants vs. Zombies: Garden Warfare is a multiplayer third-person shooter in which players control a variety of plants to fend off an invasion of zombies, who are also controlled by players. Matches support up to 24 players. The game features different character classes, and using them properly will help the team survive.
Source
Source (translated)

Yet another title showing that while Mantle gives really nice gains, NV's DX11 implementation isn't far off. More results and info at the source.
 
#4 ·
Sure, it's not a GPU-melting title, but I for one found these results quite interesting.

There are countless games people play with similar graphics, and if Mantle can already hit these gains at lower settings, it should be able to match DX at the high end of the market given enough development. AFAIK it's still a beta, after all.

Sign me up as keen
 
#5 ·
Interesting to see the 2500k start slipping behind slightly after all this time, even compared to the higher-clocked AMDs... Nothing a 4GHz+ OC wouldn't fix, but still. At least I can say my 9590 is quicker than the 2500k (at stock) for once!
Speaking of which, it's interesting how the 8350 performs better than the 2500k with the NV drivers, but not with Mantle or AMD's DX.

edit: and LOL at the 8150 getting trumped pretty badly even by a 6300. Regardless of PD being a refresh, its gains over BD were quite noteworthy.
 
#9 ·
Quote:
Originally Posted by sumitlian View Post

Man, AMD DX drivers are literally horrible compared to Nvidia's.
When Mantle released I was impressed at how good it was. Now you see Nvidia's regular drivers getting close to Mantle performance, but those work in all games and not just the handful of Mantle-supported titles.

Now when you look at results like this you end up thinking "wow AMD's DX drivers are terrible" instead of "wow Mantle is amazing". I don't think this was AMD's intention.
 
#10 ·
Hmm, Mantle is really good for bringing up the minimum FPS.

Also, Nvidia has a pretty sizable advantage, 20%+, in CPU overhead in DX11. That said, it doesn't seem like something that's impossible to overcome. Surely AMD could do it if they put more resources into DX driver development.
 
#11 ·
Quote:
Originally Posted by Derp View Post

When Mantle released I was impressed at how good it was. Now you see Nvidia's regular drivers getting close to Mantle performance, but those work in all games and not just the handful of Mantle-supported titles.

Now when you look at results like this you end up thinking "wow AMD's DX drivers are terrible" instead of "wow Mantle is amazing". I don't think this was AMD's intention.
Not sure if the "you" in your sentence is aimed at me or at everybody.
Anyway, I'll take it in stride.

As far as I know, the way I see all this happening is: when Mantle launched and its results were compared to Nvidia's legacy DX drivers, nobody was ready to believe that AMD's own drivers were comparatively bad. Everyone (including me) was thinking about the 9x draw call improvement, which obviously looked like the faster rendering solution to everyone.

And when Nvidia's wonder driver came out and showed its results, everyone got a hint that the problem might not be with DirectX itself (or what they were calling DirectX overhead), because if Nvidia can do it, why can't AMD?
And after looking at these benchmarks, where you can clearly see an Nvidia GPU needs considerably less CPU power to run optimally than an AMD GPU, it really looks like AMD has either been lazy with DirectX optimization or their GCN architecture was never a great fit for DirectX at the hardware level.
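
To make the overhead point concrete, here's a toy model (all numbers are hypothetical, for illustration only, not taken from the charts): frame rate is set by whichever side finishes last, the CPU submitting draw calls or the GPU rendering them, so a cheaper per-call cost only helps until you hit the GPU's ceiling.

Code:
# Toy model: frame time = max(CPU submission time, GPU render time).
# ALL numbers below are hypothetical, for illustration only.
def fps(draw_calls, cpu_us_per_call, gpu_frame_ms):
    cpu_ms = draw_calls * cpu_us_per_call / 1000.0  # CPU time to submit one frame
    return 1000.0 / max(cpu_ms, gpu_frame_ms)       # the slower side sets the pace

calls = 5000   # draw calls per frame (hypothetical)
gpu_ms = 5.0   # GPU renders a frame in 5 ms -> a 200 fps ceiling (hypothetical)

print(fps(calls, 2.0, gpu_ms))  # heavy driver, 2.0 us/call: CPU-bound at 100 fps
print(fps(calls, 1.2, gpu_ms))  # leaner DX11 driver, 1.2 us/call: CPU-bound at ~167 fps
print(fps(calls, 0.2, gpu_ms))  # Mantle-like cost, 0.2 us/call: GPU-bound at the 200 fps cap

That's also why the big wins show up on slow CPUs and in minimum FPS, while fast CPUs all converge on the same GPU-bound cap.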
 
#12 ·
Quote:
Originally Posted by SoloCamo View Post

Interesting to see the 2500k start slipping behind slightly after all this time, even compared to the higher-clocked AMDs... Nothing a 4GHz+ OC wouldn't fix, but still. At least I can say my 9590 is quicker than the 2500k (at stock) for once!
Speaking of which, it's interesting how the 8350 performs better than the 2500k with the NV drivers, but not with Mantle or AMD's DX.

edit: and LOL at the 8150 getting trumped pretty badly even by a 6300. Regardless of PD being a refresh, its gains over BD were quite noteworthy.
Huh? Slipping behind? It's a 3 FPS difference, with the AMD chip clocked way higher... I would not call that slipping. Keep in mind also how old SB is now, and how much both cost at launch. Yeah, not something I'd be proud of...
 
#13 ·
Quote:
Originally Posted by sumitlian View Post

With an Nvidia GPU, the 4770K and 6300 are almost equal.
But with an AMD GPU, the 4770K is 43% faster than the 6300. Man, AMD DX drivers are literally horrible compared to Nvidia's.

And it looks like Mantle is doing what it was made for: 80% more FPS with Mantle (FX 4100).
Certainly an odd thing indeed, considering it's exactly the other way around when we look at AMD OpenCL vs Nvidia CUDA, where CUDA is extremely CPU-bound and OpenCL barely touches the CPU.

I don't think Nvidia does this well in most cases; their wonder drivers didn't work any wonders for me. Low-level tweaks could also be in play; it would be nothing new for either company to leave out some detail.
 
#14 ·
Quote:
Originally Posted by maarten12100 View Post

Certainly an odd thing indeed, considering it's exactly the other way around when we look at AMD OpenCL vs Nvidia CUDA, where CUDA is extremely CPU-bound and OpenCL barely touches the CPU.
I totally agree, I thought the same, but here it's the DirectX API, not OpenCL or CUDA.
Quote:
I don't think Nvidia does this well in most cases; their wonder drivers didn't work any wonders for me. Low-level tweaks could also be in play; it would be nothing new for either company to leave out some detail.
Thanks for providing real-world experience. I was very skeptical of this too, but looking at it now, it seems the wonder driver has become even more mature, and if it does this in Frostbite 3, it should perform the same in upcoming FB3 games. And what's the point of having a low-level API that performs almost the same as Nvidia's DX optimization, especially when you (AMD) initially called DX an overhead?
What I don't understand is how in the world AMD suddenly got the idea to go with Mantle? I mean, they must have been facing some serious DirectX optimization issues with their VLIW and GCN architectures!?
 
#15 ·
Quote:
Originally Posted by maarten12100 View Post

Certainly an odd thing indeed, considering it's exactly the other way around when we look at AMD OpenCL vs Nvidia CUDA, where CUDA is extremely CPU-bound and OpenCL barely touches the CPU.

I don't think Nvidia does this well in most cases; their wonder drivers didn't work any wonders for me. Low-level tweaks could also be in play; it would be nothing new for either company to leave out some detail.
That sounds fundamentally wrong. Do you have a link for that?
 
#17 ·
Quote:
Originally Posted by TFL Replica View Post

That sounds fundamentally wrong. Do you have a link for that?
Quote:
Originally Posted by anubis1127 View Post

I tend to agree with @dman811. Picking up used AMD GCN 1.0 cards is the way to go right now for PPD/$$$. Even 7850s / R7 265s are a pretty nice option due to only needing one 6-pin power cable, and they do around 75-85k PPD per card. Get 4 or 5 of those in a rig and you're looking at upwards of 350k PPD.

Plus the nice thing about AMD cards is that the OpenCL implementation on them doesn't take much CPU at all, whereas on the NV side you pretty much need to dedicate one fast CPU thread per GPU; by fast I mean a 3.5GHz Sandy Bridge or newer.
I think the folding editor knows his stuff. Well, I hope so.

Quote:
Originally Posted by chemicalfan View Post

So, where are the Intel benches? Haswell's iGPU is capable now (compared with Sandy & Ivy), and considering PvZ is a pretty casual game, it's entirely possible that the target audience wouldn't have a dedicated GPU.
Capable?
Yeah, maybe at 1024x720, because that is what I played BioShock Infinite at on low. With the horrible drivers and abundant glitches I don't think that counts as capable. But if you want to game on it you can just reduce settings a lot. (If a game works on it, I use it on the go, since it's easier on the battery than the dGPU.)
 
#18 ·
You guys can bag on Mantle all you want but the way I look at it is this:

If AMD had never come out with Mantle, nVidia would never have optimized their DX11 drivers to compete with it and M$ would never have talked about DX12. Ever. I believe AMD mainly used Mantle to shake up the industry and thought if they could establish another solid graphics API along the way then they would. I for one am very thankful they did SOMETHING to cause some ripples where no one has rocked the boat for YEARS.
 
#19 ·
Quote:
Originally Posted by LazarusIV View Post

You guys can bag on Mantle all you want but the way I look at it is this:

If AMD had never come out with Mantle, nVidia would never have optimized their DX11 drivers to compete with it and M$ would never have talked about DX12. Ever. I believe AMD mainly used Mantle to shake up the industry and thought if they could establish another solid graphics API along the way then they would. I for one am very thankful they did SOMETHING to cause some ripples where no one has rocked the boat for YEARS.
They are wrong to begin with. I mean, this game is not CPU-bound, and it's a mainstream title by EA; there has been a huge countdown on Origin for a reason (yes, I saw it while starting BF3, lol).
 
#20 ·
Quote:
Originally Posted by maarten12100 View Post

They are wrong to begin with. I mean, this game is not CPU-bound, and it's a mainstream title by EA; there has been a huge countdown on Origin for a reason (yes, I saw it while starting BF3, lol).
Oh, I don't see a countdown on Origin, but then again I'm not on the client. I'm not at home with my main PC, I've only got my Linux laptop right now. One week from today I'll be home though, then it's on like Donkey Kong!
 
#21 ·
Quote:
Originally Posted by maarten12100 View Post

I think the folding editor knows his stuff. Well, I hope so.
I see no reason to doubt the Folding Editor's word, but do note that he's only referring to two specific F@H implementations. It is just logically unsound to assume that this info can be applied to OpenCL/CUDA as a whole.
 
#22 ·
Quote:
Originally Posted by geoxile View Post

Hmm, Mantle is really good for bringing up the minimum FPS.

Also, Nvidia has a pretty sizable advantage, 20%+, in CPU overhead in DX11. That said, it doesn't seem like something that's impossible to overcome. Surely AMD could do it if they put more resources into DX driver development.
The 181 min FPS on the various CPUs using NV GPUs seems more like some sort of GPU limitation than anything to do with CPU overhead. If the game were still somewhat CPU-bound for the minimums, you'd see higher-end CPUs getting higher minimums than lower-end ones.
 
#23 ·
So basically, in this game:

AMD R9 295x2 with Mantle at 4K: 91 min, 102 avg
Nvidia 780 Ti SLI at 4K: 74 min, 83 avg

The R9 295x2's minimum FPS is higher than the 780 Ti SLI's average FPS.

AMD R9 290x with Mantle at 4K: 49 min, 54 avg
Nvidia 780 Ti at 4K: 36 min, 44 avg

The R9 290x's minimum FPS is higher than the 780 Ti's average FPS.
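
For anyone who wants those gaps as percentages, here's a quick sketch taking the numbers above at face value (purely illustrative):

Code:
# Relative uplift of the Mantle numbers over the Nvidia numbers quoted above.
def uplift(new, base):
    return (new - base) / base * 100.0

# R9 295x2 (Mantle) vs 780 Ti SLI at 4K
print(f"{uplift(91, 74):.0f}% higher min, {uplift(102, 83):.0f}% higher avg")  # 23% / 23%

# R9 290x (Mantle) vs 780 Ti at 4K
print(f"{uplift(49, 36):.0f}% higher min, {uplift(54, 44):.0f}% higher avg")   # 36% / 23%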
 
#24 ·
Quote:
Originally Posted by anujsetia View Post

So basically, in this game:

AMD R9 295x2 with Mantle at 4K: 91 min, 102 avg
Nvidia 780 Ti SLI at 4K: 74 min, 83 avg

The R9 295x2's minimum FPS is higher than the 780 Ti SLI's average FPS.

AMD R9 290x with Mantle at 4K: 49 min, 54 avg
Nvidia 780 Ti at 4K: 36 min, 44 avg

The R9 290x's minimum FPS is higher than the 780 Ti's average FPS.
I think you're looking at a different chart from everyone else.

The 3rd chart shows 4K at 83 for both camps?
 
#26 ·
Quote:
Originally Posted by Dimaggio1103 View Post

Huh? Slipping behind? It's a 3 FPS difference, with the AMD chip clocked way higher... I would not call that slipping. Keep in mind also how old SB is now, and how much both cost at launch. Yeah, not something I'd be proud of...
Let me make this clear: I DO NOT CARE about the AMD chip being clocked higher (I even noted the gap can be fixed with a small OC on the 2500k), I DO NOT CARE how old SB is now, and quite frankly I DO NOT CARE how much they cost when they came out. I'm talking about the here and now. Where did I indicate that anyone was proud, myself included? All of these things take this thread down the AMD vs. Intel path, which I have no interest in debating at this point.

That all said, my point went entirely over your head. In my opinion the 2500k has been the standard go-to reference CPU for performance in game benchmarks for a very long time, so seeing it post lower minimum and average framerates than a 9370/9590 in DX was shocking to me. With Nvidia's DX it was clearly a GPU-bound/engine-bound situation capping them all at 181/200 fps at the top, including a stock 8350, and yet the 2500k was about 10 fps slower on average and 3 fps slower on minimums. And of course with Mantle it shows a slightly lower minimum but matches the 200 fps mark for average.

A simple observation.