AMD A10-5700 Reviewed: Framerates, Frametimes, and Playability. Uploading 13.2b4 stock!

post #1 of 85 (permalink) Old 01-12-2013, 06:23 AM - Thread Starter
New to Overclock.net
 
 
Join Date: Jul 2010
Location: Catalunya
Posts: 14,959
Rep: 611 (Unique: 407)
Hello!


I stumbled across Benchmark3D's post about frametime measurement and whether framerate really reflects playability, and decided to give that testing method a go, using it as an excuse to review my A10-5700's gaming performance.


So, let me elaborate:


The purpose of this thread is to analyse whether AMD's new generation of Trinity APUs is really usable for gaming. And I don't mean whether they can produce playable framerates, but whether they can deliver playable frametimes.

Because 60 FPS is absolutely worthless if one frame in every ten takes over 500 ms to render; you will get unplayable stuttering.

Why do I do this? Well, because nobody has yet tested AMD's APUs from the frametime standpoint, something I consider very important since they are extremely bandwidth bound, as they use the system memory as a framebuffer.
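All the per-frame data in this thread comes from FRAPS' benchmark logging. Below is a minimal sketch of the kind of post-processing I mean, assuming the usual two-column "Frame, Time (ms)" frametimes CSV that FRAPS writes; the file name is just a hypothetical example.

```python
# Minimal sketch: turn a FRAPS "frametimes" CSV into per-frame latencies
# and a couple of smoothness metrics. Assumes the usual two-column
# "Frame, Time (ms)" layout, where Time is the cumulative timestamp.
import csv

def load_frametimes(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                      # skip the "Frame, Time (ms)" header
        stamps = [float(row[1]) for row in reader if row]
    # Per-frame latency = difference between consecutive timestamps
    return [b - a for a, b in zip(stamps, stamps[1:])]

frametimes = load_frametimes("dirt2_morocco frametimes.csv")  # hypothetical file name
frametimes.sort()
n = len(frametimes)
avg = sum(frametimes) / n
print(f"Average frametime: {avg:.1f} ms (~{1000 / avg:.0f} FPS)")
print(f"99th percentile:   {frametimes[int(0.99 * (n - 1))]:.1f} ms")
print(f"Frames over 50 ms: {sum(t > 50 for t in frametimes)}")
```

The point of the 99th percentile and the spike count is exactly the argument above: a run can average 60 FPS and still feel awful if a handful of frames take hundreds of milliseconds.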


Without further delays, this is my testing setup:

  • AMD A10-5700
  • Gigabyte F2A75M-HD2
  • 2x4GB GSKILL Ares
  • Hitachi 5K750 500GB 2.5"


And these are the parameters each component is set at:

  • CPU: 3.4GHz @ 1.12V, Turbo and APM disabled; C1, C6 and CnQ enabled. Why Turbo disabled? Because I want to.
  • GPU: 760MHz.
  • Memory settings: DDR3-1866, 8-10-9-27-36 2T, 1.58V. The timings can go tighter (8-9-8), but it doesn't make much of a difference.
  • Monitor: 1920x1080, 60Hz.


As a clarifying note, some people have asked why I have disabled Turbo Core. These are the reasons:

  • It increases CPU power consumption by around 20W over my custom settings under heavy stress.
  • It makes the system shut down after a while (PSU overheating).
  • It only improves performance by around 5%.

I have also tried overclocking the processor to 3.5GHz across all cores, and it increases power consumption by around 10% (from 60 to 67W) in 3DMark Vantage while yielding a 1% performance improvement (from P5815 to P5860), something I could achieve by raising the GPU clock 10MHz. Absolutely not worth it, and even less so on a power-constrained platform such as mine. A quick perf-per-watt check is sketched below.
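To put numbers on that trade-off, here is the perf-per-watt arithmetic behind the decision, as a small sketch using only the Vantage scores and power figures quoted above:

```python
# Quick perf-per-watt check for the 3.5GHz all-core overclock,
# using the 3DMark Vantage scores and power figures quoted above.
stock = {"score": 5815, "watts": 60}   # 3.4GHz, custom settings
oc    = {"score": 5860, "watts": 67}   # 3.5GHz all-core

for name, cfg in (("3.4GHz custom", stock), ("3.5GHz all-core", oc)):
    print(f"{name}: {cfg['score'] / cfg['watts']:.1f} Vantage points per watt")

score_gain = (oc["score"] / stock["score"] - 1) * 100   # under 1%
power_gain = (oc["watts"] / stock["watts"] - 1) * 100   # roughly 10-12%
print(f"Score gain: {score_gain:.1f}%   Power increase: {power_gain:.1f}%")
```

Points per watt actually drops with the overclock, which is the whole reason I left the cores alone and spent the headroom on the IGP and memory instead.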


Now on to the actual tests!
Catalyst 12.11b:
DiRT2:

Testing here is done in two situations that stress the GPU differently.
The first scenario is in Morocco, in a Rally event. The second is in Japan, in a Rallycross event.
Why this distinction? Because Morocco makes extensive use of ground vegetation and has lots of shadows cast by buildings, while Japan is an urban Rallycross track with seven other racers and lots of changing surfaces that make extensive use of physics calculations.

Everything set to high except postprocessing, which is at Medium. No AA, 8X anisotropic filtering.

Morocco:



Japan:




The Elder Scrolls V: Skyrim:

Testing is done inside Ilinalta's Deep (a necromancer cave), because it features lots of water and lots of particles. I also played my mage, so I'm making extensive use of fire. Testing here is done with manual core affinity set to the first thread of the second module, as that has proven to give the smoothest experience (a quick sketch of how to do that follows).
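For reference, this is roughly how the game can be pinned to that core. A minimal sketch, assuming psutil is installed and that the Skyrim process is called TESV.exe; on a two-module, four-thread Trinity the first thread of the second module should be logical CPU 2 (counting from 0).

```python
# Minimal sketch: pin the game to the first thread of the second module.
# Assumes psutil is installed (pip install psutil) and that the Skyrim
# process is called TESV.exe; adjust the name for your install.
import psutil

TARGET = "TESV.exe"
# On a two-module / four-thread Trinity the OS sees logical CPUs 0-3;
# the first thread of the second module should be logical CPU 2.
AFFINITY = [2]

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        proc.cpu_affinity(AFFINITY)      # same effect as Task Manager's "Set affinity"
        print(f"Pinned PID {proc.pid} to logical CPU(s) {AFFINITY}")
```

You can of course do the same thing by hand through Task Manager; the script just saves doing it every launch.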

Settings as follows:






Just Cause 2:

Using the built-in benchmark (Concrete Jungle), nothing too fancy.
Settings as follows:





As you can see there is very heavy stuttering, and after trying to get adequate framerates, I concluded that even setting everything to Low does not help; framerates stay the same. There is definitely something strange with this game. Note that, for some reason, it still plays just fine for me.


GTAIV: Episodes from Liberty City:

Again, the bundled benchmark tool from The Lost and Damned.
Settings as follows:






Mafia 2:

And once more, the benchmarking tool.
These are the settings:






The peaks in the first third of the test correspond to mini-freezes that, for some reason, happen in the benchmark but that I've never experienced in-game.


Guild Wars 2:

Taking a stroll around Plains of Ashford, killing some enemies here and there.
This is how it's set up:






Despite the very apparent stuttering in the graph, the game played very smoothly. It seems that frametimes, while varying a lot, never get long enough for a stutter to be noticeable.


Planetside 2:

Simulating a field fight: lots of vehicles involved, some infantry, taking a couple of scoped shots. Playing as an NC Infiltrator, close-quarters sniping.
Render quality at 100%, everything else at flat-out Low. Either way it is unplayable, thanks to SOE deciding that forcing antialiasing was a good idea.




Sniper Elite V2:

The benchmark tool, too.
Configured like this:






Team Fortress 2:

Everything at flat out maximum, including Antialiasing and anisotropic filtering.



The APU simply tears through this game like a hot knife through butter. That one spike corresponds to a death.


Far Cry 3:

Settings bottomed out, nothing else changed. Run in both DX11 and DX9 for comparison:

DirectX 9:



DirectX 11:




Performance is less than stellar (albeit arguably playable depending on your style), and ironically, it only gets worse with DX9. The game also looks worse, so there's no reason anyone would want to use the DX9 mode.


Chivalry: Medieval Warfare

Testing was done against 7 bots in a TDM match in Throneroom, as I find it's a rather nice scenario. Framerates in Arena will be higher, lower in some huge Team Objective maps.

Settings as follows:





There is A LOT of stuttering, but overall the game is pretty playable. You do notice the stutter though.

And here it is with MUCH higher settings; needless to say, it's night and day in looks.





Borderline playable, but still just about there. Logically, the stuttering is also quite a bit higher.


3DMark Vantage:

Nothing too fancy, Performance preset, bare bones:



Not too bad! No frametime tests here, as I ran this test for the raw score, and FRAPS lowers it.

And there we go, tests with Catalyst 13.2 Beta 4!

As to why I have switched to a line graph instead of the bar view used before: it lets you see the actual frametimes better. If you look closely at the bar graphs the same information is visible, but it's clearer in the line graph.
Catalyst 13.2 Beta 4b:

DiRT2:

Morocco:



Japan:



DiRT2 once again shows its love for the APU. Framerates stay the same too. Exact same stuttering, aka none.


The Elder Scrolls V: Skyrim:



Same performance. Nothing to see here.

Just Cause 2:



Still dog poo, although there is a welcome 7% framerate improvement: from 18.6 to 19.9 FPS. Not a lot, but welcome. It also spikes like a picket fence.


GTAIV: Episodes from Liberty City:



Mafia 2:



Guild Wars 2:



Planetside 2:



Sniper Elite V2:



Framerates go slightly down, stuttering goes slightly up. Something isn't playing nicely here.




In conclusion, you can see that if you're not too picky about framerates (I know there are people who cannot stand anything below 60 FPS), Trinity plays just fine, with results varying from game to game. Some games can be maxed straight out, others have to be bottomed out in order to play. But so far no game I've tried has been outright unplayable.

Frametimes are, as you have seen, highly variable, something I bet has to do with memory bandwidth. As you overclock, framerates improve, but the frametime variation stays the same; it gets worse in some cases and better in others (games that are compute-starved rather than bandwidth-starved).

Overall, I don't think you can ask for more from a platform that costs €250 complete (board, processor and RAM), and I am pretty surprised that it actually manages this level of performance.



So far, this is it. I will be adding more benchmarks as I fancy; Metro 2033, Far Cry 3 and Crysis Warhead will eventually get done. I have to rip my DVDs to ISOs and transfer them to the Trinity box, as it has no optical drive.


I take requests, but please note that I do not own most games out there, so there is only so much I can do in that regard.

The overclocking results are in the next post!


Thanks for reading, folks!


post #2 of 85 (permalink) Old 01-13-2013, 03:46 AM - Thread Starter
New to Overclock.net
 
 
Join Date: Jul 2010
Location: Catalunya
Posts: 14,959
Rep: 611 (Unique: 407)
So, as I can guess, overclocking results are something you mates would be interested in... After all, this is an overclocking community!


These are the current settings:

CPU: 3.36GHz, 113MHz BCLK, 1.8GHz NB link. 1.12V
GPU: 851MHz, stock voltage.
Memory: 2096MHz, CL9-11-9-28 1T, 1.67V.


Some CPU-Z and GPU-Z captures...




I'll also drop in something I thought might interest some of you: MaxxMEM2 runs at both stock and overclocked settings, so you can see how the bandwidth and latencies actually improve. A rough idea of the theoretical peaks is sketched below.
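For context, here is a back-of-the-envelope calculation of the theoretical peak bandwidth for each memory setting. This is just a sketch: measured copy and read numbers in MaxxMEM2 will be well below these peaks, and remember the bandwidth is shared between the CPU cores and the IGP.

```python
# Back-of-the-envelope theoretical peak bandwidth for dual-channel DDR3.
# Each DDR3 channel is 64 bits (8 bytes) wide; "DDR3-1866" means
# 1866 million transfers per second on each channel.
def peak_bandwidth_gb_s(transfers_mt_s: float, channels: int = 2) -> float:
    bytes_per_transfer = 8               # 64-bit channel
    return transfers_mt_s * 1e6 * bytes_per_transfer * channels / 1e9

print(f"DDR3-1866 dual channel: {peak_bandwidth_gb_s(1866):.1f} GB/s")   # ~29.9 GB/s
print(f"DDR3-2096 dual channel: {peak_bandwidth_gb_s(2096):.1f} GB/s")   # ~33.5 GB/s
```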
MaxxMEM2:

Stock:



Overclocked:

And this is at 2100MHz CL9-11-10. Actual testing has been done at 9-11-9, but the scores are almost identical.





And on to the testing! Uploading results as I get them done.

The display settings are absolutely identical to the ones above, so the comparison is 1:1. The situations are also identical, or at least similar, in the cases where gameplay is involved (DiRT2, Skyrim, GW2, and Planetside 2).

Catalyst 12.11b:
DiRT 2:

Morocco:



Japan:



Frametimes in DiRT2 have improved by a nice margin, and stuttering has also gone down a bit. This game definitely likes the improved memory bandwidth!


The Elder Scrolls V: Skyrim:



Skyrim is a whole different story. Framerates improve by a slight margin, but stuttering increases; not by much, but enough to cause a handful of situations where you can notice micro-freezes. I will look into this some other time.


Just Cause 2:



You can see most of the stuttering is gone, but there are still noticeable spikes every few seconds. Framerates have also improved by a nice margin.


GTA IV: Episodes from Liberty City:



There has been a slight improvement in TLAD, around a 5% framerate gain, and some of the stuttering has gone away. The main peaks are still there, so their cause has to lie somewhere other than the IGP.


Mafia II:



There has been a slight improvement in framerates, but stuttering is, just like it was, relatively low. There are still apparent frame latency peaks, and that very noticeable freeze at the beginning of the benchmark.


Guild Wars 2:



Gameplay takes place in the very same area, killing the same mobs. But since there can be variations, don't compare the graphs 1:1 with the ones above; look at them as an overall picture instead.

The framerate improvement is very large (in the region of 20%), and the stuttering is almost completely gone. The variations in framerate mostly correspond to sudden camera rotations, skill activations and so on.


Planetside 2:



In Planetside, a considerable framerate improvement is seen, as well as an overall reduction in stuttering. Mind you, this second pass was done in a more demanding environment, an Amp Station as opposed to an open field, so the improvement is actually larger than what is reflected here.


Sniper Elite V2:





Sniper Elite V2 sees a huge drop in overall stuttering, but a rather small improvement in overall framerates. Still, the game becomes far more playable since the stuttering is now almost unnoticeable. It seems apparent that bandwidth was the cause of the stuttering, while the GPU itself is responsible for the less-than-stellar framerates.


Team Fortress 2:


Settings ALL the way up. As easy as that. AA to 8X, AF to 16X, everything is at its absolute maximum.



TF2 is a walk in the park for the IGP. It kicks out beastly framerates with absolutely zero stuttering. That one 35 FPS dip corresponds to a death.


Far Cry 3:


In Far Cry 3 I've found that the only settings that come anywhere close to playable are the absolute lowest. I kept the resolution at 1920x1080, though. Both DirectX 9 and DirectX 11 were tested:

DirectX 9:



DirectX 11:




As you can see, there's flat-out no reason why the game shouldn't be run in DX11. Framerates are for the most part identical, and even though there are more latency spikes in DX11, they don't disrupt gameplay as much as one would think. And the game does look prettier, take that for granted.

Neither option is entirely playable, although framerates stay close to 30 FPS. With a proper corded mouse the game is much more playable than with the wireless Logitech I'm using; blame the added latency of the wireless link on top of the already low framerates.

Nonetheless, I have been able to play for half an hour without many hiccups besides framerates dropping to the low 20s under heavy action, which forced me into a hit, hide, hit, hide routine.

Chivalry: Medieval Warfare



Stuttering hasn't improved (it appears worse because I had more fights this second round), but framerates have gone up... by 15%! The game is noticeably more playable. The Unreal engine likes memory bandwidth, it seems.

And re-tested with the same higher settings as above.



Not quite as smooth as before, but it's perfectly playable, and it looks far better. Stuttering is a bit higher, but not a lot.


3DMark Vantage:

Same situation as before. Performance preset, bare stock.



I see a major improvement here! I'm getting a 10% performance boost out of the blue. This is a very clear, verifiable indication that performance does indeed benefit a lot from RAM overclocking.

Catalyst 13.2 Beta 4b:

DiRT2:



Japan:




The Elder Scrolls V: Skyrim:




Just Cause 2:




GTAIV: Episodes from Liberty City:



Mafia 2:



Guild Wars 2:



Planetside 2:



Sniper Elite V2:






So, as you can see, overclocking the memory DOES help with stuttering in every single case (except Skyrim), and I also saw a framerate improvement in every case, though it varies widely depending on the game.

Please note that in this scenario I used the BCLK to boost the memory speed, and that resulted in an IGP speed boost too, so these results are the combination of both. A toy illustration of why a BCLK bump moves everything at once follows below.
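To make that point concrete, here is a small sketch of how a BCLK change scales the derived clocks together. The ratios in it are hypothetical round figures chosen only for illustration, not values read from my board, so don't treat the outputs as my actual clocks.

```python
# Toy illustration: on this platform most clocks are derived as BCLK x ratio,
# so raising BCLK from 100 to 113MHz scales CPU, IGP and memory together.
# The ratios below are hypothetical round figures for illustration only,
# not values read from my board.
ratios = {
    "CPU cores": 30.0,    # hypothetical multiplier
    "IGP":        7.6,    # hypothetical ratio (~760MHz at 100MHz BCLK)
    "Memory":    18.66,   # hypothetical ratio (~DDR3-1866 at 100MHz BCLK)
}

for bclk in (100, 113):
    print(f"BCLK = {bclk} MHz")
    for name, ratio in ratios.items():
        print(f"  {name:9}: {bclk * ratio:7.1f} MHz")
```

That is why isolating the memory's contribution from the IGP's would need a separate run with only the memory divider changed.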


Hope it was interesting, and if you have any questions, suggestions, or requests, please don't hesitate to say so!

~Artik.


post #3 of 85 (permalink) Old 01-13-2013, 03:58 AM
 
Join Date: Aug 2011
Location: India, Calcutta
Posts: 7,797
How different would the performance be between the A6-5300 and the A6-5400K at stock settings? My girlfriend will be updating her rig; she is a very casual gamer and it would be at a low resolution, maybe 1366x768 at most.
post #4 of 85 (permalink) Old 01-13-2013, 04:19 AM
I like computers
 
 
Join Date: Mar 2007
Posts: 7,546
Rep: 309 (Unique: 251)
Awesome read, and it's not an A10, but I talked my GF into getting an A8 (7640G, I think), and I can play most games at decent settings. I was playing NFS Shift yesterday maxed out (only 2x AA and 4x AF though) at her laptop's native res of 1366x768 with zero stuttering and great framerates. I didn't use FRAPS, but I was a good way above 30 FPS, probably in the 45-70 FPS area. Kickass for a sub-$500 laptop if you ask me... I want one now too lol

* Dell T3500 motherboard - Xeon W3570 3.86-4.0Ghz - 10GB RAM - Gigabyte R9 290X Windforce 4GB - Corsair Carbide Spec-01 - Corsair CX600 - CoolIt ECO 120 AIO - Corsair 120GB SSD - 1TB HDD - Corsair Katar - QCK+ Limited - Ajazz AK33 - Acer H233H 71Hz 1080P 75Hz 720P - Kinter MA-170 - Yamaha 6" monitors

* HP Z400 motherboard - Xeon W3565 3.45GHz - 8GB RAM - Gigabyte GTX 770 Windforce 2GB - DeepCool Dukase White - Corsair VS450 - DeepCool Gammaxx 300 - 256GB SSD - 320GB HDD
post #5 of 85 (permalink) Old 01-13-2013, 05:56 AM - Thread Starter
New to Overclock.net
 
 
Join Date: Jul 2010
Location: Catalunya
Posts: 14,959
Rep: 611 (Unique: 407)
Quote:
Originally Posted by Aaron_Henderson View Post

Awesome read, and it's not an A10, but I talked my GF into getting an A8 (7640G, I think), and I can play most games at decent settings. I was playing NFS Shift yesterday maxed out (only 2x AA and 4x AF though) at her laptop's native res of 1366x768 with zero stuttering and great framerates. I didn't use FRAPS, but I was a good way above 30 FPS, probably in the 45-70 FPS area. Kickass for a sub-$500 laptop if you ask me... I want one now too lol

Thanks!

Yup, they're little beasts! And as I'm finding, they are bandwidth-starved at higher resolutions (I'm at 1920x1080, mind you). As you start kicking up the memory speeds and lowering the timings, the framerates go up and the stuttering slowly disappears; but it shouldn't stutter much, if at all, at your resolution.
Quote:
Originally Posted by jason387 View Post

How different would the performance be between the A6-5300 and the A6-5400K at stock settings? My girlfriend will be updating her rig; she is a very casual gamer and it would be at a low resolution, maybe 1366x768 at most.

Not much, figure 5-7%. For the price difference I'd say grab the A8: it has a more powerful IGP, but the biggest difference is the dual-module design, which really boosts gaming performance by quite a lot.


Stay tuned, I'm halfway done.


post #6 of 85 (permalink) Old 01-13-2013, 06:19 AM
 
Join Date: Aug 2011
Location: India, Calcutta
Posts: 7,797
Nice review. I think the A8 CPUs are almost double the price of the A6-5300.
post #7 of 85 (permalink) Old 01-13-2013, 08:20 AM - Thread Starter
New to Overclock.net
 
 
Join Date: Jul 2010
Location: Catalunya
Posts: 14,959
Rep: 611 (Unique: 407)
Quote:
Originally Posted by jason387 View Post

Nice review. I think the A8 CPUs are almost double the price of the A6-5300.

Lol yes, they're 30 dollars more. But I can tell you from here that the performance increase is well worth it.

Now, if you want to wait for Kaveri and hold out with a low-end A6 in the meantime, that's another story.


P.S.: Overclocked benchmarks updated and finished!


post #8 of 85 (permalink) Old 01-13-2013, 08:35 AM
 
Join Date: Aug 2011
Location: India, Calcutta
Posts: 7,797
The benches look good. Yeah, my gf hardly plays games; maybe Plants vs. Zombies. By the way, if you haven't, you should download that game, it's really addictive.
post #10 of 85 (permalink) Old 01-18-2013, 11:59 AM
New to Overclock.net
 
 
Join Date: May 2011
Location: Midwest USA
Posts: 146
Rep: 8 (Unique: 8)
Wouldn't there be a noticeable performance boost if you used a 7200 RPM HDD with 32MB cache, and maybe a separate HDD for games?

Lowering the resolution would help too... just sayin', for more thorough benches.
That's what I'd do anyhow. I wouldn't expect miracles from a standalone APU's GPU at 1080p.

Well done though!
