I don't think that AMD is the only company at fault here; Bethesda shares the blame. Most of their games aren't exactly what you would call optimized, and microstutter is evident in most of their titles too (Gamebryo Engine :shudders:). While the Creation Engine (Skyrim) does diminish microstutter versus the previous engine, as well as upping the ante in the graphical department, the main issue is, more or less, still there.
For the past couple of generations, AMD has shipped cards that usually show more microstutter than NV cards. We all know about the microstutter issues with the 6xx0 series, especially in multi-GPU setups, while NV is usually the one with more consistent frame times.
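To show why frame-time consistency matters more than the fps counter, here's a small sketch with made-up frame times (hypothetical numbers, not measurements from any card): two traces with the same average fps, but one of them alternates fast and slow frames the way microstutter does.

```python
# Two hypothetical frame-time traces (in ms), both averaging ~60 fps.
# The "smooth" trace renders every frame in ~16.7 ms; the "stuttery"
# one alternates fast and slow frames, which you perceive as
# microstutter even though the fps counter reads the same.
smooth = [16.7] * 6
stuttery = [8.3, 25.1, 8.3, 25.1, 8.3, 25.1]

def avg_fps(frame_times_ms):
    # average fps = frames rendered / total time in seconds
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame(frame_times_ms):
    # the longest single frame is what you actually feel as a hitch
    return max(frame_times_ms)

print(round(avg_fps(smooth)), round(avg_fps(stuttery)))  # -> 60 60
print(worst_frame(smooth), worst_frame(stuttery))        # -> 16.7 25.1
```

Same average fps on both traces, but the stuttery one spends every other frame at 25 ms, which is why a game can "run at 60 fps" and still feel jumpy.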
Is it an architectural problem? Maybe. GCN is, after all, an architecture built from the ground up to challenge NV in the compute department, as well as adding more features and performance. Since the architecture is new, there are bound to be some problems with it. You can patch over the issues with driver updates, but the underlying cause is still hardware, at least where rendering is concerned. GCN isn't the only architecture to have this problem, either. I'm sure most of you know that the 6xx0 series had its share of microstutter problems, and even then, NV's Fermi, on average, had more consistent frame times in both single and multi-GPU setups.
Is it a driver problem? Maybe. Some people are starting to think that the AMD driver updates promising performance increases are buying frame rate at the cost of more microstutter, but I still haven't seen proof that driver updates are the problem. Maybe someone on here can test that for us.
I don't know the general consensus on the stability of AMD and NV drivers, but I'll just say that both sides have their ups and downs. I don't think it would be fair to say that AMD drivers are worse than NV drivers, or vice versa, because there are simply too many variables to take into consideration.
Is it a game problem? Definitely. I've said it before: Skyrim is a buggy game, and the combination of this game with an AMD card is basically a double whammy. Now, before you guys start firing your fingers off at me, just think about it. I'm not trying to put down a company or favor one over the other, but this problem isn't something that has just started occurring. Microstutter has been in games since the first game ever made, but it hasn't been until sometime last year that we started seriously considering how it affects gameplay.
Can you fix it? Yes, to an extent. You can use v-sync + triple buffering to help decrease microstutter. You can use the fps limiter to dramatically reduce input lag and decrease microstutter (trust me, it works). The limiter works on any game, too. Just configure the cfg file to your liking, and drop both the cfg and the dll files into the root directory of the game you're playing. You can also do what the author of the article did and set the affinity to 1. The combination of all of these things should, in theory, give you the most fluid gameplay experience. I'm also thinking that setting the affinity to 1 on AMD processors might not reduce microstutter as well as it does on Intel processors, because of AMD's lower per-core performance.
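For anyone unsure what "setting the affinity to 1" means: it pins the game to a single CPU core. On Windows you'd do it from Task Manager (right-click the process, Set Affinity) or by launching with `start /affinity 1 TESV.exe` (the 1 is a hex mask meaning core 0 only). As a sketch of the same idea, here's the Linux-only equivalent in Python, applied to the current process rather than a game:

```python
import os

# Pin the current process to a single core (core 0), analogous to
# setting a game's affinity to 1 in Task Manager on Windows.
# os.sched_setaffinity is Linux-only; pid 0 means "this process".
os.sched_setaffinity(0, {0})

# Confirm the new affinity mask.
print(os.sched_getaffinity(0))  # -> {0}
```

This is just to illustrate the concept; for Skyrim itself you'd use Task Manager or the `start /affinity` launch trick, since the game obviously isn't a Python process.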
I'm sure some of you guys who play Skyrim have that problem where you move your mouse around and get jumpy, uneven movement, even though your frame rate is pretty high. Even with my old 5770, I get the problem when I play on low settings on vanilla, and it's pretty much unplayable when using S.T.E.P (well, that's a given because I need a better card for that). I believe the general consensus is that both NV and AMD have their share of microstutter issues, but AMD usually takes the larger hit. Hopefully, the cards coming out this year will reduce microstutter even more.

Edited by airisom2 - 1/12/13 at 9:17am