Originally Posted by brettjv
And lemme give ya another generality when it comes to microstutter ... the HIGHER your multi-GPU 'scaling', the MORE likely the setup is to exhibit microstutter.
Optimal 'scaling' for minimizing microstutter is in the 75-90% range. Once you get up into the realm of 93%+ scaling, it means the driver (on that particular test/game) is set up to make no attempt to 'evenly distribute' the frametimes.
IOW, the driver is eschewing any sort of metering (or 'making one card wait for the other occasionally, for the sake of smoothness') in favor of raw FPS. They do this because it helps to sell cards ... but it doesn't make the gaming experience better.
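To make the metering idea concrete, here's a quick toy sketch in Python (my own made-up numbers and function names, nothing like actual driver code) of the difference between presenting each AFR frame the instant it's done vs. pacing presents out evenly:

```python
# Toy sketch of AFR frame 'metering' -- all numbers are invented for illustration.
# Two GPUs finish frames at uneven moments; the driver can present each frame
# immediately (max FPS, uneven frametimes) or hold some frames back so presents
# land on an even cadence (slightly lower FPS, smooth frametimes). A real driver
# would throttle the render queue instead of just delaying presents.

def present_immediately(ready_times):
    """Present every frame the instant the GPU finishes it."""
    return list(ready_times)

def present_with_metering(ready_times, pace_ms):
    """Hold frames back so consecutive presents are at least pace_ms apart."""
    presents, next_allowed = [], 0.0
    for t in ready_times:
        present = max(t, next_allowed)      # wait if the frame arrived 'too early'
        presents.append(present)
        next_allowed = present + pace_ms    # earliest moment the next frame may show
    return presents

def frametimes(presents):
    return [b - a for a, b in zip(presents, presents[1:])]

# Hypothetical AFR completion pattern: frames finish 5 ms and 28 ms apart,
# alternating -- the classic microstutter signature, ~60 FPS on the counter.
ready, t = [], 0.0
for i in range(20):
    t += 5.0 if i % 2 == 0 else 28.0
    ready.append(t)

raw   = frametimes(present_immediately(ready))
paced = frametimes(present_with_metering(ready, pace_ms=18.0))

print("raw frametimes (ms):  ", [round(x, 1) for x in raw[:6]], "...")
print("paced frametimes (ms):", [round(x, 1) for x in paced[:6]], "...")
print("raw avg FPS:   %.1f" % (1000.0 / (sum(raw) / len(raw))))
print("paced avg FPS: %.1f" % (1000.0 / (sum(paced) / len(paced))))
```

The raw path wins on the FPS counter; the paced path gives up a handful of FPS in exchange for dead-even frametimes ... and that's exactly the trade the drivers are choosing not to make.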
55FPS with no microstutter will 'feel' massively better than 60FPS with microstutter. However, the card makers (AMD is especially guilty of this) believe the public (as well as the online reviewers) are so consumed with FPS and scaling numbers, and so ignorant of microstutter, that it's in their best interest to set up their drivers to maximize FPS and scaling, and not worry about the 'cost' to the actual gameplay experience.
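Just to put a rough number on that 55-vs-60 claim (this is my own crude back-of-envelope metric, not anything official): the average counts every frame, but the 'feel' tracks the longer gaps between updates.

```python
# Crude back-of-envelope comparison -- hypothetical frametime traces and a
# made-up 'slow half' metric, just to show why the average FPS can mislead.

stuttery = [5.0, 28.0] * 30     # alternating frametimes (ms): ~60 FPS average
smooth   = [18.2] * 60          # steady frametimes (ms): ~55 FPS

def avg_fps(ft):
    return 1000.0 / (sum(ft) / len(ft))

def slow_half_fps(ft):
    """FPS implied by the slower half of the gaps -- a rough proxy for 'feel'."""
    worst = sorted(ft)[len(ft) // 2:]
    return 1000.0 / (sum(worst) / len(worst))

for name, ft in (("stuttery '60 FPS'", stuttery), ("smooth '55 FPS'", smooth)):
    print("%-18s counter: %.1f FPS   slow half: %.1f FPS"
          % (name, avg_fps(ft), slow_half_fps(ft)))
```

The stuttery trace reads ~60 FPS on the counter, but its slow half is arriving at a ~36 FPS cadence; the steady '55 FPS' trace feels like exactly what it says.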
The main reason microstutter has not been eliminated is that it's contrary to the gfx card makers' interests ... because removing it means a) more expensive and elaborate driver code, and b) lower scaling and FPS numbers.