I've decided to remove this from the fail thread that it was on.
Here's my dissertation about the topic of microstutter, if anyone is interested
What Does It Look Like?
Microstutter manifests in the following way: you know how sometimes while gaming you'll hit a graphically demanding section, and you can just 'feel' that your fps has dropped dramatically, and then you look at your FPS meter and discover that, indeed, you've dropped from 60fps to 25fps?
You know the sensation ... you get that 'stuck in molasses' feel, everything gets laggy: your movement, changing your view, nothing you do produces an instant response anymore?
Well, when you have microstutter, you get that effect even though the meter shows an FPS high enough that you 'shouldn't' be getting that sensation.
For example, 30fps (w/o microstutter) in the large majority of games generally feels pretty smooth; it's totally 'playable'. Xbox runs at 30fps, and it usually feels quite responsive, right?
Well, on a game that's exhibiting microstutter, 30fps can end up feeling like around 20fps. And at 20fps, everything feels all laggy and slide-showy.
Behind the Scenes
On a setup without microstutter, let's say you analyzed the actual performance during a given second of gaming when your FPS meter is showing your game at 30fps.
And by 'analyze', I mean you study the 'time to render' for each of the 30 frames that were rendered over the course of that second (I'll refer to these values as 'frametimes' moving forward, although that's not the 100% accurate definition of the term).
In this analysis, you'd find that the large majority of frametimes would be 'huddled around' the value of .033 seconds, or 33 milliseconds.
(note that I'm saying here that an alternate 'unit of measure' for expressing 30 FPS is 33ms per frame, because 1/30 = .033 seconds = 33 milliseconds ... does that all make sense?)
Now, there'd be some frametimes at other values besides 33ms of course, as this is an average over a whole second, but if you graphed out the count (Y = Value Count, X = Frametime) of each value of frametimes for those 30 frames, you'd see a normally-distributed (bell-shaped) curve huddled around the mean value of 33ms. You'd get a 'single-humped camel back' looking graph.
However, when microstutter is affecting you, what you'd see instead is a consistent, cyclical variation in the 'time to render' of each frame. Frame 1 = 25ms (40fps), Frame 2 = 50ms (20fps), Frame 3 = 25ms, Frame 4 = 50ms, etc., over and over again during the course of that second.
Thus, if you plotted out the counts of each frametime value, instead of the 'single-humped camel back' graph you'd get w/o microstutter, you'd get a 'double-humped camel back' graph when you have microstutter. The counts of the frametime values would be clustered around two separate and distinct mean values.
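If it helps to see those 'camel humps' with actual numbers, here's a quick Python sketch using made-up frametimes (nothing below is captured from a real game or from FRAPS ... it's purely to illustrate the shape of the two distributions):

```python
# Made-up frametime numbers, purely for illustration.
from collections import Counter
import random

# 'Smooth' second: ~30 frames, all huddled around 33ms (about 30fps).
smooth = [random.gauss(33, 2) for _ in range(30)]

# 'Microstutter' second: frametimes alternating between 25ms and 50ms.
# 14 frames at 25ms + 13 frames at 50ms = exactly 1000ms of rendering.
stutter = [25 if i % 2 == 0 else 50 for i in range(27)]

def summarize(label, frametimes_ms):
    avg_ms = sum(frametimes_ms) / len(frametimes_ms)
    print(f"{label}: average ~{1000 / avg_ms:.0f} fps")
    # Bucket the frametimes into 5ms bins -- these are the 'camel humps'.
    buckets = Counter(5 * round(ft / 5) for ft in frametimes_ms)
    for bucket_ms in sorted(buckets):
        print(f"  ~{bucket_ms}ms: {'#' * buckets[bucket_ms]}")

summarize("No microstutter", smooth)   # one hump, centered near 33ms
summarize("Microstutter", stutter)     # two humps, at 25ms and 50ms
```

The first summary prints a single cluster around 33ms; the second prints two distinct clusters, even though both sequences add up to roughly one second's worth of frames.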
You lost me with all those numbers and such ... can you simplify it?
To put it even more simply, the two scenarios can be summed up like this:
without MS, ALL your frames are actually being rendered at right around 30fps (33ms/frame), whereas
with MS, your frames are constantly alternating back and forth between 20fps (50ms/frame) and 40fps (25ms/frame).
Perception vs. Reality (or: Some Good News at Last!)
In both cases (with or without MS), your AVERAGE over time is still reported by FRAPS as roughly 30fps, but your PERCEPTION of how the scene looks is very different.
In the 'microstuttering condition', you 'sense' the scene as running at the slower of the two recurring values, in this case, 20fps. Not only that, but your brain can pick up on a dramatic, rapid fluctuation in FPS like a constant flipping from 20 to 40fps (repeatedly halving/doubling), and it produces a vague feeling of discomfort when you view a scene that's doing that.
I Thought You Said 'Good News'?
Now, fortunately, a player's PERCEPTION of microstutter should (meaning: it works this way for most people, but not all) be alleviated at higher framerates. This is because the DELTA (40 - 20 = 20 fps) between the two alternating framerates in a microstutter condition should roughly stay the same when fps goes up.
So if you lower the settings on the same game, such that you're running at 60fps, you will be looking at half your frames at 50fps, and half your frames at 70fps.
So, while you still technically HAVE the microstutter at 60fps, if we extrapolate from the earlier analysis, we conclude that we'll perceive that 60fps average as though it were 50fps ... MUCH more playable than 20fps ... plus it's much harder for your brain to pick up on a rapid fluctuation between 50fps and 70fps so most people won't get that feeling of discomfort I mentioned above.
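To put some rough numbers on that (again, just illustrative math, not measurements from any real game):

```python
# Hypothetical alternating frametime pairs, in milliseconds, for the two scenarios above.
low_fps_pair  = (50.0, 25.0)   # ~20fps / ~40fps alternation
high_fps_pair = (20.0, 14.3)   # ~50fps / ~70fps alternation

for slow_ms, fast_ms in (low_fps_pair, high_fps_pair):
    avg_fps = 1000 / ((slow_ms + fast_ms) / 2)   # what the FPS meter roughly reports
    perceived_fps = 1000 / slow_ms               # roughly what you 'feel'
    swing = slow_ms / fast_ms                    # how violent the alternation is
    print(f"meter ~{avg_fps:.0f} fps, feels like ~{perceived_fps:.0f} fps, "
          f"frametime swing {swing:.1f}x")
```

The delta is ~20fps in both cases, but at the low end that's a 2x frametime swing, while at the high end it's only about a 1.4x swing ... which is a big part of why most people stop noticing it.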
So, bottom line: MS will generally become imperceptible (NOT 'immeasurable') as FPS gets up into the range where we normally like to play games (i.e. 60fps).
And the good news is that HAVING SLI puts you exactly in the right position to actually be ABLE to get those framerates up into the 60fps range, right?
Some Caveats/Further Explanation (if you're a total glutton for details)
In the interest of accuracy, I should point out that it's by no means impossible for someone to 'perceive' that a game is exhibiting a pattern of microstutter at 60fps or even at much higher FPS rates. Anytime frametimes are constantly fluctuating/uneven, there is a chance it will be 'noticeable' ... but it seems to be the case that only a small % of people are sensitive to it and even fewer are actually bothered by it.
From what I've read re: microstutter over the years, it seems as though any time anyone makes broad generalizations about 'what causes' microstutter or 'when it's going to happen' or 'when it will be noticeable', there are disagreements amongst people who are well-versed in the subject.
But besides the generalization I shared above, a couple of other ideas that *seem* to be fairly widely accepted (but NOT 100%, because NOTHING is when it comes to this subject) are these:
1) The closer your cards are to being 'maxed out' in terms of their capabilities in a given scenario, the more likely it becomes that microstutter will become noticeable.
2) This also means that effective framerate limiters (including v-sync) can definitely be helpful in reducing the likelihood of visible microstutter, because if they are 'kicking in', by definition that means that you are reserving additional gpu headroom and not maxing out the cards.
3) In a somewhat-related vein, it turns out that the more wildly overpowered your CPU is in comparison to the GPUs you have in SLI, the more likely it is that you will experience noticeable microstutter, esp. if you don't use a framerate limiter of some kind.
And in a somewhat amusing twist, being CPU-bottlenecked actually helps reduce microstutter, as it's another type of framerate limiter that keeps the cards from getting 'tapped out'.
4) The proper contents of the SLI profile for a given game, in particular the 'compatibility bits' property, are critical to reducing the effects of microstutter. So if you're using SLI profiles not specifically provided by nVidia to enable SLI on a given game (such as the various profile/game renaming or re-associating tricks, or making your own custom profiles with nVInspector, etc), the chances that you'll get visible microstutter will go up.
5) It's possible that certain 'happenings' in games, like explosions or suddenly complex scenes that cause the cards to struggle, can make microstutter become noticeable.
This is for two reasons: One is because of the first reason I stated above. The second is that, interestingly, the driver itself (partly by means of the 'compatibility bits' property of the SLI profile mentioned in 4) above) actually attempts to ameliorate the effects of microstutter. In order to 'time' the presentation of frames more consistently, it relies on predictive algorithms based on a small sample of the frametimes that immediately preceded the frames it's presenting 'now' (there's a toy sketch of that idea just after this list).
But when FPS suddenly drops, the rearward-looking samples it's keeping become 'unrepresentative', so the driver loses its ability to predict how long it should 'wait' to present the next frame in order to maintain consistent frametimes. This will be the case for a short time, until the current batch of 'past-tense' samples returns to being more representative/predictive of the future-tense (if that makes sense ... sorry if that's technical and boring, but I found it interesting to find out).
However, because things like big explosions inherently involve framerate drops as well, it's hard to judge how much of 'the effect' one perceives is MS, and how much is just the framerate dropping.
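To make the 'prediction' idea in 5) a bit more concrete, here's a toy Python sketch. To be clear, this is NOT nVidia's actual frame-metering code (I have no idea what their driver really does internally); it's just a bare-bones illustration of why predicting from a small window of recent frametimes falls apart for a moment when FPS suddenly drops:

```python
# Toy frame-pacing sketch: predict the next frametime from a small rolling
# window of recent frametimes, and delay presenting frames that finish 'early'.
# Purely illustrative -- not how any real driver is actually implemented.
from collections import deque

class FramePacer:
    def __init__(self, window_size=4):
        self.recent = deque(maxlen=window_size)  # last few frametimes, in ms

    def record(self, frametime_ms):
        self.recent.append(frametime_ms)

    def predicted_frametime(self):
        # Predict the next frametime as the average of the recent window.
        if not self.recent:
            return 0.0
        return sum(self.recent) / len(self.recent)

    def delay_before_present(self, actual_frametime_ms):
        # If this frame finished faster than predicted, hold it back a bit
        # so frames come out at a steadier cadence.
        return max(0.0, self.predicted_frametime() - actual_frametime_ms)

pacer = FramePacer()
for ft in (25, 26, 24, 25):          # steady ~25ms frames: prediction tracks reality
    pacer.record(ft)
print(pacer.predicted_frametime())   # ~25ms, so pacing works nicely

pacer.record(60)                     # big explosion: frametimes suddenly jump
print(pacer.predicted_frametime())   # still well under 60ms -- the window is
                                     # 'unrepresentative' until it refills with
                                     # post-explosion samples
```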
Final Notes, and What To Do if You Have It:
The whole problem, it's simple: It's only about the game ... and all your settings in the game ... and where you are in the FPS scale ... and the driver ... and the SLI profile settings ... and your GPUs ... and your CPU ...
Basically, it's just the way EVERYTHING involved in the SLI gaming experience comes together. Almost nothing about this phenomenon is 'consistent', and the issues involved with it share traits with many OTHER issues ... so really, you're almost better off not even worrying about trying to figure it all out.
If you're having the symptoms of MS with a game in SLI, and only in SLI ... try v-sync, OC'ing your gfx card, and lowering settings. If the game has a frame-rate limiter built-in, perhaps through a config file or console command, try using it.
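And just so 'frame-rate limiter' isn't a magic phrase: conceptually, all a limiter does is something like the loop below (a bare-bones sketch, not any particular game's or driver's implementation). The key point, tying back to 2) above, is that any time it sleeps, your cards are sitting on spare headroom instead of running flat-out:

```python
# Bare-bones sketch of what a frame limiter / FPS cap conceptually does.
# Not taken from any real game or driver -- just the general idea: if a frame
# finishes early, burn off the leftover time so frames go out on a fixed cadence.
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS      # ~16.7ms per frame

def render_frame():
    time.sleep(0.005)                # stand-in for the real rendering work

for _ in range(10):
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    leftover = FRAME_BUDGET - elapsed
    if leftover > 0:
        time.sleep(leftover)         # the 'limiting' part: spend the spare headroom idling
```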
And if those don't work ... you can just turn off SLI for that game ... OR you can try out SLI AA, which disables regular SLI and lets both cards focus on producing a crazy level of AA ... and produces NO microstutter. Plus you can set SLI AA in a gaming profile so you don't have to manually turn SLI on and off when you play games that you don't want to run in SLI due to microstutter (or due to the lack of SLI support for the game).
And remember that in the majority of games, 30fps on a single card (or with SLI AA) usually feels/looks about as smooth as 45fps on an SLI setup does anyway.
If you've read this far, thanks for reading, and I hope you picked up something you didn't know before.
And feel free to use this post as a reference for people who ask you about microstutter in the future ... esp. if they're not easily bored.