
brettjv's Microstutter General Information Thread

I've decided to remove this from the fail thread that it was on.

Here's my dissertation about the topic of microstutter, if anyone is interested


What Does It Look Like?
Microstutter manifests in the following way: you know how sometimes while gaming you'll hit a graphically difficult section, and you can just 'feel' that your fps has dropped dramatically, and you look at your FPS meter and discover that, indeed, you've dropped from 60fps to 25fps?

You know the sensation ... you get that sort of 'stuck in molasses' feel, everything gets laggy: your movement, changing your view, everything you do no longer produces an instant response?

Well, when you have microstutter, you get that effect, even though the FPS on the meter shows a high-enough FPS that you 'shouldn't' be getting that sensation.

For example, 30fps (w/o microstutter) in the large majority of games generally feels pretty smooth; it's totally 'playable'. The Xbox runs at 30fps, and it usually feels quite responsive, right?

Well, on a game that's exhibiting microstutter, 30fps can end up feeling like around 20fps. And at 20fps, everything feels all laggy and slide-showy.

Behind the Scenes
On a setup without microstutter, let's say you analyzed the actual performance during a given second of gaming when your FPS meter is showing your game at 30fps.

And by 'analyze', I mean you study the 'time to render' for each of the 30 frames that were rendered over the course of that second (I'll refer to these values as 'frametimes' moving forward, although that's not the 100% accurate definition of the term).

In this analysis, you'd find that the large majority of frametimes would be 'huddled around' the value of .033 seconds, or 33 milliseconds.

(note that I'm saying here that an alternate 'unit of measure' for expressing 30 FPS is 33ms per frame, because 1/30 = .033 seconds = 33 milliseconds ... does that all make sense?)
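If it helps to see that conversion spelled out, here's a tiny Python sketch of it (the function names are just mine, nothing official):

Code:
# Convert between an FPS figure and the equivalent per-frame render time in ms.
def fps_to_frametime_ms(fps):
    return 1000.0 / fps             # e.g. 30 fps -> ~33.3 ms per frame

def frametime_ms_to_fps(frametime_ms):
    return 1000.0 / frametime_ms    # e.g. 50 ms per frame -> 20 fps

print(fps_to_frametime_ms(30))   # ~33.3
print(frametime_ms_to_fps(25))   # 40.0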

Now, there'd be some frametimes at other values besides 33ms of course, as this is an average over a whole second, but if you graphed out the count (Y = Value Count, X = Frametime) of each value of frametimes for those 30 frames, you'd see a normally-distributed (bell-shaped) curve huddled around the mean value of 33ms. You'd get a 'single-humped camel back' looking graph.

However, when microstutter is affecting you, what you'd see instead is a consistent, cyclical variation in the 'time to render' of each frame. Frame 1 = 25 ms (40fps), frame 2 = 50ms (20fps), Frame 3 = 25ms, Frame 4 = 50ms, etc, over and over again during the course of that second.

Thus, if you plotted out the counts of each frametime value, instead of the 'single-humped camel back' graph you'd get w/o microstutter, you get a 'double-humped camel back' graph when you have microstutter. The count of each of the frametime values would be clustered around two separate and distinct mean values.

You lost me with all those numbers and such ... can you simplify it?
To put it even more simply, the two scenarios can be summed up like this: without MS, ALL your frames are actually being rendered at right around 30fps (33ms/frame), whereas with MS, your frames are constantly alternating back and forth between 20fps (50ms/frame) and 40fps (25ms/frame).
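If you'd rather poke at the 'one hump vs. two humps' idea yourself than take my word for it, here's a rough Python sketch. The numbers are made up to match the example above, not measured from any real game:

Code:
import random
from collections import Counter

# Hypothetical frametimes (ms) for one second of gameplay; both sets average out
# to roughly '30fps on the meter'.
no_ms   = [random.gauss(33, 2) for _ in range(30)]       # clustered around 33 ms
with_ms = [25 if i % 2 == 0 else 50 for i in range(30)]  # alternating 25 ms / 50 ms

def histogram(frametimes, bin_ms=5):
    # Count how many frames land in each 5 ms bucket -- the 'humps' in the graph.
    return Counter(int(t // bin_ms) * bin_ms for t in frametimes)

print(sorted(histogram(no_ms).items()))    # one cluster, near 30-35 ms
print(sorted(histogram(with_ms).items()))  # two clusters, at 25 ms and 50 ms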

Perception vs. Reality (or: Some Good News at Last!)
In both cases (with or without MS), your AVERAGE over time is still calculated by FRAPS as being 30fps, but your PERCEPTION of how the scene looks is very different.

In the 'microstuttering condition', you 'sense' the scene as running at the slower of the two recurring values, in this case, 20fps. Not only that, but your brain can pick up on a dramatic, rapid fluctuation in FPS like a constant flipping from 20 to 40fps (repeatedly halving/doubling), and it produces a vague feeling of discomfort when you view a scene that's doing that.

I Thought You Said 'Good News'?
Now, fortunately, a player's PERCEPTION of microstutter should (meaning: it works this way for most people, but not all) be alleviated at higher framerates. This is because the DELTA (40-20 = 20 fps) between the two alternating framerates in a microstutter condition should stay roughly the same when fps goes up.

So if you lower the settings on the same game, such that you're running at 60fps, you will be looking at half your frames at 50fps, and half your frames at 70fps.

So, while you still technically HAVE the microstutter at 60fps, if we extrapolate from the earlier analysis, we conclude that we'll perceive that 60fps average as though it were 50fps ... MUCH more playable than 20fps ... plus it's much harder for your brain to pick up on a rapid fluctuation between 50fps and 70fps so most people won't get that feeling of discomfort I mentioned above.
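To put some rough numbers on that, using the same simplifying assumptions as above (the two alternating rates stay about 20fps apart, and the scene 'feels' like the slower of the two):

Code:
# Rough illustration only: assumes microstutter alternates between two framerates
# about 20 fps apart, and that the scene 'feels' like the slower of the two.
def microstutter_feel(average_fps, delta_fps=20):
    slow = average_fps - delta_fps / 2
    fast = average_fps + delta_fps / 2
    return slow, fast

for avg in (30, 45, 60):
    slow, fast = microstutter_feel(avg)
    print(f"meter says {avg} fps -> alternating {slow:.0f}/{fast:.0f} fps, feels like ~{slow:.0f} fps")
# The higher the average climbs, the less that fixed 20 fps gap matters.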

So, bottom line: MS will generally become imperceptible (NOT 'immeasurable') as FPS gets up into the range where we normally like to play games (i.e. 60fps).

And the good news is that HAVING SLI puts you exactly in the right position to actually be ABLE to get those framerates up into the 60fps range, right?


Some Caveats/Further Explanation (if you're a total glutton for details)
In the interest of accuracy, I should point out that it's by no means impossible for someone to 'perceive' that a game is exhibiting a pattern of microstutter at 60fps or even at much higher FPS rates. Anytime frametimes are in constant fluctuation/uneven, there is a chance that it will be 'noticeable' ... but it seems to be the case that only a small % of people are sensitive to it, and even fewer are actually bothered by it.

From what I've read re: microstutter over the years, it seems as though any time anyone makes any broad generalizations about 'what causes' microstutter or 'when it's going to happen' or 'when it will be noticeable', there are disagreements amongst people who are well-versed about the subject.

But besides the generalization I shared above, a couple other ideas that *seem* to be fairly-widely accepted (but NOT 100%, because NOTHING is when it comes to this subject) are these:

1) The closer your cards are to being 'maxed out' in terms of their capabilities in a given scenario, the more likely it becomes that microstutter will become noticeable.

2) This also means that effective framerate limiters (including v-sync) can definitely be helpful in reducing the likelihood of visible microstutter, because if they are 'kicking in', by definition that means that you are reserving additional gpu headroom and not maxing out the cards.

3) In a somewhat-related vein, it turns out that the more wildly overpowered your CPU is in comparison to the GPUs you have in SLI, the more likely it will be that you will experience noticeable microstutter, esp. if you don't use a framerate limiter of some kind.

And in a somewhat amusing twist, being CPU-bottlenecked actually helps reduce microstutter, as it's another type of framerate limiter that keeps the cards from getting 'tapped out'.


4) The proper contents of the SLI profile for a given game, in particular the 'compatibility bits' property, are critical to reducing the effects of microstutter. So if you're using SLI profiles not specifically provided by nVidia to enable SLI on a given game (such as the various profile/game renaming or re-associating tricks, or making your own custom profiles with nVInspector, etc), the chances that you'll get visible microstutter will go up.

5) It's possible that certain 'scenes' or 'happenings' in games like explosions or complex scenes that cause the cards to suddenly struggle could cause microstutter to become noticeable.

This is for two reasons: One is because of the first reason I stated above. The Second is that, interestingly, the driver itself (partly by means of the 'compatibility bits' property of the SLI profile mentioned in 4) above) actually attempts to ameliorate the effects of microstutter. In order to 'time' the presentation of frames more consistently, it relies on predictive algorithms based on a small sample of the frametimes that immediately preceded the frames it's presenting 'now'.

But when FPS suddenly drops, the rearward-looking samples it's keeping become 'unrepresentative', so the driver loses its ability to predict how long it should 'wait' to present the next frame in order to maintain consistent frametimes. This will be the case for a short time, until the current batch of 'past-tense' samples returns to being more representative/predictive of the future (if that makes sense ... sorry if that's technical and boring, but I found it interesting to find out; there's a rough sketch of the idea just below this list.)

However, because things like big explosions also inherently involve framerate drops as well, it's hard to judge how much of 'the effect' one perceives is MS, and what's just the framerate dropping.
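I have no inside knowledge of how nVidia's frame metering actually works, but purely as an illustration of the 'predict the wait from recent frametimes' idea described in 5) above, a toy version might look like this in Python (the class name, window size, and all the logic are my own invention):

Code:
from collections import deque

class ToyFramePacer:
    """Illustrative only: delay presenting each frame toward the recent average
    interval, so the output looks more even. NOT how any real driver is
    documented to behave."""
    def __init__(self, window=8):
        self.recent = deque(maxlen=window)   # last few frametimes, in ms

    def target_interval(self):
        return sum(self.recent) / len(self.recent) if self.recent else 0.0

    def present(self, frametime_ms):
        wait = max(0.0, self.target_interval() - frametime_ms)
        self.recent.append(frametime_ms)
        return frametime_ms + wait           # when the frame actually hits the screen

pacer = ToyFramePacer()
for t in [10, 30, 11, 32, 9, 35, 200, 10, 30]:   # the 200 ms frame = a sudden fps drop
    print(f"rendered in {t:3d} ms -> presented after {pacer.present(t):6.1f} ms")

Notice what the 200 ms spike does: it poisons the 'recent history', so the frames right after it get delayed far too long. That's the 'unrepresentative samples' problem in a nutshell.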

Final Notes, and What To Do if You Have It:
The whole problem, it's simple: it's only about the game ... and all your settings in the game ... and where you are in the FPS scale ... and the driver ... and the SLI profile settings ... and your gpus ... and your cpu ...
basically it's just the way EVERYTHING involved in the gaming experience in SLI comes together. There are almost no things that are 'consistent' with this phenomenon, and the issues involved with it share traits with many OTHER issues ... so really, you're almost better off not even worrying about trying to figure it all out.

If you're having the symptoms of MS with a game in SLI, and only in SLI ... try v-sync, OC'ing your gfx card, and lowering settings. If the game has a frame-rate limiter built-in, perhaps through a config file or console command, try using it.

And if those don't work ... you can just turn off SLI for that game ... OR you can try out SLI AA, which disables regular SLI and lets both cards focus on producing a crazy level of AA ... and produces NO microstutter. Plus you can set SLI AA in a gaming profile so you don't have to manually turn SLI on and off when you play games that you don't want to run in SLI due to microstutter (or due to the lack of SLI support for the game).

And remember that in the majority of games, 30FPS on a single card (or with SLI AA) usually feels/looks about as smooth as 45 on a SLI setup does anyways.


If you read this far, thanks for reading, and I hope you gathered something you didn't know before.


And feel free to use this post as a reference for people who ask you about microstutter in the future ... esp. if they're not really easily bored.
Good start!
+rep

Actually, I suggest you do some editing and re-organize the material. Right now it is like a wall of text and is really not very user-friendly.


Add all those graphs and tables that you have made in the past on this subject and put everything in a nice orderly post. This could be used as a Sticky.

Also, I think you have written a (sort of) step-by-step guide (with numbers in Excel or exporting those numbers to generate a graph) for people to check for microstutter. Put that in here too.
Quote:
Originally Posted by windfire;13119791
Good start!
+rep

Actually, I suggest you do some editing and re-organize the material. Right now it is like a wall of text and is really not very user-friendly.


Add all those graphs and tables that you have made in the past on this subject and put everything in a nice orderly post. This could be used as a Sticky.

Also, I think you have written a (sort of) step-by-step guide (with numbers in Excel or exporting those numbers to generate a graph) for people to check for microstutter. Put that in here too.
Thanks WF. Consider it a work in progress. And it's not a Wall O' Text ... my paragraphs are all nice and short


I'll try to get some graphs in here, and an explanation of how to test for it.
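In the meantime, here's the basic idea of the test as a Python sketch. It assumes a FRAPS-style 'frametimes' log where the second column is the cumulative time (in milliseconds) at which each frame was rendered; double-check your own log's layout, since I'm going from memory, and the filename here is just a placeholder:

Code:
import csv

def load_frame_deltas(path):
    # Read a FRAPS-style frametimes CSV (frame number, cumulative ms) and return
    # the per-frame render times in milliseconds.
    cumulative = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                          # skip the header row
        for row in reader:
            cumulative.append(float(row[1]))  # cumulative milliseconds column
    return [b - a for a, b in zip(cumulative, cumulative[1:])]

deltas = load_frame_deltas("frametimes.csv")   # placeholder filename
mean = sum(deltas) / len(deltas)
spread = (sum((d - mean) ** 2 for d in deltas) / len(deltas)) ** 0.5
print(f"average frametime {mean:.1f} ms, standard deviation {spread:.1f} ms")
# A big standard deviation relative to the average -- especially in a repeating
# short/long/short/long pattern -- is the statistical footprint of microstutter.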
Very good explanation! +rep
This sums up why I'm never going to go SLI/CF again unless I have the best single GPU of that generation.

I don't get why a CPU bottleneck is good for less microstutter; why would it create more headroom on the GPUs? Most CPUs today can put out a lot more than 60 fps in most games.
Quote:
Originally Posted by Arni90;13119956
Very good explanation! +rep
This sums up why I'm never going to go SLI/CF again unless I have the best single GPU of that generation.

I don't get why a CPU bottleneck is good for less microstutter; why would it create more headroom on the GPUs? Most CPUs today can put out a lot more than 60 fps in most games.
I'm not going to claim that I totally understand everything I've read, but the general idea I've gathered is that if you're at 30fps due to a framerate cap (with tons of extra headroom on the cards) then MS won't be very dramatic in effect. But if you're at 30fps because the cards are just tapped out, then it's really likely you'll see it.

So basically if you accept the above notion, then you'll understand why an in-game framerate limiter would help and hence you'll understand why a CPU bottleneck would help. Because they produce the same net effect.
+rep

Let's just hope a certain someone doesn't come and try to crap in this thread.
hm... that probably explains why I've never experienced it...

since I always use Vsync...

and since I have an AMD CPU with 2x 480s... my CPU is also limiting my fps...

nice to know...

also I suspect it's because I have only recently come into the SLI scene...

and the drivers have gotten a lot better over time...

+Rep brettjv...

as always... very informative and helpful
No microstutter: Low standard deviation of time between frames displayed. The time between frames being displayed does not vary much (i.e. 33ms, 29ms, 31ms, 33ms, 32ms).

Microstutter: Higher standard deviation of time between frames. The time between frames being displayed varies a lot more (i.e. 20ms, 53ms, 35ms, 21ms, 55ms).

Our eyes perceive smoothness based on how consistently images are displayed.
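For what it's worth, you can plug those example numbers straight into Python and see the difference being described:

Code:
from statistics import mean, stdev

smooth  = [33, 29, 31, 33, 32]   # the 'no microstutter' example, in ms
stutter = [20, 53, 35, 21, 55]   # the 'microstutter' example, in ms

for name, times in (("smooth", smooth), ("stutter", stutter)):
    print(f"{name}: mean {mean(times):.1f} ms, std dev {stdev(times):.1f} ms")
# Both sets average out to a broadly similar frametime (and FPS reading), but the
# second set's deviation is roughly ten times larger.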
Quote:
Originally Posted by Riou
No microstutter: Low standard deviation of time between frames displayed. The time between frames being displayed does not vary much (i.e. 33ms, 29ms, 31ms, 33ms, 32ms).

Microstutter: Higher standard deviation of time between frames. The time between frames being displayed varies a lot more (i.e. 20ms, 53ms, 35ms, 21ms, 55ms).

That is statistically exactly correct my friend


And although personally I'd express it as low/high standard deviations in the 'elapsed time to render each frame' ... I know we both mean the same thing


Also, a definition based on standard deviation alone doesn't express the inherently cyclical nature (the repeating short/long/short/long 'times to render') that characterizes the phenomenon


Quote:
Originally Posted by Riou
Our eyes perceive smoothness based on how consistently images are displayed.

There's that problem, and then there's also the issue where we perceive framerate based on the lowest instantaneous framerates over a period of time. So if the scene constantly alternates between 25ms/frame (40fps) and 50ms/frame (20 fps), i.e. a general depiction of microstutter at 30fps, it will appear to us like it's running at 20fps.
This (interesting) thing made me wonder: both my brother and I game intensively, but there are some differences.

He's 26, generally better at FPS/RTS or anywhere speed and hand/eye coordination are an issue, and pretty sensitive to MS and to lower-than-normal framerates in general.

I'm 34, generally better at RPG/TB games or where speed and hand/eye coordination are less important, worse at FPS/RTS, and less sensitive to MS and to lower-than-normal framerates in general.

Meaning, while I may find a certain resolution/framerate playable, he can't play at the same settings on the same computer without lamenting MS and having his performance seriously degraded.

Now I wonder, why such different perceptions? Are they due to the 'Different Human Hardware' we are sporting?
Quote:
Originally Posted by Mafia2020;13128093
This (interesting) thing made me wonder: both my brother and I game intensively, but there are some differences.

He's 26, generally better at FPS/RTS or anywhere speed and hand/eye coordination are an issue, and pretty sensitive to MS and to lower-than-normal framerates in general.

I'm 34, generally better at RPG/TB games or where speed and hand/eye coordination are less important, worse at FPS/RTS, and less sensitive to MS and to lower-than-normal framerates in general.

Meaning, while I may find a certain resolution/framerate playable, he can't play at the same settings on the same computer without lamenting MS and having his performance seriously degraded.

Now I wonder, why such different perceptions? Are they due to the 'Different Human Hardware' we are sporting?
Microstutter greatly influences the person's input due to hand/eye coordination. First-person shooters are greatly affected because the more fluid the motion is, the better one's ability to line up those headshots. You may have all the skill in the world, but microstutter is like moving a sensitive mouse over a gravel road. A similar effect happens to a lady applying makeup in a bouncy car. In one case, that lined-up headshot ends up being 3ft off as the "stutter" occurs; the other makes you look like a clown. I will leave you to figure out which is which.


I would also wager that it may have to do with being a decade apart in age. You are beginning to slow down in your responses, even if it's only mentally. Judging by your preferred gaming choices, I wouldn't hesitate to say that your taste for slower gaming has increased: more strategy-oriented games like Civilization 5, and more slow-paced RPGs such as Baldur's Gate II.

I myself feel pulled away from playing FPS games heavily, and have moved on to some of the games mentioned above.


@brettjv

I may not know some of the technical stuff involved, but I do have some helpful advice to add to the subject matter. It also applies to troubleshooting as well.

Another way to avoid microstutter introduced by multiple GPUs is planning ahead. Avoid buying different vendors' cards, or cards at different times (spread out far enough that there is a good chance your card is no longer manufactured), or mixing CFX/SLI reference GPUs with non-reference ones. I have personally experienced my share of fail setups last generation on AMD's side, primarily due to mixing and matching (well, poor drivers too). No matter what anyone says about how you can CrossfireX / SLI x, y, and z cards, the most compatible combination is always the exact same identical cards.

Another helpful thing is increasing/overclocking the inter-system communication, such as AMD's CPU-NB frequency or Intel's Uncore frequency. Having these high and stable ensures that most of the bottleneck in your system goes right back to the performance capabilities of the GPU. It does wonders for performance and greatly aids in reducing any bottleneck-style microstutter.

Furthermore, multi-GPU setups with more than 2 cards (from the results I have seen) are an area where QPI frequency and HT Link frequency overclocking can actually show some noticeable improvements in minimum FPS and microstuttering. Don't expect miracles, and 9 times out of 10 it can hold back CPU overclocking, but a few extreme systems utilizing 3 or 4 GPUs have seen a much smoother experience while utilizing it, which in general means reduced microstutter.

Seeing as this has been brought up by me and a few others in countless threads: having an overclock on your system that is not Prime95/IBT stable can cause microstutter. Some people have the misconception that if you game with an unstable overclock, the only symptoms are BSODs or crashes-to-desktop. This is in fact incorrect. Much like Prime95 will only stop one core when you have an error, a game will hiccup and jump to another core, or the error will be caught and the core continues, if it's programmed that way; these new multi-core games are optimized to cope if such an error occurs. Just think about the constant stream of data that is sent to your GPU to be rendered, and think about every tiny rounding error that can occur in that data. It is not any different than finding multiple errors in Prime95; you just get a different type of feedback. The primary reason for poor game performance, when all other avenues have been exhausted, is an unstable overclock on the GPU/CPU/memory.

In addition to unstable overclocks that Prime95 CPU testing misses: these newer generations of cards have ECC memory onboard. I believe the 58xx cards on the AMD side and the 5xx cards on the nVidia side all have GDDR5 ECC memory. That's a good thing; however, every time you have an unstable overclock, you can see stuttering or hiccups in performance, primarily due to the errors having to be fixed instead of being 100% correct to begin with. All in all, properly stress testing your overclocks is the biggest favor you or anybody else can do for your gaming.

Then there is this trick: the Full-Screen, Window Mode, Full-Screen trick. In some games, despite it appearing full screen, the GPU will render as if it's in a window, which in general means utilizing less performance from your GPUs. Thus, despite the game loading normally (Crysis was notorious for this), it is actually running half-assed. The quick trick is to ALT+ENTER while an FPS counter is visible (similar to video fullscreen mode), but that doesn't always function in games. The alternative is to activate Window Mode (or disable Full Screen mode), then set it back to Full Screen mode. It may even require doing this a few times until performance is running normally in full screen. Your best aid is a GPU usage monitor with a frame counter. MSI Afterburner's setup with On-Screen Display is probably the nicest one available, relatively easy to use, and FREE. My favorite.

Finally, the last trick I have for handling multi-GPU microstutter, and stutter in general, is divided by manufacturer:

ATI -> Dealing with CrossfireX sometimes requires that you adjust Catalyst A.I. from Standard to Advanced, and vice versa, and of course disabling and re-enabling CrossfireX.

nVidia -> Dealing with SLI is a bit easier; it's simply a matter of turning it on or off to troubleshoot. But under advanced 3D settings, if you are on a single monitor, you should have the performance option set to Single Monitor and Power Management set to Prefer Maximum Performance, as opposed to Adaptive. These are probably the most overlooked features in any issue when dealing with nVidia, although why the default is Multi-monitor and Adaptive with one monitor plugged in is a bit silly. Also, good results can be had by adjusting the preferred rendering mode; sometimes the profiles for each game are just fubared, and rather than waiting weeks for an update, take a hands-on approach and try different rendering modes, such as AFR2 instead of AFR1 for example.

Learning about nVidia Inspector also greatly expands the rendering options you can change, and can allow you to copy profiles you know work to similar games (i.e. Unreal Tournament 3 to other UT3-based games). I am glad you touched on the subject matter.

All in all, good thing you took the time to write it all up.

+Vote For Sticky
Good thread brettjv. I've had far more than my fair share of microstutter in the past on my old GTX 470 SLI setup. I am very sensitive to microstutter; I even see it in movies and stuff! I have a keen eye for it! Some people are very sensitive to microstutter, such as me. Most people won't even notice it unless it's really bad.
SLI Discussion, Part Two

Can Microstutter Really Not Be Avoided?
On a fundamental level, a significant part of the reason that microstutter exists is because nVidia and AMD favor the Almighty FPS over the 'total gameplay experience'.

And this is, arguably, rightly so, because the FPS is by far the highest priority in the decision-making process when people choose video cards. No doubt this is at least in part due to this statistic being the major focus of all online graphics card reviews.

If this were NOT the case, it would be entirely possible for nV and AMD to add a selection to their control panels that would allow the user to choose to 'Favor Gameplay Smoothness', and thus force the output of the frames from their multi-GPU setups to adhere to (very close to) the same 'evenness of frame-presentation timing' that is seen on a single-GPU setup.

However, if they were to do this, reviewers would inevitably begin to USE this selector when doing comparisons, arguing (and rightly so) that the 'true' scaling should take into account 'gameplay smoothness'.

Unfortunately (for nV/AMD), the deployment of such a 'smoothing' option, instead of enticing consumers with promises of e-peen enlargement and the highly-revered '100% scaling' (that sells people on 2nd graphics cards), would produce SLI scaling that is more on the order of 20-40% ... not the kind of numbers that convince people that an additional +100% investment is truly one worth making.

So ... safe to say ... we aren't going to be getting any 'SLI - Favor Gameplay Smoothness' option any time soon, despite the fact that it's theoretically very do-able.

Is Great SLI Scaling Actually As Great as It Seems - I Wonder?
Allow me to introduce a theory of mine: while we revel in the thought of '100% scaling' and think of it like The Holy Grail, the more I've researched the issue of microstutter, the more I've become convinced that there's at least a good chance that the following is generally true:

The better the 'SLI Scaling' of any given game, the more likely it is that the frames are going to be unevenly synchronized in their display (i.e. the game will have measurable microstutter).

Now, as I mentioned before, getting the FPS up (such as SLI does) in the majority of cases will cause the 'perceived' framerate to more closely match the 'displayed' (by fraps) framerate. So, in a way, great SLI scaling will produce something along the lines of a 'wash'.

However, for SOME gamers who are sensitive to it, there is an inherent discomfort in viewing games where the instantaneous framerate is in a constant state of cyclical fluctuation. Thus, I believe that such people would find that the best 'scalers' are actually the most unpleasant to look at on a multi-GPU setup.

Some Great Reward
So ... before I bore everyone to tears (again), I'm going to offer up a '1000 Internets' Reward to whoever can come up with the best guess as to why I suspect that better scaling = more measurable microstutter.

Conversely, if someone can convince me of why the OPPOSITE should be true, or even why 'scaling' should have NOTHING to do with microstutter ... I will also proffer the aforementioned Reward
Quote:
Originally Posted by brettjv
Some Great Reward
So ... before I bore everyone to tears (again), I'm going to offer up a '1000 Internets' Reward to whoever can come up with the best guess as to why I suspect that better scaling = more measurable microstutter.

While I am just guessing, I would anticipate that logic would explain this simply: a more complicated system creates a more complicated data path.

I would suppose a single GPU has the undivided attention of a single CPU, and essentially all of its peripheral bandwidths.

As it is now, adding a second GPU would cut into the CPU's attention to a certain degree. Even if it was just a single core, that's one less core the primary GPU has access to. That would, I suppose, just be scratching the surface of the problem, since each core is part of a family, sharing resources such as L2/L3 caches depending on which bank this particular core comes from. We also have to load from the system RAM via the chipset, and of course the uncore frequency, all being loaded into the system from a hard drive.

However, the system generally works fine with one GPU, so I would really say the limitations occur in the architecture of the system on a basic level: everything must go through a single device, the CPU. It is an inherent bottleneck, in its most basic form.

All of that aside, from an engineering standpoint, I would anticipate that nVidia would know this, and would indeed know how to make a GPU far more efficient at coping with a system, but how would you market it? The PC user is so familiar with measuring performance by max/avg FPS that it might even be a hard sell. How do you measure something like this when it's about feel? Perhaps more detailed frame measurement. Such a GPU may be designable, but it's quite possibly not cost-effective for comparable performance versus modern GPUs.

From a marketing standpoint, the dual-GPU setup is a money maker. Three and four GPUs are still very niche, and not many people will go right off the bat paying the high premium of 3 new-generation GPUs. However, dual-GPU is gaining a hell of a lot of ground in the consumer/enthusiast market.

Quote:
That new high-end GPU is great and all, but two GPUs have gains of 100%!!!!! Games not performing as well as you expected? Throw in another GPU!

Why make a single GPU that plays 40 fps well, when you can sell 2 at the same high premium price?
Quote:
Originally Posted by brettjv
I've decided to remove this from the fail thread that it was on.

Here's my dissertation about the topic of microstutter, if anyone is interested


What Does It Look Like?
Microstutter manifests in the following way: you know how sometimes while gaming you'll hit a graphically difficult section, and you can just 'feel' that your fps has dropped dramatically, and you look at your FPS meter and discover that, indeed, you've dropped from 60fps to 25fps?

My microstutter happened at higher speeds, as it has for many on this forum and on Google.
Quote:
Originally Posted by brettjv
You know the sensation ... you get that sort of 'stuck in molasses' feel, everything gets laggy: your movement, changing your view, everything you do no longer produces an instant response?

Well, when you have microstutter, you get that effect, even though the FPS on the meter shows a high-enough FPS that you 'shouldn't' be getting that sensation.

It depends on the game; some microstutter just flickers and doesn't actually slow fps. It's a mistiming of rendered frames, but that doesn't necessarily mean a decrease in performance will result. Continue on.

Quote:
Originally Posted by brettjv

For example, 30fps (w/o microstutter) in the large majority of games generally feels pretty smooth; it's totally 'playable'. The Xbox runs at 30fps, and it usually feels quite responsive, right?

Well, on a game that's exhibiting microstutter, 30fps can end up feeling like around 20fps. And at 20fps, everything feels all laggy and slide-showy.

Microstutter doesn't always feel slow. It's a stutter, not a laggg or system hang.
Quote:
Originally Posted by brettjv
Behind the Scenes
On a setup without microstutter, let's say you analyzed the actual performance during a given second of gaming when your FPS meter is showing your game at 30fps.

And by 'analyze', I mean you study the 'time to render' for each of the 30 frames that were rendered over the course of that second (I'll refer to these values as 'frametimes' moving forward, although that's not the 100% accurate definition of the term).

In this analysis, you'd find that the large majority of frametimes would be 'huddled around' the value of .033 seconds, or 33 milliseconds.

(note that I'm saying here that an alternate 'unit of measure' for expressing 30 FPS is 33ms per frame, because 1/30 = .033 seconds = 33 milliseconds ... does that all make sense?)

Now, there'd be some frametimes at other values besides 33ms of course, as this is an average over a whole second, but if you graphed out the count (Y = Value Count, X = Frametime) of each value of frametimes for those 30 frames, you'd see a normally-distributed (bell-shaped) curve huddled around the mean value of 33ms. You'd get a 'single-humped camel back' looking graph.

However, when microstutter is affecting you, what you'd see instead is a consistent, cyclical variation in the 'time to render' of each frame. Frame 1 = 25 ms (40fps), frame 2 = 50ms (20fps), Frame 3 = 25ms, Frame 4 = 50ms, etc, over and over again during the course of that second.

Thus, if you plotted out the counts of each frametime value, instead of the 'single-humped camel back' graph you'd get w/o microstutter, you get a 'double-humped camel back' graph when you have microstutter. The count of each of the frametime values would be clustered around two separate and distinct mean values.

You lost me with all those numbers and such ... can you simplify it?
To put it even more simply, the two scenarios can be summed up like this: without MS, ALL your frames are actually being rendered at right around 30fps (33ms/frame), whereas with MS, your frames are constantly alternating back and forth between 20fps (50ms/frame) and 40fps (25ms/frame).

Perception vs. Reality (or: Some Good News at Last!)
In both cases (with or without MS), your AVERAGE over time is still calculated by FRAPS as being 30fps, but your PERCEPTION of how the scene looks is very different.

In the 'microstuttering condition', you 'sense' the scene as running at the slower of the two recurring values, in this case, 20fps. Not only that, but your brain can pick up on a dramatic, rapid fluctuation in FPS like a constant flipping from 20 to 40fps (repeatedly halving/doubling), and it produces a vague feeling of discomfort when you view a scene that's doing that.

I'd love to see the sources for all that.

Yes, FPS can be measured in time. However, where has the camel-hump theory been shown?
http://www.pcgameshardware.de/aid,63...fikkarte/Test/

Here the disclosed problem starts: multi-GPU systems often do not succeed in displaying regularly distributed frames; there are large time intervals between individual images - "micro stuttering" - despite supposedly fluid frame rates. One example: frame 1 is followed by frame 2 after 10 milliseconds, which corresponds to a cadence of 100 fps. Frame 3, however, is only seen 40 milliseconds afterwards, followed by frame 4 after 10 ms.

Your camel-hump theory would suggest symmetrical timings. However, the facts from this website show otherwise, which is also more similar to the "microstutter" issue I was having: a sudden POP or BURST forward.

Same here, it's not a slowdown in fps, it's a stutter...




This one is kind of both, looks low fps and a stutter.




Quote:
Originally Posted by brettjv
I Thought You Said 'Good News'?
Now, fortunately, a player's PERCEPTION of microstutter should be alleviated at higher framerates. This is because the DELTA (40-20 = 20 fps) in the two alternating framerates in a microstutter condition should roughly stay the same when fps goes up.

(according to your camel hump theory)
Quote:
Originally Posted by brettjv

So if you lower the settings on the same game, such that you're running at 60fps, you will be looking at half your frames at 50fps, and half your frames at 70fps.

So, while you still technically HAVE the microstutter at 60fps, if we extrapolate from the earlier analysis, we conclude that we'll perceive that 60fps average as though it were 50fps ... MUCH more playable than 20fps ... plus it's much harder for your brain to pick up on a rapid fluctuation between 50fps and 70fps so you won't get that feeling of discomfort I mentioned above.

So, bottom-line MS will generally become imperceptible (NOT 'immeasurable') as FPS gets up into the FPS range where we normally like to play games (i.e. 60fps).

And the good news is that HAVING SLI puts you exactly in the right position to actually be ABLE to get those framerates up into the 60fps range, right?


Some Caveats/Further Explanation (if you're a total glutton for details)
In the interest of accuracy, I should point out that because the frametime fluctuations can be even larger than in my example above, it's by no means impossible to be able to 'see' microstutter at much higher FPS rates, because anytime that frametimes are in constant fluctuation/uneven, there is a chance that it will be 'noticeable', esp. to certain people who are sensitive to it.

I'm happy you added this, since 3 days ago it was only at low fps...
Quote:
Originally Posted by brettjv

TBH, from what I've read re: microstutter over the years, it seems as though anytime one makes any broad generalizations about 'what causes' microstutter or 'when it's going to happen' or 'when it will be noticeable', there are disagreements amongst people who are well-versed about the subject.

But besides the generalization I shared above, a couple other ideas that *seem* to be fairly-widely accepted (but NOT 100%, because NOTHING is when it comes to this subject) are these:

1) The closer your cards are to being 'maxed out' in terms of their capabilities in a given scenario, the more likely it becomes that microstutter will become noticeable.

2) This also means that effective framerate limiters (including v-sync) can definitely be helpful in reducing the likelihood of visible microstutter, because if they are 'kicking in', by definition that means that you are reserving additional gpu headroom and not maxing out the cards.

3) In a somewhat-related vein, it turns out that the more wildly overpowered your CPU is in comparison to the GPUs you have in SLI, the more likely it will be that you will experience noticeable microstutter, esp. if you don't use a framerate limiter of some kind.

I explained that I had microstutter on a Core i7 920 and dual 8800 GTs, and you said I didn't know what microstutter was. Now it's more likely on an overpowered CPU system? My point exactly. I'll save it for the end.
And in a somewhat amusing twist, being CPU-bottlenecked actually helps reduce microstutter, as it's another type of framerate limiter that keeps the cards from getting 'tapped out'


4) The proper contents of the SLI profile for a given game, in particular the 'compatibility bits' property, are critical to reducing the effects of microstutter. So if you're using SLI profiles not specifically provided by nVidia to enable SLI on a given game (such as the various profile/game renaming or re-associating tricks, or making your own custom profiles with nVInspector, etc), the chances that you'll get visible microstutter will go up.
I found just the opposite to be true. Obviously if you are set to nvidia recommended and still experience "microstutter", try something else.

In America's Army 3, I was able to completely eliminate microstutter by changing it to alternate frame render 1. However, I was then getting 30fps on average compared to 40-60 fps using a single card, and 90-100fps in regular (microstutter) sli.

In theory you can test each game with different nVidia settings, which is why I posted on a related thread...
Quote:
Originally Posted by brettjv
5) It's possible that certain 'scenes' or 'happenings' in games like explosions or complex scenes that cause the cards to suddenly struggle could cause microstutter to become noticeable.

This is for two reasons: One is because of the first reason I stated above. The Second is that, interestingly, the driver itself (partly by means of the 'compatibility bits' property of the SLI profile mentioned in 4) above) actually attempts to ameliorate the effects of microstutter. In order to 'time' the presentation of frames more consistently, it relies on predictive algorithms based on a small sample of the frametimes that immediately preceded the frames it's presenting 'now'.

But when FPS suddenly drops, the rearward-looking samples it's keeping become 'unrepresentative', so the driver loses its ability to predict how long it should 'wait' to present the next frame in order to maintain consistent frametimes. This will be the case for a short time, until the current batch of 'past-tense' samples return to being more representative/predictive of the future-tense (if that makes sense ... sorry if that's technical and boring, but I found it interesting to find out
)

However, because things like big explosions also inherently involve framerate drops as well, it's hard to judge how much of 'the effect' one perceives is MS, and what's just the framerate dropping.

I found it to be certain textures. In TF2 it was mostly indoors. In Mafia 2 it was the leather jacket. If you have microstutter, put on the leather jacket in Mafia 2 and watch it stuttttterrrr, especially very close to buildings.

Quote:
Originally Posted by brettjv
Final Notes, and What To Do if You Have It:
The whole problem, it's simple: it's only about the game ... and all your settings in the game ... and where you are in the FPS scale ... and the driver ... and the SLI profile settings ... and your gpus ... and your cpu ...
basically it's just the way EVERYTHING involved in the gaming experience in SLI comes together. There are almost no things that are 'consistent' with this phenomenon, and the issues involved with it share traits with many OTHER issues ... so really, you're almost better off not even worrying about trying to figure it all out.

If you're having the symptoms of MS with a game in SLI, and only in SLI ... try v-sync, OC'ing your gfx card, and lowering settings. If the game has a frame-rate limiter built-in, perhaps through a config file or console command, try using it.

And if those don't work ... you can just turn off SLI for that game ... OR you try out SLI AA, which disables regular SLI and uses the 2nd card just to process crazy levels of AA ... and produces NO microstutter. Plus you can set SLI AA in a gaming profile so you don't have to manually turn SLI on and off when you play games that you don't want to run in SLI.

And remember that in the majority of games, 30FPS on a single card (or with SLI AA) usually feels/looks about as smooth as 45 on a SLI setup does anyways


If you read this far, thanks for reading, and I hope you gathered something you didn't know before


And feel free to use this post as a reference for people who ask you about Microstutter in the Future ... esp. if they're not really easily bored


Why not try the most common and successful solution? Updating/rolling back your drivers.

I updated my drivers and noticed a huge increase in microstutter, simply roll them back and wait for the next version.

Other solutions you can try include changing your frame rendering mode in NVCP, as I mentioned earlier.

You should try turning on maximum performance mode in NVCP.
Try changing the Multi-display/mixed-GPU acceleration
Also Maximum pre-rendered frames.

I even tried to overclock my PCI Express bus speed a little.

I suggest monitoring your cards' temps; make sure they aren't getting too warm and causing errors or thermal throttling. This can be confused with microstuttering.

Also check system memory. If you are maxing out your RAM and using a lot of page file, make sure it is the correct size and your hard drives are defragmented. This also can be mistaken for microstuttering.

And the first step in all troubleshooting is to disable possible conflicts. Go to Start, run msconfig, and then go to the Startup tab. Write down the checked items, and disable all. Then go to the Services tab, check the box that says "Hide all Microsoft services", write down the checked items, and disable all. Restart and attempt to game.

PS, you may need to leave some nvidia (video card driver) items checked. Usually those are just for control panels though.

In conclusion, I have experienced microstutter on a Core i7 920 with 2x 8800 GT, which did NOT have a microstutter issue on a Socket 939 Opteron 180 system (same 8800 GTs). This would confirm that the overpowered-CPU theory may be somewhat correct. Even attempting to pre-render 8 frames did not help, though, even when fps dropped considerably.

I would also suggest it may be a memory timing issue with older cards in newer systems. I just really don't think my stock 8800 GT memory can transfer data to my 1600MHz triple-channel RAM fast enough... just an idea.

My suggestion is to ask nVidia to come out with more rendering options in their NVCP. Maybe offset timings. I don't know.

I'm staying single-GPU until they figure it out. I'm not wasting any more time or money.

+1 rep for exposing microstutter
Wow, you actually can behave like a civilized contributor, FD. Well done.


First off ... the stutter in the Stalker video ... that is SO NOT microstutter. That's just plain 'stutter'. It happens in all three Stalker games, no matter what you do. It has to do with data loading as you move around. It's completely unfixable, and happens on any video card setup.

Just because people put up videos on youtube and say 'ZOMGlookattehMICROSTUTTERZ!!1!!' doesn't mean they know what the hell they're talking about. Microstutter is really not the kind of thing that shows up well in a video UNLESS you do a side-by-side video with the game on single card vs dual card, both at the same FPS, doing the same things.

And the difference between the two will be that the single card looks 'smoother' and less laggy. And even that is relatively hard to notice. MS is something that you almost *have* to actually feel by being the one playing the game.

The scenario I was describing with the 'two camel hump graph' would be what you'd see if you graphed out the COUNTS of the following sets of elapsed 'times to render', which is the most TYPICAL scenario characterizing microstutter:

10ms, 30ms, 11ms, 32ms, 9ms, 35ms, 10ms, 24ms, 11ms, 31ms, 13ms, 30ms, 11ms, 32ms, 9ms, 35ms, 10ms, 24ms, 11ms, 31ms, 10ms, 30ms, 11ms, 32ms, 9ms, 35ms, 10ms, 24ms, 11ms, 31ms, 13ms etc.

If you graph out the counts of these values (x: milliseconds to render, y: count of instances of x value), you'll find that you get two 'humps' in the graph, one centered around 10ms, and one around 30ms.

But you are correct if you're trying to point out that you could see 3 or even 4 'humps'. Here's an example of values that would provide 3 humps:

10ms, 20ms, 30ms, 9ms, 19ms, 29ms, 11ms, 21ms, 31ms, etc.

This would also be microstutter, but this is a less common pattern (esp. w/two cards in SLI), and not likely to be as noticeable as the first.

An example of a set of values that is NOT indicative of microstutter is this:
10ms, 10ms, 12ms, 30ms, 20ms, 20ms 25ms, 50ms, 10ms, 50ms, 49ms, 40ms, 600ms, etc.

There's always a PATTERN, in other words, not random fluctuations like these. And the 600ms at the end ... that's what a 'pop' or 'stutter' looks like.

So, in the most basic sense, microstutter involves some recurring pattern of unevenness in terms of 'time to render' of frames over a given period of time. If these render times are graphed, it will not be a 'normal' bell-shaped distribution that centers around one average value like a single card would do.
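If anyone wants to check that 'pattern vs. random noise' distinction numerically rather than by eyeballing a graph, here's one rough way to do it in Python using the example numbers above (the 0.75 cutoff is arbitrary, purely for illustration):

Code:
from statistics import mean

def looks_cyclical(frametimes):
    # Crude check: in a short/long/short/long pattern, consecutive frametimes land
    # on opposite sides of the average far more often than random variation would.
    avg = mean(frametimes)
    flips = sum((a - avg) * (b - avg) < 0 for a, b in zip(frametimes, frametimes[1:]))
    return flips / (len(frametimes) - 1) > 0.75   # arbitrary cutoff

two_hump   = [10, 30, 11, 32, 9, 35, 10, 24, 11, 31, 13, 30, 11, 32, 9, 35]
random_ish = [10, 10, 12, 30, 20, 20, 25, 50, 10, 50, 49, 40, 600]

print(looks_cyclical(two_hump))    # True  -- alternating pattern, i.e. microstutter
print(looks_cyclical(random_ish))  # False -- irregular stutter, not microstutter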

Now, I'm not going to go back and relive that despicable thread wherein you told me to 'go back to (fellating) my dog', and charmingly implied that the other members who were supporting my viewpoint were '(fellating) Brett' nor am I going to continue the debate about what was said.

But I will say this: microstutter is NOT characterized by a 'popping' effect. We're talking about a phenomenon where the timeframes involved are milliseconds. We're talking about things that are happening way too fast for the eyes to see anything like a 'pop' going on.

You said:
Quote:
its a stutter, not a laggg

And I'm sorry, but you are incorrect. Microstutter absolutely DOES feel like lag, it is NOT stutter, or stuttering. It is *primarily* manifested by the perception that you are playing at a framerate that is much lower than the 'displayed' framerate. It's a feeling like being stuck in molasses, where you have a slow onscreen response between mouse movement and screen movement.

I think that you are convinced that you 'have' microstutter (which you certainly may at times) and therefore you're experiencing OTHER anomalies as being 'part and parcel' of the 'microstutter'. But there really aren't a whole bunch of different ways in which microstutter is manifested.

This being said, aside from MS feeling like lag, it's also possible that someone might describe it as the game 'not feeling smooth' even w/o describing it as 'laggy' ... and that likely would be in the scenario where the framerate is very high (but frametimes are still varying in a dramatic and cyclical way). Maybe someone else might describe that same effect as the game being 'flickery', as that's kind of the same idea.

But any effect that could remotely be described as a 'pop' is just regular 'stutter'.

Now, it's certainly not impossible for one to get 'stutters' during periods when you are ALSO getting microstutter, but if 'the problem' you're getting in a particular game is one you'd describe as 'popping' ... the problem is NOT properly described as microstutter. That's why it's called microstutter.

And that is why I told you that you don't know what microstutter is, NOT because you said you have it at 400fps.

AFA your other suggestions go, sure, practically ANYthing is worth a try. For example, there's no reason to NOT try changing between the render modes they expose for you to play with. It's certainly possible that nVidia chose the render mode to put in the profile based on pure performance, in order to make their cards look good in published benchmarks. Switching render modes might, in rare cases, reduce microstutter.

But trying to make your own SLI profile based on creating a new, empty game profile in NVCP and playing around with the SLI Render Mode is nearly certain to produce unfavorable results. Reason being, you cannot set the 'Compatibility Bits' property through NVCP ... yet this is THE CRITICAL property of the SLI profile. It's the primary method by which the driver is customized to the particular traits of the game in order to make SLI work properly, especially with regards to proper frame synching.

A SLI profile consisting of nothing but the choice of one particular render mode ... is really not a SLI profile at all, and 99% certain to work much worse (or not at all) vs. a SLI profile created by nVidia for the particular game. I would submit to you that if you are in the habit of trying to make your own ad-hoc SLI profiles to override what nV has made, that may well be why you find installing a different driver helps the situation ... you may be restoring SLI profiles back to defaults. Just an idea.

Also ... seriously, man ... if putting on the leather jacket in Mafia 2 causes lag ... that's a freaking bug in the game dude. A specific texture should NOT be something that 'causes' microstutter. But I'll tell you what ... I've pointed you to instructions how to empirically test for microstutter, why don't you try TESTING this game section, and seeing if, indeed, the frametime values are behaving in a manner consistent with microstutter?

Everyone has problems with Mafia 2, esp. lag when you're indoors ... low fps, AND low gpu usage. If you don't have a good quality (8800GT or above) DEDICATED physX card in Mafia 2 ... there's a lot of parts that just plain don't run well. The game is coded so that Cloth physX always runs on the CPU unless you have a dedicated physX card ... and there's parts of the game with a LOT of cloth to calculate ... some of it you can't even see, but it still gets calc'd.

Lastly, certainly, until such time as nV and AMD decide to at least make it possible for the user to CHOOSE to favor an even frametime distribution over pure performance when running in SLI, I would suggest that anyone who is sensitive to the effects of microstutter steer clear of multi-GPU.
Quote:
Originally Posted by RagingCain
Microstutter greatly influences the person's input due to hand/eye coordination. First-person shooters are greatly affected because the more fluid the motion is, the better one's ability to line up those headshots. You may have all the skill in the world, but microstutter is like moving a sensitive mouse over a gravel road. A similar effect happens to a lady applying makeup in a bouncy car. In one case, that lined-up headshot ends up being 3ft off as the "stutter" occurs; the other makes you look like a clown. I will leave you to figure out which is which.


I would also wager that it may have to do with being a decade apart in age. You are beginning to slow down in your responses, even if it's only mentally. Judging by your preferred gaming choices, I wouldn't hesitate to say that your taste for slower gaming has increased: more strategy-oriented games like Civilization 5, and more slow-paced RPGs such as Baldur's Gate II.

Your thinking does make sense in a way, but I'd rather ascribe it to the fact that my brother and I are from different generations, considering the things I mentioned aren't anything recent; as far as I can recollect it has ALWAYS been like that.

Something like my brother having a higher sensitivity (or, the other way round, me having a lower sensitivity) to MS. Now, granted, perception is different for everyone, but why, while playing/watching a game, will SOME people detect MS and be bothered by it, while OTHERS will NOT detect MS unless it is blatant?
Quote:
Originally Posted by Mafia2020
Something like my brother having a higher sensitivity (or, the other way round, me having a lower sensitivity) to MS. Now, granted, perception is different for everyone, but why, while playing/watching a game, will SOME people detect MS and be bothered by it, while OTHERS will NOT detect MS unless it is blatant?

All I can say is that when microstutter is occurring at a LOW fps, it becomes obvious to anyone.

There is a definite, obvious sensation of 'lag', of the responsiveness of the game feeling significantly slower than what would be expected at the framerate that FRAPS is telling you that it is.

A STEADY 30fps SHOULD feel quite smooth. Not perfect, but not a lag-fest. But if you have microstutter, and in reality that 30fps is derived from a rapid fluctuation between (say) 10fps and 50fps on each and every frame (which is horribly bad MS), the game will look and feel laggy as all hell ... like it's running at 10fps. This sort of microstutter will be obvious to all.

However, when microstutter is occurring at a HIGH framerate, (and by occurring, I mean you can measure it empirically, and see the cyclical fluctuation in 'render times') ... that is the scenario where there's a lot of variability: some people are sensitive to it and experience a sense that something is 'off', that the game is 'not smooth' ... whereas it 'looks fine' to someone else.

It IS an interesting question though, what DOES make some people sensitive to the perception of high-fps microstutter, and some people ... not


Myself ... the only time I notice microstutter is when my FPS is low. I can TOTALLY tell that 30fps on my machine w/SLI disabled looks much smoother than 30fps on my rig in SLI, in the majority of games. But 60fps looks exactly the same to me. And this is how the majority of people are with regards to the whole thing.
Hm.

On the idea that the better the SLI scaling, the worse the potential microstutter...

The closer scaling becomes to 100%, the closer it becomes to each card delivering equal quantities of frames, alternately... just as the name 'Alternate Frame Rendering' would imply.

And here is where, I suspect, latency comes and rears its ugly head. It's not a bandwidth issue, as that is more than covered by PCI-E 2.0 and the SLI bridge (for which I still can't find a solid figure I actually believe for its actual bandwidth impact).

Now, this is purely for a multiple-cards-feeds-to-one-screen scenario, as I haven't seen microstutter on Surround, but I'm fairly sure that latencies will have a dual impact; firstly, on getting the frames rendered by the second/third/fourth card to the first card for output to the monitor, and secondly on the timing control/crosstalk between the cards and the driver.

Let's take a best-case setup, where the SLI bridge does all the work:

Frame 1 is rendered by Card 1; it is sent to the monitor with minimal latency, except that exhibited by the process of constructing the frame in the GPU, and sending it off via the DVI port.

Frame 2 is rendered by Card 2; it is rendered on the second card, and piped through the SLI bridge to the first card, which has to receive it, analyse where it should be slotted in amongst the frames it is already in the process of rendering, then it goes to the DVI output.

If, however, the first card has moved on past where that frame would render... what does the driver do? Does it just slot the frame in at the first available space? No, that would look terrible. So it just drops it, and relies on Card 1 to feed the monitor the next frame as fast as possible.

So you get a jerk in the fps as a frame vanishes into the ether. It is sub-second, so has no visible impact on reported framerate, but is evident if you're sensitive. I don't know whether it would even be reported as a dropped frame.

Now, obviously, that doesn't happen all the time. But in games that exhibit microstutter, it's a common enough occurrence for it to be noticeable to the viewer.

Now let's look at another potential case, and the one I favour, given that SLI does appear to work without a bridge... most of the time...

Frame 1 is rendered by Card 1; it is sent to the monitor with minimal latency, except that exhibited by the process of constructing the frame in the GPU, and sending it off via the DVI port.

Frame 2 is rendered by Card 2; it is constructed on the second card, and sent via the PCI-E bus to the PCI-E controller (on many modern systems, on the CPU) which analyses where it needs to be sent, and sends it on its way to the first GPU, where it is analysed, slotted into the queue appropriately, and rendered... or dropped.

Obviously, the second route is going to take a lot, lot longer... and greatly increases the chances that Card 1 will have moved on from where the frame actually needed to be.

Now, this plainly ignores the overheads involved in the two cards 'talking' - both to one another and to the driver - to remain in synchronicity. And I would imagine that that is a lot of overhead. Which, given that the PCI-E bus has to do other stuff quite a bit too, and to avoid too much load on the CPU, I would imagine is carried out over the SLI bridge. Which would explain why SLI without a bridge works... but not quite so well (in my experience, I see lower framerates) as with a bridge. And starts to exhibit problems when you're talking about the more high-powered cards. When the cards talk to the driver, they do it via the PCI-E bus. But to each other, once they know what they're doing? Over the SLI bridge, unless that is impossible.

...

Anyway, that mostly explains what I see in two-card-single-screen SLI (I've never tried three, but the situation should be similar... but worse). The driver tries to feed the second card only stuff that it has calculated won't need to be abandoned, as that is wasted time and effort. Which is why scaling isn't perfect in AFR - card 1 is rendering more frames than card 2. The closer scaling gets to perfect... the worse the potential becomes for dropped frames... particularly if the PCI-E bus has a lot of traffic, like with HDD access, audio, networking... all the stuff that happens when you game.

The way around this would be more tightly regulated communication between the cards - and what 3dfx used to do - Scan Line Interleaving; one card calculates the odd lines, one the even lines. This more evenly balances the load, but on modern high-power systems would have a fairly hefty latency impact, I'm thinking, to make sure they were always in sync.

...

Anyway, this is mostly just me thinking out loud. Or onto a keyboard...