
brettjv's Microstutter General Information Thread - Page 2

post #11 of 37
Quote:
Originally Posted by Mafia2020 View Post
This (interesting) thing made me wonder: both my brother and I game intensively, but there are some differences.

He's 26 y/o, generally better at FPS/RTS or anywhere speed and hand/eye coordination are an issue, and pretty sensitive to MS and to lower-than-normal framerates in general.

I'm 34 y/o, generally better at RPG/TB games or where speed and hand/eye coordination matter less, worse at FPS/RTS, and less sensitive to MS and to lower-than-normal framerates in general.

Meaning, while I may find a certain resolution/framerate playable, he can't play at the same settings on the same computer without lamenting MS and having his performance seriously depleted.

Now I wonder, why such different perceptions? Are they due to the 'Different Human Hardware' we are sporting?
Microstutter greatly influences a person's input because of hand-eye coordination. First-person shooters are affected the most, because the more fluid the motion is, the better one's ability to line up those headshots. You may have all the skill in the world, but microstutter is like moving a sensitive mouse over a gravel road. A similar effect happens to a lady applying makeup in a bouncy car. In one case, that lined-up headshot ends up being 3ft off as the "stutter" occurs; the other makes you look like a clown. I will leave you to figure out which is which

I would also wager that it may have to do with being a decade apart in age. You are beginning to slow down in your responses, even if it's only mentally. Judging by your preferred gaming choices, I wouldn't hesitate to say that your taste for slower gaming has increased: more strategy-oriented games like Civilization 5, and more slow-paced RPGs such as Baldur's Gate II.

I myself feel like I'm being pulled away from playing FPS heavily, and have moved on to some of the games mentioned above

@brettjv

I may not know some of the technical stuff involved, but I do have some helpful advice to add to the subject matter. It also applies to troubleshooting.

Another way to avoid microstutter introduced by multiple GPUs is planning ahead. Avoid buying cards from different vendors, or cards at different times (spread out far enough that there is a good chance your card is no longer manufactured), or mixing reference GPUs with non-reference ones in CFX/SLI. I have personally experienced my share of failed setups last generation on AMD's side, primarily due to mixing and matching (well, poor drivers too). No matter what anyone says about how you can CrossFireX/SLI cards x, y, and z, the most compatible combination is always two identical cards.

Another helpful thing is increasing/overclocking the inter-system communication, such as AMD's CPU-NB frequency or Intel's Uncore frequency. Having these high and stable ensures that most of the bottleneck in your system goes right back to the performance capabilities of the GPU. It does wonders for performance and greatly aids in reducing any bottleneck-style microstutter.

Furthermore, multi-GPU setups with more than 2 cards (from the results I have seen) are an area where QPI Frequency and HT Link Frequency overclocking can actually show some noticeable improvements in minimum FPS and microstuttering. Don't expect miracles, and 9 times out of 10 it can hold back your CPU overclock, but a few extreme systems utilizing 3 or 4 GPUs have noticed a much smoother experience while utilizing it, which in general means reduced microstutter.

Having seen this brought up by me and a few others in countless threads: an overclock that is not Prime95/IBT stable can cause microstutter. Some people have the misconception that if you game on an unstable overclock, the only symptoms are BSODs or crashes to desktop. This is in fact incorrect. Much like Prime95 will only stop one core when you have an error, a game will hiccup and jump to another core, or the error will be caught and the core continues, if it's programmed that way. These new multi-core games are optimized for exactly such an error occurring. Just think about the constant stream of data that is sent to your GPU to be rendered, and think about every tiny rounding error that can occur in that data. It is not any different from finding multiple errors in Prime95; you just get a different type of feedback. The primary reason for poor game performance, when all other avenues have been exhausted, is an unstable overclock on the GPU/CPU/memory.

In addition to unstable CPU overclocks that slip past Prime95 testing: these newer generations of cards have error detection on their memory. I believe the 58xx cards on the AMD side and the 5xx cards on the nVidia side all have error detection/retry (EDC) on their GDDR5 memory bus. That's a good thing; however, every time you have an unstable overclock you can see stuttering or hiccups in performance, primarily due to errors having to be corrected instead of the data being 100% correct to begin with. All in all, properly stress testing your overclocks is the biggest favor you or anybody else can do for your gaming.

Then there is this trick: the Full-Screen, Window Mode, Full-Screen trick. In some games, despite it appearing full screen, the GPU will render like it's in a window, which in general means using less of your GPUs' performance. Thus, despite loading normally (Crysis was notorious for this), it is actually running half-assed. One quick trick is to ALT+ENTER while an FPS counter is visible (similar to a video player's fullscreen mode), but that doesn't always work in games. The alternative is to activate Window Mode (or disable Full Screen mode), then set it back to Full Screen mode. It may even take a few tries until performance is back to normal in full screen. Your best aid is a GPU usage monitor with a frame counter. MSI Afterburner with its On-Screen Display is probably the nicest one available, relatively easy to use, and FREE. My favorite.

Finally, the last trick I have for handling multi-GPU microstutter and stutter in general is divided by manufacturer:

ATI -> Dealing with CrossfireX sometimes requires that you adjust Catalyst A.I. from Standard to Advanced, and vice versa, and of course disabling and re-enabling CrossfireX.

nVidia -> Dealing with SLI is a bit easier; to troubleshoot, it's simply a matter of turning it on or off. But under advanced 3D settings, if you are on a single monitor, you should set Multi-display/mixed-GPU acceleration to Single display performance mode, and Power Management to Prefer Maximum Performance instead of Adaptive. These are probably the most overlooked settings in any issue when dealing with nVidia, although why the defaults are multi-display and Adaptive with only one monitor plugged in is a bit silly. Good results can also be had by adjusting the preferred rendering mode; sometimes the profiles for certain games are just fubared, and rather than waiting weeks for an update, take a hands-on approach and try different rendering modes, for example AFR2 instead of AFR1.

Getting familiar with nVidia Inspector also greatly expands the rendering options you can change, and it lets you copy profiles you know work to similar games (e.g. from Unreal Tournament 3 to other games on the same engine). I am glad you touched on the subject matter.

All in all, good thing you took the time to write it all up.

+Vote For Sticky
Edited by RagingCain - 4/14/11 at 5:39pm
post #12 of 37
Good thread brettjv. I've had far more than my fair share of micro stutter in the past on my old GTX 470 SLI setup. I am very sensitive to micro stutter; I even see it in movies and such! I have a keen eye for it. Some people are very sensitive to it, like me, but most people won't even notice it unless it's really bad.
post #13 of 37
Thread Starter 
SLI Discussion, Part Two

Can Microstutter Really Not Be Avoided?
On a fundamental level, a significant part of the reason that microstutter exists is that nVidia and AMD favor the Almighty FPS over the 'total gameplay experience'.

And this is, arguably, rightly so, because the FPS is by far the highest priority in the decision-making process when people choose video cards. No doubt this is at least in part due to this statistic being the major focus of all online graphics card reviews.

If this were NOT the case, it would be entirely possible for nV and AMD to add a selection to their control panels that would allow the user to choose to 'Favor Gameplay Smoothness', and thus force the output of the frames from their multi-GPU setups to adhere to (very close to) the same 'evenness of frame-presentation timing' that is seen on a single-GPU setup.

However, if they were to do this, reviewers would inevitably begin to USE this selector when doing comparisons, arguing (and rightly so) that the 'true' scaling should take into account 'gameplay smoothness'.

Unfortunately (for nV/AMD), the deployment of such a 'smoothing' option, instead of enticing consumers with promises of e-peen enlargement and the highly-revered '100% scaling' (that sells people on 2nd graphics cards), would instead produce SLI scaling that is more on the order of 20-40% ... not the kind of numbers that convince people that an additional +100% investment is truly one worth making.

So ... safe to say ... we aren't going to be getting any 'SLI - Favor Gameplay Smoothness' option any time soon, despite the fact that it's theoretically very do-able.

Is Great SLI Scaling Actually As Great as It Seems - I Wonder?
Allow me to introduce a theory of mine: while we revel in the thought of '100% scaling' and think of it like The Holy Grail, the more I've researched the issue of microstutter, the more I've become convinced that there's at least a good chance that the following is generally true:

The better the 'SLI Scaling' of any given game, the more likely it is that the frames are going to be unevenly synchronized in their display (i.e. the game will have measurable microstutter).

Now, as I mentioned before, getting the FPS up (such as SLI does) in the majority of cases will cause the 'perceived' framerate to more closely match the 'displayed' (by fraps) framerate. So, in a way, great SLI scaling will produce something along the lines of a 'wash'.

However, for SOME gamers that are sensitive to it, there is an inherent discomfort in viewing games where the instantaneous framerate is in a constant state of cyclical fluctuation. Thus, I believe that such people would find that the best 'scalers' are actually the most unpleasant to look at on a multi-GPU setup.

Some Great Reward
So ... before I bore everyone to tears (again), I'm going to offer up a '1000 Internets' Reward to whoever can come up with the best guess as to why I suspect that better scaling = more measurable microstutter.

Conversely, if someone can convince me of why the OPPOSITE should be true, or even why 'scaling' should have NOTHING to do with microstutter ... I will also proffer the aforementioned Reward
Edited by brettjv - 4/14/11 at 7:28pm
post #14 of 37
Quote:
Originally Posted by brettjv View Post
Some Great Reward
So ... before I bore everyone to tears (again), I'm going to offer up a '1000 Internets' Reward to whoever can come up with the best guess as to why I suspect that better scaling = more measurable microstutter.
While I am just guessing, I would anticipate that logic explains this simply: a more complicated system creates a more complicated data path.

I would suppose a single GPU has the undivided attention of a single CPU, and essentially all of its peripheral bandwidths.

As it is now, adding a second GPU would cut into the CPU's attention to a certain degree. Even if it was just a single core, that's one less core the primary GPU has access to. And that would, I suppose, just be scratching the surface of the problem, since each core is part of a family, sharing resources such as L2/L3 caches depending on which bank the particular core comes from. We also have to load from system RAM via the chipset (and of course there's the uncore frequency), all of it being loaded into the system from a hard drive.

However, the system generally works fine with one GPU, so I would really say the limitation occurs in the architecture of the system at a basic level: everything must go through a single device, the CPU. It is an inherent bottleneck, in its most basic form.

All of that aside, from an engineering standpoint, I would anticipate that nVidia knows this, and would indeed know how to make a GPU far more efficient at coping with a system, but how would you market it? The PC user is so used to measuring performance by max/avg FPS that it might even be a hard sell. How do you measure something like this when it's about feel? Perhaps more detailed frametime measurement. Such a GPU may be designable, but it's possibly not cost-effective compared to the performance of modern GPUs.

From a marketing standpoint, the dual-GPU setup is a money maker. Three and four GPUs are still very niche, and not many people will, right off the bat, pay the high premium of 3 new-generation GPUs. However, dual-GPU setups are making a hell of a lot of ground in the consumer/enthusiast market.

Quote:
That new high-end GPU is great and all, but two GPUs have gains of 100%!!!!! Games not performing as well as you expected? Throw in another GPU!
Why make a single GPU that plays 40 fps well, when you can sell 2 at the same high premium price?
Edited by RagingCain - 4/14/11 at 8:06pm
post #15 of 37
Good read guys!
post #16 of 37
Quote:
Originally Posted by brettjv View Post
I've decided to remove this from the fail thread that it was on.

Here's my dissertation about the topic of microstutter, if anyone is interested

What Does It Look Like?
Microstutter manifests in the following way: you know how when sometimes while gaming you'll hit a graphically difficult section, and you can just 'feel' that your fps has dropped dramatically, and you look at your FPS meter and discover that, indeed, you've dropped from 60fps to 25fps?
My micro stutter happened at higher framerates, as it has for many on this forum and on Google.
Quote:
Originally Posted by brettjv View Post
You know the sensation ... you get that sort of 'stuck in molasses' feel, everything gets laggy: your movement, changing your view, everything you do no longer produces an instant response?

Well, when you have microstutter, you get that effect, even though the FPS on the meter shows a high-enough FPS that you 'shouldn't' be getting that sensation.
It depends on the game; some micro stutter just flickers and doesn't actually slow fps. It's a mistiming of the frames rendered, but it doesn't necessarily mean a decrease in performance will result. Continue on.

Quote:
Originally Posted by brettjv View Post

For example, 30fps (w/o microstutter) in the large majority of games generally feels pretty smooth, it's totally 'playable'. Xbox runs at 30fps, and it usually feels quite responsive, right?

Well, on a game that's exhibiting microstutter, 30fps can end up feeling like around 20fps. And at 20fps, everything feels all laggy and slide-showy.
Microstutter doesn't always feel slow. It's a stutter, not lag or a system hang.
Quote:
Originally Posted by brettjv View Post
Behind the Scenes
On a setup without microstutter, lets say you analyzed the actual performance during a given second of gaming when your FPS meter is showing your game at 30fps.

And by 'analyze', I mean you study the 'time to render' for each of the 30 frames that were rendered over the course of that second (I'll refer to these values as 'frametimes' moving forward, although that's not the 100% accurate definition of the term).

In this analysis, you'd find that the large majority of frametimes would be 'huddled around' the value of .033 seconds, or 33 milliseconds.

(note that I'm saying here that an alternate 'unit of measure' for expressing 30 FPS is 33ms per frame, because 1/30 = .033 seconds = 33 milliseconds ... does that all make sense?)

Now, there'd be some frametimes at other values besides 33ms of course, as this is an average over a whole second, but if you graphed out the count (Y = Value Count, X = Frametime) of each frametime value for those 30 frames, you'd see a normally-distributed (bell-shaped) curve huddled around the mean value of 33ms. You'd get a 'single-humped camel back' looking graph.

However, when microstutter is affecting you, what you'd see instead is a consistent, cyclical variation in the 'time to render' of each frame. Frame 1 = 25 ms (40fps), frame 2 = 50ms (20fps), Frame 3 = 25ms, Frame 4 = 50ms, etc, over and over again during the course of that second.

Thus, if you plotted out the counts of each frametime value, instead of the 'single-humped camel back' graph you'd get w/o microstutter, you get a 'double-humped camel back' graph when you have microstutter. The count of each of the frametime values would be clustered around two separate and distinct mean values.

You lost me with all those numbers and such ... can you simplify it?
To put it even more simply, the two scenarios can be summed up like this: without MS, ALL your frames are actually being rendered at right around 30fps (33ms/frame), whereas with MS, your frames are constantly alternating back and forth between 20fps (50ms/frame) and 40fps (25ms/frame).

Perception vs. Reality (or: Some Good News at Last!)
In both cases (with or without MS), your AVERAGE over time is still calculated by FRAPS as being 30fps, but your PERCEPTION of how the scene looks is very different.

In the 'microstuttering condition', you 'sense' the scene as running at the slower of the two recurring values, in this case, 20fps. Not only that, but your brain can pick up on a dramatic, rapid fluctuation in FPS like a constant flipping from 20 to 40fps (repeatedly halving/doubling), and it produces a vague feeling of discomfort when you view a scene that's doing that.
I'd love to see the sources for all that.

Yes, FPS can be measured in time. But where has this camel-hump theory actually been shown?
http://www.pcgameshardware.de/aid,63...fikkarte/Test/

Here the problem begins: multi-GPU systems often do not succeed in displaying regularly distributed frames; large time gaps occur between individual frames - "micro stuttering" - despite supposedly fluid frame rates. One example: frame 1 is followed by frame 2 after 10 milliseconds, which corresponds to a cadence of 100 fps. Frame 3, however, is only seen 40 milliseconds later, followed by frame 4 after 10 ms.

Your camel-hump theory would suggest symmetrical timings. However, the facts from this website show otherwise, which is also more similar to the "microstutter" issue I was having: a sudden POP or BURST forward.
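
To put actual numbers on it, here's a quick sketch of my own (nothing official, just the frame gaps from the PCGH example above) that converts each gap into an instantaneous framerate:

Code:
# My own illustration: turn the PCGH example's frame-to-frame gaps (ms)
# into instantaneous framerates, to show how uneven the delivery really is
# despite a "fluid" average.
frame_gaps_ms = [10, 40, 10]  # ms between frames 1-2, 2-3, 3-4 (PCGH example)

for i, gap in enumerate(frame_gaps_ms, start=1):
    print(f"frame {i} -> {i + 1}: {gap:5.1f} ms  ({1000.0 / gap:6.1f} fps instantaneous)")

avg_gap = sum(frame_gaps_ms) / len(frame_gaps_ms)
print(f"average: {avg_gap:.1f} ms per frame ({1000.0 / avg_gap:.1f} fps reported)")

The average works out to 50 fps, but the individual frames swing between 100 fps and 25 fps, which is exactly the asymmetric pattern I'm talking about.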

Same here, it's not a slowdown in fps, it's a stutter...


This one is kind of both: it looks like low fps and a stutter.




Quote:
Originally Posted by brettjv View Post
I Thought You Said 'Good News'?
Now, fortunately, a player's PERCEPTION of microstutter should be alleviated at higher framerates. This is because the DELTA (40 - 20 = 20 fps) between the two alternating framerates in a microstutter condition should stay roughly the same as fps goes up.
(according to your camel hump theory)
Quote:
Originally Posted by brettjv View Post

So if you lower the settings on the same game, such that you're running at 60fps, you will be looking at half your frames at 50fps, and half your frames at 70fps.

So, while you still technically HAVE the microstutter at 60fps, if we extrapolate from the earlier analysis, we conclude that we'll perceive that 60fps average as though it were 50fps ... MUCH more playable than 20fps ... plus it's much harder for your brain to pick up on a rapid fluctuation between 50fps and 70fps so you won't get that feeling of discomfort I mentioned above.

So, bottom-line MS will generally become imperceptible (NOT 'immeasurable') as FPS gets up into the FPS range where we normally like to play games (i.e. 60fps).

And the good news is that HAVING SLI puts you exactly in the right position to actually be ABLE to get those framerates up into the 60fps range, right?


Some Caveats/Further Explanation (if you're a total glutton for details)
In the interest of accuracy, I should point out that because the frametime fluctuations can be even larger than in my example above, it's by no means impossible to be able to 'see' microstutter at much higher FPS rates, because anytime that frametimes are in constant fluctuation/uneven, there is a chance that it will be 'noticeable', esp. to certain people who are sensitive to it.
I'm happy you added this, since 3 days ago it was only at low fps...
Quote:
Originally Posted by brettjv View Post
TBH, from what I've read re: microstutter over the years, it seems as though anytime one makes any broad generalizations about 'what causes' microstutter or 'when it's going to happen' or 'when it will be noticeable', there are disagreements amongst people who are well-versed about the subject.

But besides the generalization I shared above, a couple other ideas that *seem* to be fairly-widely accepted (but NOT 100%, because NOTHING is when it comes to this subject) are these:

1) The closer your cards are to being 'maxed out' in terms of their capabilities in a given scenario, the more likely it becomes that microstutter will become noticeable.

2) This also means that effective framerate limiters (including v-sync) can definitely be helpful in reducing the likelihood of visible microstutter, because if they are 'kicking in', by definition that means that you are reserving additional gpu headroom and not maxing out the cards.

3) In a somewhat-related vein, it turns out that the more wildly overpowered your CPU is in comparison to the GPUs you have in SLI, the more likely it is that you will experience noticeable microstutter, esp. if you don't use a framerate limiter of some kind.
I explained I had micro stutter on a Core i7 920 and dual 8800 GTs and you said I didn't know what micro stutter was... Now it's more likely on an overpowered CPU system? My point exactly. I'll save it for the end.
Quote:
Originally Posted by brettjv View Post

And in a somewhat amusing twist, being CPU-bottlenecked actually helps reduce microstutter, as it's another type of framerate limiter that keeps the cards from getting 'tapped out'

4) The proper contents of the SLI profile for a given game, in particular the 'compatibility bits' property, are critical to reducing the effects of microstutter. So if you're using SLI profiles not specifically provided by nVidia to enable SLI on a given game (such as the various profile/game renaming or re-associating tricks, or making your own custom profiles with nVInspector, etc), the chances that you'll get visible microstutter will go up.
I found just the opposite to be true. Obviously if you are set to nvidia recommended and still experience "microstutter", try something else.

In America's Army 3, I was able to completely eliminate microstutter by changing it to alternate frame render 1. However, I was then getting 30fps on average compared to 40-60 fps using a single card, and 90-100fps in regular (microstutter) sli.

In theory you can test each game with different nVidia settings. Which is why I posted in a related thread...
Quote:
Originally Posted by brettjv View Post
5) It's possible that certain 'scenes' or 'happenings' in games like explosions or complex scenes that cause the cards to suddenly struggle could cause microstutter to become noticeable.

This is for two reasons: One is because of the first reason I stated above. The Second is that, interestingly, the driver itself (partly by means of the 'compatibility bits' property of the SLI profile mentioned in 4) above) actually attempts to ameliorate the effects of microstutter. In order to 'time' the presentation of frames more consistently, it relies on predictive algorithms based on a small sample of the frametimes that immediately preceded the frames it's presenting 'now'.

But when FPS suddenly drops, the rearward-looking samples it's keeping become 'unrepresentative', so the driver loses its ability to predict how long it should 'wait' to present the next frame in order to maintain consistent frametimes. This will be the case for a short time, until the current batch of 'past-tense' samples returns to being more representative/predictive of the future-tense (if that makes sense ... sorry if that's technical and boring, but I found it interesting to find out )

However, because things like big explosions also inherently involve framerate drops as well, it's hard to judge how much of 'the effect' one perceives is MS, and what's just the framerate dropping.
I found it to be certain textures. In TF2 it was mostly indoors. In Mafia 2 it was the leather jacket. If you have micro stutter, put on the leather jacket in Mafia 2 and watch it stuttttterrrr, especially very close to buildings.

Quote:
Originally Posted by brettjv View Post

Final Notes, and What To Do if You Have It:
The whole problem, it's simple: it's only about the game ... and all your settings in the game ... and where you are on the FPS scale ... and the driver ... and the SLI profile settings ... and your GPUs ... and your CPU ... basically it's just the way EVERYTHING involved in the SLI gaming experience comes together. There is almost nothing 'consistent' about this phenomenon, and the issues involved with it share traits with many OTHER issues ... so really, you're almost better off not even worrying about trying to figure it all out.

If you're having the symptoms of MS with a game in SLI, and only in SLI ... try v-sync, OC'ing your gfx card, and lowering settings. If the game has a frame-rate limiter built-in, perhaps through a config file or console command, try using it.

And if those don't work ... you can just turn off SLI for that game ... OR you try out SLI AA, which disables regular SLI and uses the 2nd card just to process crazy levels of AA ... and produces NO microstutter. Plus you can set SLI AA in a gaming profile so you don't have to manually turn SLI on and off when you play games that you don't want to run in SLI.

And remember that in the majority of games, 30FPS on single card (or with SLI AA) usually feels/looks about as smooth as 45 on a SLI setup does anyways

If you read this far, thanks for reading, and I hope you gathered something you didn't know before

And feel free to use this post as a reference for people who ask you about Microstutter in the Future ... esp. if they're not really easily bored
Why not try the most common and successful solution? Updating/rolling back your drivers.

I updated my drivers and noticed a huge increase in microstutter, simply roll them back and wait for the next version.

Other solutions you can try are changing your frame rendering mode in NVCP, as I mentioned earlier.

You should try turning on maximum performance mode in NVCP.
Try changing the Multi-display/mixed-GPU acceleration setting.
Also Maximum pre-rendered frames.

I even tried to overclock my PCI Express bus speed a little.

I suggest monitoring your cards' temps; make sure they aren't getting too warm and causing errors or thermal throttling. This can be confused with microstuttering.

Also check system memory. If you are maxing out your RAM and using a lot of page file, make sure the page file is the correct size and your hard drives are defragmented. This too can be mistaken for microstuttering.

And the first step in all troubleshooting is to disable possible conflicts. Go to Start, run msconfig, and then go to the Startup tab. Write down the checked items and disable all. Then go to the Services tab, check the box that says "Hide all Microsoft services", write down the checked items, and disable all. Restart and attempt to game.

PS: you may need to leave some nVidia (video card driver) items checked. Usually those are just for the control panel though.

In conclusion, I have experienced micro stutter on a Core i7 920 with 2x 8800 GT, which did NOT have a micro stutter issue on a Socket 939, Opteron 180 system (same 8800 GTs). This would suggest that the overpowered-CPU theory may be somewhat correct. Even pre-rendering 8 frames did not help, though, even when fps dropped considerably.

I would also suggest it may be a memory timing issue with older cards in newer systems. I just really don't think my stock 8800 GT memory can transfer data to my 1600MHz triple channel fast enough... just an idea.

My suggestion is to ask nVidia to come out with more rendering options in their NVCP. Maybe offset timings. I don't know.

I'm staying single GPU till they figure it out. I'm not wasting any more time or money.

+1 rep for exposing microstutter
post #17 of 37
Thread Starter 
Wow, you actually can behave like a civilized contributor, FD. Well done

First off ... the stutter in the Stalker video ... that is SO NOT microstutter. That's just plain 'stutter'. It happens in all three Stalker games, no matter what you do. It has to do with data loading as you move around. It's completely unfixable, and happens on any video card setup.

Just because people put up videos on youtube and say 'ZOMGlookattehMICROSTUTTERZ!!1!!' doesn't mean they know what the hell they're talking about. Microstutter is really not the kind of thing that shows up well in a video UNLESS you do a side-by-side video with the game on single card vs dual card, both at the same FPS, doing the same things.

And the difference between the two will be that the single card looks 'smoother' and less laggy. And even that is relatively hard to notice. MS is something that you almost *have* to actually feel by being the one playing the game.

The scenario I was describing with the 'two camel hump graph' would be what you'd see if you graphed out the COUNTS of the following sets of elapsed 'times to render', which is the most TYPICAL scenario characterizing microstutter:

10ms, 30ms, 11ms, 32ms, 9ms, 35ms, 10ms, 24ms, 11ms, 31ms, 13ms, 30ms, 11ms, 32ms, 9ms, 35ms, 10ms, 24ms, 11ms, 31ms, 10ms, 30ms, 11ms, 32ms, 9ms, 35ms, 10ms, 24ms, 11ms, 31ms, 13ms etc.

If you graph out the counts of these values (x: milliseconds to render, y: count of instances of x value), you'll find that you get two 'humps' in the graph, one centered around 10ms, and one around 30ms.

But you are correct if you're trying to point out that you could see 3 or even 4 'humps'. Here's an example of values that would provide 3 humps:

10ms, 20ms, 30ms, 9ms, 19ms, 29ms, 11ms, 21ms, 31ms, etc.

This would also be microstutter, but this is a less common pattern (esp. w/two cards in SLI), and not likely to be as noticeable as the first.

An example of a set of values that is NOT indicative of microstutter is this:
10ms, 10ms, 12ms, 30ms, 20ms, 20ms 25ms, 50ms, 10ms, 50ms, 49ms, 40ms, 600ms, etc.

There's always a PATTERN, in other words, not random fluctuations like these. And the 600ms at the end ... that's what a 'pop' or 'stutter' looks like.

So, in the most basic sense, microstutter involves some recurring pattern of unevenness in terms of 'time to render' of frames over a given period of time. If these render times are graphed, it will not be a 'normal' bell-shaped distribution that centers around one average value like a single card would do.
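
If you want to actually SEE those humps, here's a minimal sketch of my own (just an illustration using the example values above, assuming you have Python handy) that prints a crude text histogram of each set of frametimes:

Code:
# My own illustration: bin each list of frame render times and print the
# 'humps' as a crude text histogram, so you can see the bimodal
# (microstutter) pattern vs. the random (plain stutter) one.
from collections import Counter

microstutter = [10, 30, 11, 32, 9, 35, 10, 24, 11, 31, 13, 30, 11, 32, 9, 35,
                10, 24, 11, 31, 10, 30, 11, 32, 9, 35, 10, 24, 11, 31, 13]
plain_stutter = [10, 10, 12, 30, 20, 20, 25, 50, 10, 50, 49, 40, 600]

def print_histogram(label, frametimes_ms, bin_width=5):
    print(label)
    counts = Counter((t // bin_width) * bin_width for t in frametimes_ms)
    for bucket in sorted(counts):
        print(f"  {bucket:4d}-{bucket + bin_width - 1:<4d} ms | {'#' * counts[bucket]}")

print_histogram("Microstutter pattern (two humps):", microstutter)
print_histogram("Ordinary stutter (no pattern):", plain_stutter)

The first set piles up around 10ms and around 30-35ms (two humps); the second is scattered all over, with that lone 600ms outlier being the 'pop'.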

Now, I'm not going to go back and relive that despicable thread wherein you told me to 'go back to (fellating) my dog', and charmingly implied that the other members who were supporting my viewpoint were '(fellating) Brett' nor am I going to continue the debate about what was said.

But I will say this: microstutter is NOT characterized by a 'popping' effect. We're talking about a phenomenon where the timeframes involved are milliseconds. We're talking about things that are happening way too fast for the eyes to see anything like a 'pop' going on.

You said:
Quote:
its a stutter, not a laggg
And I'm sorry, but you are incorrect. Microstutter absolutely DOES feel like lag, it is NOT stutter, or stuttering. It is *primarily* manifested by the perception that you are playing at a framerate that is much lower than the 'displayed' framerate. It's a feeling like being stuck in molasses, where you have a slow onscreen response between mouse movement and screen movement.

I think that you are convinced that you 'have' microstutter (which you certainly may at times) and therefore you're experiencing OTHER anomalies as being 'part and parcel' of the 'microstutter'. But there really aren't a whole bunch of different ways in which microstutter is manifested.

This being said, aside from MS feeling like lag, it's also possible that someone might describe it as the game 'not feeling smooth' even w/o describing it as 'laggy' ... and that likely would be in the scenario where the framerate is very high (but frametimes are still varying in a dramatic and cyclical way). Maybe someone else might describe that same effect as the game being 'flickery', as that's kind of the same idea.

But any effect that could remotely be described as a 'pop' is just regular 'stutter'.

Now, it's certainly not impossible for one to get 'stutters' during periods when you are ALSO getting microstutter, but if 'the problem' you're getting in a particular game is one you'd describe as 'popping' ... the problem is NOT properly described as microstutter. That's why it's called microstutter.

And that is why I told you that you don't know what microstutter is, NOT because you said you have it at 400fps.

AFA your other suggestions go, sure, practically ANYthing is worth a try. For example, there's no reason to NOT try changing between the render modes they expose for you to play with. It's certainly possible that nVidia chose the render mode to put in the profile based on pure performance, in order to make their cards look good in published benchmarks. Switching render modes might, in rare cases, reduce microstutter.

But trying to make your own SLI profile based on creating a new, empty game profile in NVCP and playing around with the SLI Render Mode is nearly certain to produce unfavorable results. Reason being, you cannot set the 'Compatibility Bits' property through NVCP ... yet this is THE CRITICAL property of the SLI profile. It's the primary method by which the driver is customized to the particular traits of the game in order to make SLI work properly, especially with regards to proper frame synching.

A SLI profile consisting of nothing but the choice of one particular render mode ... is really not a SLI profile at all, and 99% certain to work much worse (or not at all) vs. a SLI profile created by nVidia for the particular game. I would submit to you that if you are in the habit of trying to make your own ad-hoc SLI profiles to override what nV has made, that may well be why you find installing a different driver helps the situation ... you may be restoring SLI profiles back to defaults. Just an idea.

Also ... seriously, man ... if putting on the leather jacket in Mafia 2 causes lag ... that's a freaking bug in the game, dude. A specific texture should NOT be something that 'causes' microstutter. But I'll tell you what ... I've pointed you to instructions on how to empirically test for microstutter, so why don't you try TESTING this game section and seeing if, indeed, the frametime values are behaving in a manner consistent with microstutter?
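
In case it helps, here's roughly the kind of test I mean, assuming a FRAPS-style frametimes log (cumulative timestamps in milliseconds); the file name and the threshold are just placeholders, adjust to taste:

Code:
# Rough sketch: read a FRAPS-style "frametimes" CSV (Frame, Time) where Time
# is the cumulative timestamp in ms, then look for the alternating pattern.
import csv

def frametime_deltas(path):
    """Return per-frame render times (ms) from cumulative timestamps."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    times = [float(row[1]) for row in rows[1:] if len(row) >= 2]  # skip header
    return [b - a for a, b in zip(times, times[1:])]

def swing_fraction(deltas, ratio_threshold=1.5):
    """Crude check: how often does one frametime differ a lot from the next?"""
    swings = sum(1 for a, b in zip(deltas, deltas[1:])
                 if max(a, b) / max(min(a, b), 0.001) > ratio_threshold)
    return swings / max(len(deltas) - 1, 1)

deltas = frametime_deltas("mafia2 frametimes.csv")  # hypothetical file name
print(f"avg frametime: {sum(deltas) / len(deltas):.1f} ms")
print(f"fraction of adjacent frames with a big swing: {swing_fraction(deltas):.0%}")

If that 'big swing' fraction is way up near 100% and the swings follow a regular pattern, that's the microstutter signature; isolated spikes are just stutter.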

Everyone has problems with Mafia 2, esp. lag when you're indoors ... low fps, AND low gpu usage. If you don't have a good quality (8800GT or above) DEDICATED physX card in Mafia 2 ... there's a lot of parts that just plain don't run well. The game is coded so that Cloth physX always runs on the CPU unless you have a dedicated physX card ... and there's parts of the game with a LOT of cloth to calculate ... some of it you can't even see, but it still gets calc'd.

Lastly, certainly, until such time as nV and AMD decide to at least make it possible for the user to CHOOSE to favor an even frametime distribution over pure performance when running in SLI, I would suggest to anyone that is sensitive to the effects of microstutter to steer clear of multi-GPU
Edited by brettjv - 4/15/11 at 11:12am
post #18 of 37
Quote:
Originally Posted by RagingCain View Post
Microstutter greatly influences a person's input because of hand-eye coordination. First-person shooters are affected the most, because the more fluid the motion is, the better one's ability to line up those headshots. You may have all the skill in the world, but microstutter is like moving a sensitive mouse over a gravel road. A similar effect happens to a lady applying makeup in a bouncy car. In one case, that lined-up headshot ends up being 3ft off as the "stutter" occurs; the other makes you look like a clown. I will leave you to figure out which is which

I would also wager that it may have to do with being a decade apart in age. You are beginning to slow down in your responses, even if it's only mentally. Judging by your preferred gaming choices, I wouldn't hesitate to say that your taste for slower gaming has increased: more strategy-oriented games like Civilization 5, and more slow-paced RPGs such as Baldur's Gate II.
Your thinking does make sense in a way, but I'd rather ascribe it to the fact that my brother and I are from different generations, considering that the facts I mentioned aren't anything recent; as far as I can recollect it has ALWAYS been like that.

Something like: my brother has a higher sensitivity to MS (or, the other way round, I have a lower one). Now, granted, perception is different for everyone, but why, while playing/watching a game, will SOME people detect MS and be bothered by it, while OTHERS will NOT detect MS unless it is blatant?
post #19 of 37
Thread Starter 
Quote:
Originally Posted by Mafia2020 View Post
Something like: my brother has a higher sensitivity to MS (or, the other way round, I have a lower one). Now, granted, perception is different for everyone, but why, while playing/watching a game, will SOME people detect MS and be bothered by it, while OTHERS will NOT detect MS unless it is blatant?
All I can say is that when microstutter is occurring at a LOW fps, it becomes obvious to anyone.

There is a definite, obvious sensation of 'lag', of the responsiveness of the game feeling significantly slower than what would be expected at the framerate that FRAPS is telling you that it is.

A STEADY 30fps SHOULD feel quite smooth. Not perfect, but not a lag-fest. But if you have microstutter, and in reality that 30fps is derived from a rapid fluctuation between (say) 10fps and 50fps on each and every frame (which is horribly bad MS), the game will look and feel laggy as all hell ... like it's running at 10fps. This sort of microstutter will be obvious to all.

However, when microstutter is occurring at a HIGH framerate, (and by occurring, I mean you can measure it empirically, and see the cyclical fluctuation in 'render times') ... that is the scenario where there's a lot of variability: some people are sensitive to it and experience a sense that something is 'off', that the game is 'not smooth' ... whereas it 'looks fine' to someone else.

It IS an interesting question though, what DOES make some people sensitive to the perception of high-fps microstutter, and some people ... not

Myself ... the only time I notice microstutter is when my FPS is low. I can TOTALLY tell that 30fps on my machine w/SLI disabled looks much smoother than 30fps on my rig in SLI, in the majority of games. But 60fps looks exactly the same to me. And this is how the majority of people are with regards to the whole thing.
Edited by brettjv - 4/15/11 at 10:39am
post #20 of 37
Hm.

On the idea that the better the SLI scaling, the worse the potential microstutter...

The closer scaling becomes to 100%, the closer it becomes to each card delivering equal quantities of frames, alternately... just as the name 'Alternate Frame Rendering' would imply.

And here is where, I suspect, latency comes and rears its ugly head. It's not a bandwidth issue, as that is more than covered by PCI-E 2.0 and the SLI bridge (for which I still can't find a solid figure I actually believe for its actual bandwidth impact).

Now, this is purely for a multiple-cards-feeds-to-one-screen scenario, as I haven't seen microstutter on Surround, but I'm fairly sure that latencies will have a dual impact; firstly, on getting the frames rendered by the second/third/fourth card to the first card for output to the monitor, and secondly on the timing control/crosstalk between the cards and the driver.

Lets take a best-case setup, where the SLI bridge does all the work:

Frame 1 is rendered by Card 1; it is sent to the monitor with minimal latency, except that exhibited by the process of constructing the frame in the GPU, and sending it off via the DVI port.

Frame 2 is rendered by Card 2; it is rendered on the second card, and piped through the SLI bridge to the first card, which has to receive it, analyse where it should be slotted in amongst the frames it is already in the process of rendering, and then it goes to the DVI output.

If, however, the first card has moved on past where that frame would render... what does the driver do? Does it just slot the frame in at the first available space? No, that would look terrible. So it just drops it, and relies on Card 1 to feed the monitor the next frame as fast as possible.

So you get a jerk in the fps as a frame vanishes into the ether. It is sub-second, so has no visible impact on reported framerate, but is evident if you're sensitive. I don't know whether it would even be reported as a dropped frame.

Now, obviously, that doesn't happen all the time. But in games that exhibit microstutter, it's a common enough occurrence for it to be noticeable to the viewer.

Now lets look at another potential case, and the one I favour, given that SLI does appear to work without a bridge... most of the time...

Frame 1 is rendered by Card 1; it is sent to the monitor with minimal latency, except that exhibited by the process of constructing the frame in the GPU, and sending it off via the DVI port.

Frame 2 is rendered by Card 2; it is constructed on the second card, and sent via the PCI-E bus to the PCI-E controller (on many modern systems, on the CPU), which analyses where it needs to go and sends it on its way to the first GPU, where it is analysed, slotted into the queue appropriately, and rendered... or dropped.

Obviously, the second route is going to take a lot, lot longer... and greatly increases the chances that Card 1 will have moved on from where the frame actually needed to be.

Now, this plainly ignores the overheads involved of the two cards 'talking' - both to one another and the driver - to remain in synchronicity. And I would imagine that that is a lot of overhead. Which, given that the PCI-E bus has to do other stuff quite a bit too, and to avoid too much load on the CPU, I would imagine is carried out over the SLI bridge. Which would explain why SLI without a bridge works... but not quite so well (in my experience, I see lower framerates) as with a bridge. And starts to exhibit problems when you're talking about the more high-powered cards. When the cards talk to the driver, they do it via PCI-E bus. But to each other, once they know what they're doing? Over the SLI bridge, unless that is impossible.

...

Anyway, that mostly explains what I see in two-card-single-screen SLI (I've never tried three, but the situation should be similar... but worse). The driver tries to feed the second card only stuff that it has calculated won't need to be abandoned, as that is wasted time and effort. Which is why scaling isn't perfect in AFR - card 1 is rendering more frames than card 2. The closer scaling gets to perfect... the worse the potential becomes for dropped frames... particularly if the PCI-E bus has a lot of traffic, like with HDD access, audio, networking... all the stuff that happens when you game.
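
To put some toy numbers behind that, here's a purely illustrative simulation (my own assumptions, nothing measured from real cards) of two GPUs alternating frames where the second card's frames pay an extra transfer hop; the average fps looks great, but the intervals the monitor actually sees alternate short/long:

Code:
# Purely illustrative: two GPUs alternate frames, but frames from card 2 take
# an extra transfer hop, so the intervals the monitor sees alternate short/long.
render_time_ms = 20.0      # each card takes 20 ms per frame (assumed)
transfer_penalty_ms = 8.0  # extra latency moving card 2's frame to card 1 (assumed)

present_times = []
t = 0.0
for frame in range(12):
    t += render_time_ms / 2  # two cards working in parallel halve the cadence
    extra = transfer_penalty_ms if frame % 2 else 0.0  # card 2's frames arrive late
    present_times.append(t + extra)

intervals = [b - a for a, b in zip(present_times, present_times[1:])]
print("presentation intervals (ms):", [round(i, 1) for i in intervals])
print("average fps:", round(1000 * len(intervals) / (present_times[-1] - present_times[0]), 1))

With those made-up numbers the counter reports around 93 fps, yet every other frame arrives 18ms after the last while the one in between arrives only 2ms later, which is the cyclical frametime pattern brettjv described.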

The way around this would be more tightly regulated communication between the cards - and what 3dfx used to do - Scan Line Interleaving; one card calculates the odd lines, one the even lines. This more evenly balances the load, but on modern high-power systems would have a fairly hefty latency impact, I'm thinking, to make sure they were always in sync.

...

Anyway, this is mostly just me thinking out loud. Or onto a keyboard...