Originally Posted by cssorkinman
The reason I think it isn't a good measure is because you would have to have 100 of those highest frametime images occur in the same one second segment of the bench in order to even come close to the threshold of human perception.
100 of those frames would be a full second or more.
Given the frame time distribution of your graph (where every several frames you get a frame with double the frame time), I'd consider those 100fps and 112fps figures way more useful than the 166fps or 193fps figures.
The 0.1% frame rate lets the true outliers be culled and gives you a useful minimum performance figure.
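To make the "cull the outliers" idea concrete, here's a minimal sketch of one common way the 0.1% low is computed from per-frame times: take the worst 0.1% of frame times and convert their average into an fps figure. The function name and exact method are my own illustration; tools like CapFrameX or OCAT may define it slightly differently.

```python
def percentile_low_fps(frame_times_ms, pct=0.1):
    """FPS implied by the slowest `pct` percent of frames.

    Sorts frame times descending, keeps the worst pct% of them,
    and converts their average frame time into a frame rate.
    (Illustrative sketch; real tools may use a slightly
    different definition of "x% low".)
    """
    n = max(1, round(len(frame_times_ms) * pct / 100))
    worst = sorted(frame_times_ms, reverse=True)[:n]
    avg_ms = sum(worst) / len(worst)
    return 1000.0 / avg_ms

# Example: 999 frames at 5 ms (200 fps) plus one 10 ms spike.
# The average fps barely moves, but the 0.1% low drops to 100 fps,
# flagging the spike that the average hides.
frames = [5.0] * 999 + [10.0]
avg_fps = 1000.0 * len(frames) / sum(frames)
low_fps = percentile_low_fps(frames, pct=0.1)
```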
In cssorkinman's case, he's not getting any severe dips like that, but if you had a true constant/fixed frame time 193fps output vs. a true 100fps output, the latter would be closer to representing the smoothness he'd see than the former.
Now, at 100+fps it's smooth enough regardless, and many people may well not be able to tell the difference between 193/166 apparent fps and 112/100 apparent fps, but that doesn't diminish the usefulness of the 0.1% lows as a measurement.
...rightful liberty is unobstructed action according to our will within limits drawn around us by the equal rights of others. I do not add 'within the limits of the law,' because law is often but the tyrant's will, and always so when it violates the right of an individual. -- Thomas Jefferson