Originally Posted by Mr.N00bLaR
Originally Posted by Blameless
FRAPS is counting the time of every frame.
A 166 fps minimum is clearly incorrect, and is probably a 1 s average. A ~20 ms frame is a ~50 fps instantaneous minimum. The 1% and 0.1% frame rates are there to cull extreme outliers like that.
I'd argue that, in this case, the 0.1% figure is far more useful than the misleading average, or than the single frame out of ~33k that falls significantly outside the rest.
I think GamersNexus does a good job of explaining it here: https://www.youtube.com/watch?v=uXepIWi4SgM. I also have the G3258 and have seen what he's describing.
I understand what they are trying to express and I thank you for the link.
I guess it comes down to how FRAPS measures it: does it track the fps produced for every second and report the lowest value as the min, or is it just sampling every so often?
After looking at the min/max/avg spreadsheet it produces, it seems FRAPS is at least averaging across complete seconds (possibly ten of them, by the look of it), and I trust those 0.1% lows because its average fps matches the frame-time analyzer's.
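To make the difference concrete, here's a rough sketch of the three ways "minimum fps" can come out of the same frame-time log. This is not FRAPS's actual code, just an illustration assuming a plain list of per-frame times in milliseconds (like the frametimes CSV), and one common definition of "0.1% low" (average fps of the slowest 0.1% of frames; other tools define it slightly differently):

```python
# Three summaries of "minimum fps" from per-frame times in milliseconds.
# Illustrative sketch only, not FRAPS's actual implementation.

def fps_from_ms(ms):
    return 1000.0 / ms

def per_frame_min_fps(frame_times_ms):
    # Worst single frame: one 20 ms frame -> 50 fps instantaneous minimum.
    return fps_from_ms(max(frame_times_ms))

def per_second_min_fps(frame_times_ms):
    # Bucket frames into 1-second windows by their start time and count
    # frames per window, the way a per-second fps log would.
    buckets = {}
    t = 0.0
    for ms in frame_times_ms:
        sec = int(t // 1000)
        buckets[sec] = buckets.get(sec, 0) + 1
        t += ms
    last = int(t // 1000)  # drop the trailing partial second
    full = [count for sec, count in buckets.items() if sec < last]
    return min(full) if full else min(buckets.values())

def percentile_low_fps(frame_times_ms, pct=0.1):
    # "0.1% low" here: average fps over the slowest 0.1% of frames,
    # which culls a single extreme outlier without hiding sustained dips.
    n = max(1, int(len(frame_times_ms) * pct / 100.0))
    worst = sorted(frame_times_ms, reverse=True)[:n]
    return fps_from_ms(sum(worst) / len(worst))

# ~33k frames at 6 ms each, with one 20 ms hitch (made-up numbers):
frames = [6.0] * 32999 + [20.0]
print(per_frame_min_fps(frames))   # -> 50.0 (the single hitch)
print(per_second_min_fps(frames))  # -> 166 (the per-second "minimum")
print(percentile_low_fps(frames))  # ~155.7 (hitch diluted across worst 0.1%)
```

With these made-up numbers, a per-second log reports a minimum around 166 fps even though one frame took 20 ms, which is exactly the kind of mismatch described above.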