I've been seeing a lot of review sites posting this information, and on the surface it appears to be quite useful. However, after playing around with frame time analyzers using my own data, it became apparent to me that it's not nearly as meaningful as it had appeared.
A case in point:
A reported minimum of 166 fps versus a 0.1% low of 100, caused by single slow frames spread out over the entire sample (out of roughly 33,000 frames in 3 minutes of play), doesn't seem very representative of perceptible differences in performance.
Would any of you smart fellows out there have any thoughts on how to make better use of frametime data? I was wondering how to use those 33,000 data points to find the one second of gameplay with the lowest performance.
Any Excel or OpenOffice wizards out there care to make a tool for doing this?
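To show the kind of thing I mean, here's a rough sketch (in Python rather than a spreadsheet, and the function name is just something I made up). It assumes the log is simply a list of per-frame times in milliseconds, which is what I believe the FRAPS frametime CSV records, and it slides a one-second window over the data to find the window containing the fewest frames:

```python
def worst_second(frametimes_ms):
    """Find the ~1000 ms window containing the fewest frames.

    frametimes_ms: list of per-frame times in milliseconds.
    Returns (start_index, frame_count), where frame_count approximates
    the lowest fps actually experienced over any one second of play.
    """
    worst_fps = None
    worst_start = 0
    start = 0
    window_ms = 0.0
    for end, ft in enumerate(frametimes_ms):
        window_ms += ft
        # Once the window spans at least one second, its frame count is a
        # valid "fps for that second"; record it, then slide the left edge.
        while window_ms >= 1000.0:
            frames = end - start + 1
            if worst_fps is None or frames < worst_fps:
                worst_fps = frames
                worst_start = start
            window_ms -= frametimes_ms[start]
            start += 1
    return worst_start, worst_fps

# Hypothetical example: ten 100 ms hitches followed by smooth 10 ms frames.
# The worst one-second window is the ten slow frames at the start.
print(worst_second([100.0] * 10 + [10.0] * 100))
```

Unlike a raw 0.1% low, this would tell you whether the slow frames actually clustered into one bad second of gameplay or were scattered harmlessly across the whole run.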
Perhaps FRAPS already does this to produce the 166 number? I've always assumed that it simply polls the fps during a given second and reports that instead.
Thanks in advance for any and all replies.