
post #17 | 05-18-2019, 11:11 AM
JackCY
An output that constantly jumps between 10 fps and 300 fps will look pretty bad despite averaging 155 fps. What counts as the threshold of perception differs for each observer.
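A quick back-of-the-envelope sketch of why that average is misleading, using the 10/300 split from the example above (Python just for illustration):

```python
# Equal time spent at 10 fps and at 300 fps averages out to 155 fps,
# yet the frame-time distribution is terrible.

# One second at each rate: 10 frames of 100 ms, then 300 frames of ~3.33 ms.
frame_times_ms = [100.0] * 10 + [1000 / 300] * 300

total_s = sum(frame_times_ms) / 1000          # 2 seconds in total
avg_fps = len(frame_times_ms) / total_s       # 310 frames / 2 s
print(f"average: {avg_fps:.0f} fps")          # ~155 fps
print(f"worst frame: {max(frame_times_ms):.0f} ms")  # 100 ms = a visible hitch
```

The headline 155 fps says nothing about the ten 100 ms frames in every two-second window, and those are exactly what the viewer notices.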

Min/max has always been a fairly useless metric. Hence some use .1% and 1% lows instead, even though a precise frame-time plot and a histogram are the way to go. It is far easier for people to understand, and for reports to show, a single number than to try to cram 40 plots onto one page/video frame. Some reviewers offer plots for some tests, e.g. Guru3D.
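For reference, a hedged sketch of how such lows are typically computed from a frame-time log; conventions vary between reviewers (some take a single percentile frame, some average the slowest N% as done here), so treat this as one plausible reading, not a standard:

```python
def percentile_low(frame_times_ms, pct):
    """Mean of the slowest `pct` percent of frames, converted back to fps."""
    slowest = sorted(frame_times_ms, reverse=True)
    n = max(1, int(len(slowest) * pct / 100))   # at least one frame
    worst_mean_ms = sum(slowest[:n]) / n
    return 1000 / worst_mean_ms

# The alternating 10/300 fps example from above:
frame_times_ms = [100.0] * 10 + [1000 / 300] * 300
print(f"1% low:   {percentile_low(frame_times_ms, 1):.1f} fps")    # ~10 fps
print(f"0.1% low: {percentile_low(frame_times_ms, 0.1):.1f} fps")  # 10 fps
```

Here the lows do flag the problem (with 310 frames, the slowest 1% are three of the 100 ms frames), which is exactly why they replaced plain min/max in most reviews.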

.1% and 1% lows suffer the same issue as min/max in that the duration of a spike matters too: the longer it lasts, the more perceptible it is. All in all, a custom metric would have to be agreed upon by reviewers, one that returns a quality score for the measured output: the longer the low dips, the worse; the higher the overall variance, the worse; and so on. Not a single reviewer uses such a metric, and people in general don't want to agree upon such things. The best bet would be to offer a frame-time measurement application that outputs this metric, since most reviewers use commonly available (often poor) measuring apps and do not develop their own. FRAPS, for example, will measure something but not everything, plus its hooking (like any other hooking/injecting app) can break an application.
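To make the idea concrete, here is a purely hypothetical sketch of such a quality metric; nothing like it is standardized anywhere, and the target, weights, and function name are made up for illustration:

```python
import statistics

def smoothness_penalty(frame_times_ms, target_ms=1000 / 60,
                       dip_weight=1.0, variance_weight=0.5):
    """Hypothetical metric: lower = smoother. Penalizes time spent in dips
    (frames slower than the target) plus overall frame-time variance."""
    # Dip penalty: total milliseconds spent above the target frame time,
    # so a long stall hurts far more than one borderline frame.
    dip_ms = sum(ft - target_ms for ft in frame_times_ms if ft > target_ms)
    # Variance penalty: uneven pacing is bad even without big spikes.
    spread = statistics.pstdev(frame_times_ms)
    return dip_weight * dip_ms + variance_weight * spread

steady_60 = [1000 / 60] * 310                    # locked 60 fps
spiky_155 = [100.0] * 10 + [1000 / 300] * 300    # "155 fps average" example
print(smoothness_penalty(steady_60))   # 0.0: no dips, no variance
print(smoothness_penalty(spiky_155))   # large: dips dominate despite the average
```

A single score like this would rank the locked 60 fps run above the nominally faster 155 fps average run, which is the behavior argued for above.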

Ultimately, a complex, expensive solution that captures the GPU output directly is needed. There is FCAT, but it's Nvidia's solution = not trustworthy, and as far as I know it also hooks/injects into the game to add colored borders/markers and whatnot.