Overclock.net - An Overclocking Community

Thread: Frametime Analysis - .1% lows seem meaningless
  Topic Review (Newest First)
05-18-2019 11:48 AM
cssorkinman
Quote: Originally Posted by JackCY View Post (full post below)
Appreciate your well-thought-out replies.

Seems like we agree there should be a better way.

It does bother me to see the "experts" award the performance crown based upon those figures.

On the other end of the spectrum, I've also seen scenarios where the maximum fps in a benchmark skews the average fps so badly that it makes the average less useful as well.

We could also filter out those absurdly high fps numbers of questionable value and see how that affects CPU comparisons for average fps.

Putzing around with Excel spreadsheets is such a crappy winter's day project... it may be a while before I tackle that.
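The filtering idea above - dropping implausibly high fps samples before averaging - can be prototyped outside Excel in a few lines. This is only a sketch: the function name `filtered_average`, the sample numbers, and the 2x-median cutoff are all my own illustrative choices, not an established method.

```python
# Sketch: discard frame-rate samples above an arbitrary cutoff before
# averaging, so a burst of ultra-high fps can't skew the mean.
# The 2x-median cutoff is illustrative, not an established method.

def filtered_average(fps_samples):
    ordered = sorted(fps_samples)
    median = ordered[len(ordered) // 2]           # crude median (upper middle)
    kept = [f for f in fps_samples if f <= 2 * median]
    return sum(kept) / len(kept)

samples = [140, 150, 145, 900, 850, 148, 152, 143]  # two suspect spikes
print(round(sum(samples) / len(samples), 1))   # raw average: 328.5
print(round(filtered_average(samples), 1))     # filtered average: 146.3
```

With two bogus 850-900 fps readings in an otherwise ~145 fps run, the raw mean more than doubles while the filtered mean stays representative - which is the skew being described.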
05-18-2019 11:11 AM
JackCY An output that jumps between 10 fps and 300 fps quickly and constantly will look pretty bad despite averaging 155 fps. What the threshold of perception is differs for each observer.

Min/max has always been a fairly useless metric. Hence some use .1% and 1% lows instead, even though having a precise time plot and a histogram is the way to go. It is far easier for people to understand, and for reports to show, a single number than to try and cram 40 plots onto one page/video frame. Some reviewers offer plots for some tests - Guru3D, for example.

.1% and 1% suffer the same issue as min/max in that the time length of spikes matters: the longer a spike lasts, the more perceptible it is. All in all, a custom metric would have to be agreed upon by reviewers that returns a quality score for the measured output - the longer the low dips, the worse; the higher the overall variance, the worse; ... Not a single reviewer uses such a metric, and people in general don't want to agree on such things. The best bet would be to offer a frame time measurement application that provides this metric as its output, since most reviewers use commonly available (often poor) measuring apps and do not develop their own. Such as FRAPS, which will measure something but not everything; plus its hooking (like any other hooking/injecting app) can break an application.

At best, a complex, expensive solution that captures GPU output is needed. There is FCAT, but it's Nvidia's solution = not trustworthy, and as far as I know it also hooks/injects to add colored borders/markers and whatnot.
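For reference, the 1% and .1% lows being debated are usually derived from recorded frame times roughly like this. A minimal sketch - the function name `fps_lows` is mine, the sample data is invented, and reviewers differ on the exact method (some report the average over the slowest fraction of frames, others a single percentile frame time):

```python
# Sketch: average fps and percentile "lows" from frame times (ms).
# Method is one common variant: average the slowest `fraction` of frames.

def fps_lows(frame_times_ms, fraction):
    """Average fps over the slowest `fraction` of frames
    (e.g. 0.01 for the '1% low')."""
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, int(len(worst) * fraction))
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 990 smooth ~145 fps frames plus ten 50 ms (20 fps) spikes
frame_times = [6.9] * 990 + [50.0] * 10
avg_fps = 1000.0 * len(frame_times) / sum(frame_times)
print(round(avg_fps, 1))                        # 136.4
print(round(fps_lows(frame_times, 0.01), 1))    # 1% low: 20.0
print(round(fps_lows(frame_times, 0.001), 1))   # .1% low: 20.0
```

Note that this number says nothing about whether the ten spikes were scattered (barely noticeable) or back-to-back (a visible hitch) - which is exactly the objection about spike duration raised above.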
05-18-2019 09:13 AM
cssorkinman
Quote: Originally Posted by JackCY View Post (full post below)
Ideally, as long as the minimum output is above the threshold of perception. If that is true, the absolute minimum seems meaningless.
05-18-2019 08:42 AM
JackCY Ideally you want output that has as little variance as possible.
The minimum can show you the absolute lowest spike.
.1% is a part of the minimums, then 1%, and so on, ...

At best, frame times/fps viewed as a histogram.
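A frame time histogram like the one suggested here takes only a few lines to build from a frame time log. A sketch with invented sample data; the 1 ms bin width is an arbitrary choice:

```python
from collections import Counter

# Sketch: bucket frame times (ms) into 1 ms bins for a quick text histogram.
# Sample data and bin width are illustrative.

frame_times = [6.9, 7.1, 7.0, 6.8, 7.2, 16.5, 7.0, 6.9, 33.1, 7.1]

bins = Counter(int(t) for t in frame_times)   # 1 ms wide bins
for bin_start in sorted(bins):
    print(f"{bin_start:3d}-{bin_start + 1:<3d} ms | {'#' * bins[bin_start]}")
```

Unlike a single min or percentile number, the histogram makes both the variance and the outlier frames visible at a glance.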
05-18-2019 08:29 AM
cssorkinman Demonstration of why I feel the way I do about these measurements.

Very similar numbers were generated: no noticeable fps slowdown in one, while in the other the dip was as obvious as a hippo riding shotgun in your Prius.
11-07-2017 01:50 PM
cssorkinman
Quote: Originally Posted by Mr.N00bLaR View Post (full post below)

I think something is really odd about the 1080 Ti's reported usage %. Need to get one so I can figure out what kind of a$$hattery is going on with them.
11-07-2017 01:34 PM
Mr.N00bLaR Also another GN reference, but Steve shows in a few titles that the game engines have a maximum frame rate despite having an even faster GPU, especially at lower resolutions. Seems like it matters which games you are comparing at 1080p when you say clobbered.
11-07-2017 05:03 AM
cssorkinman Bumping into the frame cap is producing the "seismic" activity on FRAFS.

[Screenshot: 120 fps cap, medium graphics settings]

[Screenshot: 200 fps cap, medium settings]

Something else is bugging me about the reviewers - they keep claiming that they aren't GPU-limited using a 1080 Ti at ultra settings @ 1080 res. If that's true, they should be clobbering my Fury regardless of LOD settings.
11-06-2017 06:54 PM
cssorkinman
Quote: Originally Posted by Blameless View Post (full exchange below)

I appreciate your time, thank you.

Now I'm wondering about the cap's effect on frametimes.

I'll keep playing with it.
11-06-2017 06:26 PM
Blameless
Quote:
Originally Posted by cssorkinman View Post

Would that account for frametimes that are lower than the 200 fps limit should allow?

It could.
