
· Premium Member · Joined · 10,764 Posts · Discussion Starter · #1
PC Perspective has posted the third in their series of articles detailing their new method of monitoring frame times using custom video-capture hardware. This is just a preview of the main article, which was initially posted in their GTX Titan review, but it shows the initial results.
Quote:
If you haven't been following our sequence of stories that investigates a completely new testing methodology we are calling "frame rating", then you are really missing out. (Part 1 is here, part 2 is here.) The basic premise of Frame Rating is that the performance metrics that the industry is gathering using FRAPS are inaccurate in many cases and do not properly reflect the real-world gaming experience the user has.

Because of that, we are working on another method that uses high-end dual-link DVI capture equipment to directly record the raw output from the graphics card with an overlay technology that allows us to measure frame rates as they are presented on the screen, not as they are presented to the FRAPS software sub-system. With these tools we can measure average frame rates, frame times and stutter, all in a way that reflects exactly what the viewer sees from the game.

We aren't ready to show our full sets of results yet (soon!) but the problems lie in that AMD's CrossFire technology shows severe performance degradations when viewed under the Frame Rating microscope that do not show up nearly as dramatically under FRAPS.
Source

For reference, here's Part 1 and Part 2
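To make the overlay-and-capture idea concrete, here is a minimal sketch of the kind of analysis involved - hypothetical data and simplified logic, not PCPer's actual tooling. The overlay tints each rendered frame a unique color along one screen edge, the capture card records the final output, and the analysis counts how many scanlines of each captured refresh belong to each source frame:
Code:

# Minimal sketch of the frame-rating idea (hypothetical, simplified).
REFRESH_MS = 1000.0 / 60.0    # one display refresh at 60 Hz
LINES_PER_REFRESH = 1600      # scanlines per refresh at 2560x1600

def frame_display_times(captured_refreshes):
    """captured_refreshes: one list per refresh, holding the per-scanline
    frame IDs decoded from the overlay colors. Returns ms on screen per frame."""
    on_screen_ms = {}
    slice_ms = REFRESH_MS / LINES_PER_REFRESH
    for refresh in captured_refreshes:
        for frame_id in refresh:
            on_screen_ms[frame_id] = on_screen_ms.get(frame_id, 0.0) + slice_ms
    return on_screen_ms

# A "runt" frame occupies only a sliver of a refresh before being replaced:
refreshes = [[1] * 1560 + [2] * 40,   # frame 2 shows for only 40 scanlines
             [3] * 1600]              # frame 3 owns the whole next refresh
for fid, ms in sorted(frame_display_times(refreshes).items()):
    print(f"frame {fid}: {ms:.2f} ms on screen" + ("  <- runt" if ms < 1.0 else ""))

FRAPS would count frame 2 as a full frame toward the FPS number, even though it is on screen for well under a millisecond.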
 

· Waiting on New Hardware · Joined · 4,444 Posts
Pretty informative read.
Looks like AMD CrossFireX is not worthy of purchase for the foreseeable future.
 

· Adclock.net · Joined · 3,666 Posts
Quote:
Originally Posted by james8 View Post

Pretty informative read.
Looks like AMD CrossFireX is not worthy of purchase for the foreseeable future.
I'm actually really enjoying 7970 trifire right now.
 

· Waiting on New Hardware · Joined · 4,444 Posts
^That's great, but it's known that 3-card setups are usually very smooth, comparable to a single card, for both SLI and CFX.
It's the much more popular 2-card solution that has the problem.
 

· PC Evangelist · Joined · 48,889 Posts
I don't understand this. Are they saying CF is fake?
 

· Registered · Joined · 3,137 Posts
Quote:
Originally Posted by james8 View Post

Pretty informative read.
Looks like AMD CrossFireX is not worthy of purchase for the foreseeable future.
Multi-GPU setups have never been worth it, ever.
 

· Registered · Joined · 808 Posts
I'm curious as to how much one would notice the frame times switching between 5 ms and 15 ms every other frame, and whether it actually is an issue or just numbers. Is it really something that can be noticed? Or is it just a graph that makes the AMD cards (especially CFX) look bad?

Good data overall; I'm just not certain it means all that much.
 

· PC Evangelist · Joined · 48,889 Posts
Quote:
Originally Posted by TwilightEscape View Post

I'm curious as to how much one would notice the frame times switching between 5 ms and 15 ms every other frame, and whether it actually is an issue or just numbers. Is it really something that can be noticed? Or is it just a graph that makes the AMD cards (especially CFX) look bad?

Good data overall; I'm just not certain it means all that much.
Not sure how much credit you can give them, considering I have 2 x HD 7970 and I did a FRAPS run in MP, which is a lot more demanding and important. My frame times did not look anywhere close to what they are getting.
 

· Never Finished · Joined · 2,523 Posts

lol, my monitor just had a period.

@TwilightEscape: As long as the microstutter/FPS is above the refresh rate of the monitor, you won't see the latency. 20 ms is equivalent to 50 FPS, and 16.7 ms is equivalent to 60 FPS. This shows the 7970 CF in between the two values, so I'd say the latency would be noticeable, but not as bad as it would be if the FPS were lower. Turn on vsync, or use some type of frame limiter, and that should fix it.
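(The ms-to-FPS conversion is just the reciprocal; a quick sketch for anyone who wants to check the numbers:)
Code:

# Frame time in ms -> instantaneous FPS: just the reciprocal.
def ms_to_fps(frame_time_ms):
    return 1000.0 / frame_time_ms

print(ms_to_fps(20.0))   # 50.0  -> a 20 ms frame is 50 FPS
print(ms_to_fps(16.7))   # ~59.9 -> a 16.7 ms frame is ~60 FPS
print(1000.0 / 120)      # ~8.33 -> ms per frame at a 120 Hz cap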

EDIT: ZealotKi11er: They use a different method of testing frame-time latency, with a capture card that directly records the graphics card's output.
Quote:

At the heart of our unique GPU testing method is this card, a high-end dual-link DVI capture card capable of handling 2560x1600 resolutions at 60 Hz. Essentially this card will act as a monitor to our GPU test bed and allow us to capture the actual display output that reaches the gamer's eyes. This method is the best possible way to measure frame rates, frame times, stutter, runts, smoothness, and any other graphics-related metrics.
Source

With that said, pcper should have the most accurate frame-time latency testing setup out there.
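As a rough sanity check on why dedicated capture hardware is needed, the raw bandwidth at that resolution is substantial (this assumes uncompressed 24-bit color; the card's actual pixel format may differ):
Code:

# Rough uncompressed bandwidth for a 2560x1600 @ 60 Hz capture.
width, height, hz, bytes_per_px = 2560, 1600, 60, 3  # assumes 24-bit color
print(f"{width * height * hz * bytes_per_px / 1e6:.0f} MB/s")  # ~737 MB/s raw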
 

· Registered · Joined · 3,284 Posts
The biggest problem I have noticed is that I just don't see what their results describe. I have trifire 7970s and just don't see the stutter they are speaking of 80% of the time. Many 680 SLI/690 users report the same thing: they just don't see it.

I personally believe that frame time analysis is far too subjective to be an objective measure of performance.

Microstutter does exist and certainly can ruin an experience. The problem is, microstutter isn't proven to be solely a GPU problem. The number of variables that can affect microstuttering puts it well beyond the threshold of being usable as a scoring metric.

Not only this, but does anyone notice how completely unexplained each result is in each pcper review? There is no "good" or "bad" line drawn with frame times, just the assumption that a percentile at x ms is worse than a lower number, yet they give no actual visual review of what they see on the screen.

If they compared each card visually on screen, reviewed the smoothness without frame latency numbers, then went back and reviewed the latency, it would be a much more solid statistic. Where it stands now, without any sort of "real life" review of what you see, these numbers are pointless.

My biggest beef is the use of a slo-mo camera to capture the stuttering. For the love of god, if you can't see it in real time and must use a 240fps camera to capture it, then clearly it doesn't affect what the end user actually sees.
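For readers unfamiliar with the percentile graphs being criticized above: they are straightforward to reproduce from any frame-time log. A minimal sketch with made-up sample data, not numbers from any review:
Code:

# Build a "frame time by percentile" figure from a frame-time log.
sample_ms = [15.2, 16.1, 15.8, 33.4, 15.5, 16.0, 41.7, 15.9, 16.2, 15.7]

def percentile(data, pct):
    """Frame time at the given percentile (nearest-rank method)."""
    ordered = sorted(data)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

for pct in (50, 90, 99):
    print(f"{pct}th percentile: {percentile(sample_ms, pct):.1f} ms")
# A curve that stays flat is smooth; a sharp rise near the top percentiles
# is the stutter these graphs are meant to expose.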
 

· Gaming Enthusiast · Joined · 2,146 Posts
It's completely subjective. Some people wouldn't notice if their hair was on fire either, but that doesn't mean it's not smoking.

Personally, the AMD footage from reviews like this makes my eyes bleed. Just as SLI does at times with microstutter.
 

· Premium Member · Joined · 10,764 Posts · Discussion Starter · #13
Quote:
Originally Posted by ZealotKi11er View Post

I don't understand this. Are they saying CF is fake?
What they are saying is that CF throws up a lot of frames that only stay on the screen for a very short time (see the very thin line in the bar on the left side of the screenshot), so it is counting frames toward FPS that you don't really see. I don't know how much that matters, but you can clearly see in their screenshots that the Nvidia cards have a more balanced display time for each frame. So you get a 45 ms frame, a 15 ms frame, and a 5 ms frame with AMD, but a 25 ms frame and a 35 ms frame with Nvidia. At least that's how I read it, but I guess we'll have to wait for the full review.
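One way to quantify that imbalance is to compare consecutive frame times; a minimal sketch using the illustrative numbers above (not measured data):
Code:

# Quantifying frame pacing from consecutive frame times (illustrative only).
amd_like    = [45.0, 15.0, 5.0]   # uneven pacing
nvidia_like = [25.0, 35.0]        # more even pacing

def pacing_stats(frame_times_ms):
    avg = sum(frame_times_ms) / len(frame_times_ms)
    # Mean swing between consecutive frames: the "microstutter" feel.
    swings = [abs(a - b) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return avg, sum(swings) / len(swings)

for name, times in (("uneven", amd_like), ("even", nvidia_like)):
    avg, swing = pacing_stats(times)
    print(f"{name}: avg {avg:.1f} ms, frame-to-frame swing {swing:.1f} ms")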


Quote:
Originally Posted by th3illusiveman View Post

This is just getting silly now... how convenient for Nvidia that they test this only when that ridiculously expensive card needs all the justification it can get for its price, since its performance isn't providing any.
They started this testing weeks ago, before Titan was even announced. It has nothing to do with Titan - they didn't even test Titan with it yet.
 

· Registered · Joined · 4,190 Posts
Quote:
Originally Posted by Forceman View Post

What they are saying is that CF throws up a lot of frames that only stay on the screen for a very short time (see the very thin line in the bar on the left side of the screenshot), so it is counting frames toward FPS that you don't really see. I don't know how much that matters, but you can clearly see in their screenshots that the Nvidia cards have a more balanced display time for each frame. I guess we'll have to wait for the full review.


They started this testing weeks ago, before Titan was even announced. It has nothing to do with Titan - they didn't even test Titan with it yet.
Are you telling me that someone who is getting 25-30 FPS, and gets so annoyed by it that they go out and buy another $400 GPU to boost performance, wouldn't notice that they didn't get any? I'm sorry, but if this were true it would have been reported ages ago. They are basically saying that you aren't getting any improvement in FPS when you run CFX, and that can't be right. Add to that, their latency testing (using FRAPS) is worse than any other graph I've seen on the matter.

I have yet to run a CFX solution, but I know the difference was night and day when I put a 2nd card in my system to run Battlefield 3. It would be impossible to miss, and a CFX user would notice the farce.
 

· Adclock.net · Joined · 3,666 Posts
I managed to get MSI Afterburner's frame-time monitoring working. I sit at a constant 8.4 ms per frame when vsynced at 120 Hz, which is about right; I don't go above or below it. Whatever these people are trying to point out, I'm not seeing it.
 

· Never Finished · Joined · 2,523 Posts
Quote:
Originally Posted by jomama22 View Post

The biggest problem I have noticed is that I just don't see what there results produce. I have tri fire 7970s and Just don't see the stutter they are speaking of 80% of the time. Many 680 sli/690 users report the same thing, they just don't see it.
6x0 cards usually have less frame-time latency than AMD setups to begin with, so I wouldn't be surprised by that statement. Your trifire 7970s are probably pumping out so many FPS that microstutter doesn't even matter anymore. Plus, when you're pumping out that much FPS, it's hard to see microstutter, period.
Quote:
I personally believe that frame time analysis is far too subjective to be an objective measure of performance.
Uh... frame-time analysis is objective. You just have to interpret the data to see what it really means. If you have trifire 7970s with microstutter at 5-15 ms, and your monitor has a 60 Hz refresh rate (I see you have 120 Hz, btw), then you're not going to see any of the microstutter, because the raw performance of the cards is able to hide it and your monitor doesn't display anything over 60 FPS.
Quote:
Microstutter does exist and certainly can ruin an experience. The problem is, micro stutter isn't proven to be solely a gpu problem. The amount of variables that can affect microstuttering is well beyond the threshold of being able to use it as a scoring metric.
Well, you're right, to an extent, but if you take a look at all of the websites that do frame-time testing, you should see a pattern: AMD cards, on average, have more microstutter than NV cards, be it single card or not. The drivers being rolled out are slowly fixing that, though, and seeing as we've got 7xxx/6xx until Q4, I wouldn't be surprised if they fix microstutter completely by then.
Quote:
Not only this, but does anyone notice how completely unexplained each result in each review is with pcper? There is no "good" or "bad" line drawn with frame times, just the assumption that percentile @ x Ms is worse then a lower number yet they give no actual visual review of what they see on the screen.

If they Compared each card visually on screen, review the smoothness without frame latenency #s, then went back and reviewed the latency, it would be a much more solid statistic. Where it stands now, without any sort "real life" review of what you see, these are pointless.
Well, it's their first time doing this stuff. Not to mention that they're probably expecting the reader to already have at least some know-how in interpreting frame-time latency graphs. If you know how to read the graphs, then a subjective analysis really isn't needed, because all the data you would need is right in front of you. It's the same thing in the headphone world: if you know how to read the frequency graphs, 500 Hz square waves, 50 Hz square waves, impedance vs sensitivity, etc., then you should know how a headphone sounds just by looking at the graphs. I will agree with you that having both is the best way to do it.
 

· Registered · Joined · 893 Posts
Despite the Nvidia card having a more uniform frame time, the reviewer notes that there could be some sort of smoothing going on before the image is output. Hence why the AMD card often has the lower (best) ms time, though it also has consistent jumps from 0 to ~20 ms. I am not quite sure I understand frame rating wholly, but from what I gather it does not abide by conventional frames per second, since you could have frames that last 20 ms, 30 ms, or even 5 ms. The goal of all graphics cards should be "0" ms, since users want maximum frames without looking at a single frame for too long, which means no frame lag. In the competitive sense, an average of 20 ms frames vs 0 ms frames can be thought of the same way as 20 ms input lag vs 0 ms input lag. You don't want to see one image longer, because by then it may already be too late to react with the next input.

So, no, frame rating is definitely not subjective just because you do not notice frame lag and it doesn't hinder your perceived performance. On a very micro level it actually does, and it should definitely be taken into consideration in the same manner as LCD panel metrics like refresh rate, response time, and input lag. And just because Nvidia has a consistent frame rating that handles microstutter better, it could also be introducing some sort of frame/input lag, which is also bad.
 

· Premium Member · Joined · 10,764 Posts · Discussion Starter · #19
Quote:
Originally Posted by th3illusiveman View Post

Are you telling me that someone who is getting 25-30 FPS, and gets so annoyed by it that they go out and buy another $400 GPU to boost performance, wouldn't notice that they didn't get any? I'm sorry, but if this were true it would have been reported ages ago. They are basically saying that you aren't getting any improvement in FPS when you run CFX, and that can't be right. Add to that, their latency testing (using FRAPS) is worse than any other graph I've seen on the matter.
I'm not telling you anything; I'm just giving my interpretation of what the article says, which is that a lot of frames (when measured at the monitor, which is a significant difference) are on the screen for very small slices of time. And let's be honest, without FRAPS running, how many people can really tell the difference between 60 FPS and 80 FPS?

What they are showing with their screenshots is how long, in each 16.7 ms monitor refresh cycle (at 60 Hz), the frame is displayed. So that may be different from what other reviewers are getting with the FRAPS frame time-stamping. They show the comparison that illustrates that difference - there is still a portion of the display pipeline that occurs after FRAPS time-stamps the frame, so if the card is doing something at that part of the pipeline to affect the display time, FRAPS won't show it.
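A toy illustration of that gap, with made-up numbers: frames that look perfectly paced at the point where FRAPS timestamps them can still reach the screen unevenly if the later pipeline stages delay some frames more than others.
Code:

# FRAPS timestamps frames at the present call; the display pipeline after
# that point can still change when each frame actually hits the screen.
present_ms     = [0.0, 16.7, 33.3, 50.0]   # what FRAPS sees: perfectly even
pipeline_delay = [2.0, 2.0, 14.0, 2.0]     # extra time spent after FRAPS
screen_ms = [p + d for p, d in zip(present_ms, pipeline_delay)]

fraps_deltas  = [b - a for a, b in zip(present_ms, present_ms[1:])]
screen_deltas = [b - a for a, b in zip(screen_ms, screen_ms[1:])]
print("FRAPS frame times:", [f"{d:.1f}" for d in fraps_deltas])    # even
print("On-screen times:  ", [f"{d:.1f}" for d in screen_deltas])   # uneven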
 

· Premium Member · Joined · 2,791 Posts
Well, we'll see. The difference I saw when going from one 6950 to two felt like exactly what the frames suggested -- nearly a 100% increase in performance.

These results just make me feel as though there's something wrong with their methodology.
 