[PCper] Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing - Page 7 - Overclock.net

post #61 of 420 Old 03-27-2013, 11:14 AM
Auto Overclocker
 
criminal's Avatar
 
Join Date: Mar 2008
Location: Alabama
Posts: 10,412
Liked: 690
Quote:
Originally Posted by Alatar View Post

The problem with vsync as a solution is that it only helps while you're pushing more frames than your monitor's refresh rate; once the frame rate dips below that, you still get some of the same symptoms. This should be an issue especially on 120Hz displays, or on systems where you can't push 60fps.

Vsync also adds extra input lag, and just as some people are sensitive to stutter, some people are sensitive to input lag.

It's not really a solution; it replaces one problem with another. Whether you think the problem created by vsync is acceptable is subjective.
Conveniently leaving out the part of the first quoted post where I said that, after taking fps into account (assuming it's fine), you should focus on frame time spikes and so on to really see what kind of experience you will get.

Take average fps as a general performance level, then check out the details about frame time spikes, very short frames, etc.

Exactly. I cannot stand to play some games with vsync turned on. What I experience may not even be noticeable to someone else.

Edit: BTW, thanks Alatar. I think you helped convince me to get a Titan. It is one amazing card! :)

criminal is offline  
post #62 of 420 Old 03-27-2013, 11:14 AM
*cough* Stock *cough*
 
zGunBLADEz's Avatar
 
Join Date: Apr 2012
Location: Chicago
Posts: 2,793
Liked: 95
Quote:
Originally Posted by sikkly View Post

Yes, and one of the biggest ones is vsync. It's a very well-known, common issue. Why are you trying to hold on to your ideas so desperately? Even AMD has come out and said that their drivers need work in this area. I'd be willing to bet that by the 8xx0 cards AMD will have their stuff figured out, and scaling will be much better.

What ideas? The PCper report just didn't hold a candle to my reports back in the other thread? That I was right?

I mean, what do you want me to say now?

BTW: here's a topic on the NVIDIA forums about Titan frame latencies XD
https://forums.geforce.com/default/topic/535411/geforce-drivers/frame-latency-times-on-triple-screen-sli-titans/

Now it's time for them to show us the drivers, right, AMD/NVIDIA?

zGunBLADEz is offline  
post #63 of 420 Old 03-27-2013, 11:28 AM
 
Join Date: Mar 2009
Location: canada
Posts: 224
Liked: 11
Quote:
Originally Posted by Telimektar View Post

But nothing as much as Vsync does. It's just my experience: I almost never feel any input lag when gaming with Vsync off, while it's utterly impossible for me when it's on, with the exception of a few games. If you don't feel it, you're actually very lucky, and I wish I didn't either.

What I don't understand is that almost all console games use Vsync, and I rarely feel input lag on them, or only a small amount. And it's not due to the controller difference, since I also feel the lag in PC games when using a controller.

It is probably due to your ATI card. I get horrible input lag with vsync (4890s), but I can 100% eliminate it by capping fps to 58 or 59.9-60 in each game's config file or with Afterburner.
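
For example, in a Source-engine game the cap can go in autoexec.cfg (the variable name differs per engine, so treat this as an illustration rather than a universal setting):

Code:
// autoexec.cfg (Source engine): cap the framerate just below the
// 60Hz refresh so the pre-rendered frame queue never backs up.
fps_max 59
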
freedumb is offline  
post #64 of 420 Old 03-27-2013, 11:33 AM
Commodore 64
 
SKYMTL's Avatar
 
Join Date: Jun 2008
Location: Montreal
Posts: 641
Liked: 84
Quote:
Originally Posted by sugarhell View Post

Sky, your opinion about the Guru3D boss and his opinion about frame capturing?

Is it worth all this money for frame capturing? Is it 100% accurate? What kind of anomalies can an IO issue create? Thank you

From my understanding, he is using NVIDIA's method and gear, like Anandtech and Tom's are. Maybe I'm wrong, but the process he just described mirrors exactly what Ryan from Anand posted, just condensed down into fewer than 2,000 words.
SKYMTL is offline  
post #65 of 420 Old 03-27-2013, 11:39 AM
New to Overclock.net
 
sugarhell's Avatar
 
Join Date: Apr 2012
Location: 38.051289,23.709791
Posts: 9,310
Liked: 565
Quote:
Originally Posted by SKYMTL View Post

From my understanding, he is using NVIDIA's method and gear, like Anandtech and Tom's are. Maybe I'm wrong, but the process he just described mirrors exactly what Ryan from Anand posted, just condensed down into fewer than 2,000 words.

Actually, I wanted an opinion about the anomalies he refers to, like the IO issues or the software/hardware problems that create anomalies. Is this because the whole frame-capturing approach is immature, or is it just not 100% accurate about the user experience? I know that you will not use frame capturing. What, for you, is the optimal way to do a stuttering review?

sugarhell is offline  
post #66 of 420 Old 03-27-2013, 11:44 AM
*cough* Stock *cough*
 
zGunBLADEz's Avatar
 
Join Date: Apr 2012
Location: Chicago
Posts: 2,793
Liked: 95
Quote:
Originally Posted by sugarhell View Post

Actually, I wanted an opinion about the anomalies he refers to, like the IO issues or the software/hardware problems that create anomalies. Is this because the whole frame-capturing approach is immature, or is it just not 100% accurate about the user experience? I know that you will not use frame capturing. What, for you, is the optimal way to do a stuttering review?

Take his word for it lol..??

Either that or do a fluid live recording..

zGunBLADEz is offline  
post #67 of 420 Old 03-27-2013, 11:44 AM
4.0ghz
 
Telimektar's Avatar
 
Join Date: Apr 2010
Location: France
Posts: 1,425
Liked: 86
Quote:
Originally Posted by freedumb View Post

It is probably due to your ATI card. I get horrible input lag with vsync (4890s), but I can 100% eliminate it by capping fps to 58 or 59.9-60 in each game's config file or with Afterburner.

I'm sure it could be a part of that; I've had problems with that POS since day 1. But I also felt the same thing on my 8800GT and 8600GT before it, and I feel the same thing on my friends' systems with a 6870 and 6850. I don't just feel input lag, I literally see it: the mouse cursor is extremely "floaty" and doesn't respond right away in games with VSync enabled (most of them, anyway).

Telimektar is offline  
post #68 of 420 Old 03-27-2013, 11:56 AM
New to Overclock.net
 
looniam's Avatar
 
Join Date: Apr 2009
Location: on a lake that is erie
Posts: 8,405
Liked: 759
That's it!

I think I will now only browse YouTube videos when it comes time to make my next graphics card purchase.

Oh, everyone else already does that?


"Name as many uses for a brick as you can in one minute." - interview at graphics-chip maker Nvidia for a campaign-manager job
Fermi: it's better to burn out than fade away.
Remember the golden rule of statistics: A personal sample size of one is a sufficient basis upon which to draw universal conclusions.
looniam is offline  
post #69 of 420 Old 03-27-2013, 12:02 PM
4.4Ghz for Netflix
 
Brutuz's Avatar
 
Join Date: Jun 2007
Location: Ballarat, Australia
Posts: 17,190
Liked: 564
Quote:
Originally Posted by Blameless View Post

Maybe, but it's often accurate. Crossfire has huge problems, and not just on the 7000 series.

In my experience with my 6970s, the only games I have where two cards in Crossfire are appreciably better than a single card are Crysis Warhead and Neverwinter Nights 2 (the latter of which does not need the boost). All my other games have the same or less smoothness with two cards. And yes, everything is working normally, and frame rates are increasing as they should. Still, there's no improvement in Crysis, Hawken, or half a dozen other games, because of highly inconsistent frame intervals.

Older cards are often even worse, and it's been this way for ~8 years, at least in AFR mode.

That's weird; my friend's CFX HD6870s have minimal microstutter, and my HD4890 CFX was pulling numbers near a stock GTX 480, with definite improvements in playability vs one card. Generally, I've noticed with both SLI and CFX that if you wait until six months to a year have passed before you actually buy the second card, not only do you save money, but you also usually miss out on the lion's share of issues.
Quote:
Originally Posted by Kane2207 View Post

After spending $400 for a second 7970, why should consumers have to resort to 3rd party software for something expected to work out of the box?

It definitely needs to be an option in CCC rather than requiring additional software, but I will also remind you that there are certain things you need 3rd-party applications for with nVidia too. Both companies have hidden driver options; IMO both should have an "enthusiast" settings panel that lets you alter any and all settings that may affect performance or IQ in their drivers, but that's not likely.
Quote:
Originally Posted by MoGTy View Post

Just a side note some of you might or might not like:
Yeah :rolleyes:

Anyway, this will turn out for the better on both the red and green sides, eventually.

Not sure why this isn't being quoted more... We still need more information on this, more sources, etc. (Yes, I know Anand and others have reported the same thing, but as far as I can tell they were using a similar tool.)

What would be really good is if MS added something to Windows to monitor frame latency, FPS, etc. easily; either that, or game developers add it to the engines they use.
Quote:
Originally Posted by SKYMTL View Post

- FRAPS tells a PART of the story. Not all of it. Frame capture tells PART of the story. But not all of it. If I were you, I'd read beyond the statements and see whether these measurements actually have an impact upon the in-game EXPERIENCE. Some do, many don't.

It's exactly like IPC vs clock speed: you shouldn't get hung up on one or the other. Together you'll get a decent idea of performance, but apart you're not getting enough of the story to really make a good judgement.
Quote:
Originally Posted by sugarhell View Post

Sky, your opinion about the Guru3D boss and his opinion about frame capturing?
Originally Posted by Hilbert Hagedoorn
Actually, it's on par with what I have always stated. See, the reason we started so late with frametime measurements is that FRAPS measures at game time; therefore it's bound to miss some stuff, but it can also show stuff that you as an end user cannot see. However, it is indicative of the issues at hand, just not 100% precise. That's the reason why I really didn't want to use it. But an indication is always better than nothing, and combined with the requests from you guys, I added frametime recording recently.

However, with that in mind -- here's what I have been working on, written down in blog style as part of a future article:

----

For a couple of weeks now I have been working on a method, as exposed on another website, using a framegrabber.

In my testing we have our traditional game PC with the dedicated graphics card installed. We start up a game or benchmark sequence. The game is rendered, passes several stages, and then each rendered frame is ready and served to the monitor. It is precisely at that stage that we make a bypass.

The DVI-DL monitor output cable we connect to a Dual Link DVI distribution amplifier (basically a high-resolution-capable DVI switch). We connect our graphics card to the input. The switch then clones the signal to two outputs: one output we connect to the monitor, and the second output we connect to a framegrabber, aka a video capture card.

Ours is a single-channel, 4-lane PCI Express card with a maximum data rate of 650 MB/sec and support for a maximum canvas of 4k x 4k HD video (we wanted to be a little future-proof), capturing all progressive and interlaced DVI/HDMI modes. This card was 1500 EUR alone.

We are not there yet, though, as we need to place the framegrabber into a PC, of course. Fast is good, so we are using a Z77 motherboard with a Core i7 3770K processor. The encoding process is managed by the processor on the frame grabber in real time, so if IO is handled fast enough we'll have less than 5% CPU utilization while capturing 2560x1440 @ 60Hz streams in real time.

Now we need to save the rendered frames in real time, uncompressed, as an AVI file. And here's the problem:
Capturing at 1920x1080 @ 60 Hz in real time shows IO writes of roughly 200~250 MB/s.
Capturing at 2560x1440 @ 60 Hz in real time shows IO writes of roughly 400~475 MB/s.
Correct - that's ~450 MB each second, continuously (!)
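
Those figures are consistent with an uncompressed 16-bit-per-pixel stream (2 bytes per pixel, e.g. YUV 4:2:2) at 60 Hz -- the pixel format is an assumption, but a quick sanity check in Python lines up with the numbers above:

Code:
# Sanity check of the capture bandwidth figures, assuming an
# uncompressed 16 bpp (2 bytes/pixel, e.g. YUV 4:2:2) stream at 60 Hz.
def capture_rate_mb_s(width, height, bytes_per_pixel=2, fps=60):
    return width * height * bytes_per_pixel * fps / 1e6  # decimal MB/s

print(capture_rate_mb_s(1920, 1080))  # ~249 MB/s -> "200~250 MB/s"
print(capture_rate_mb_s(2560, 1440))  # ~442 MB/s -> "400~475 MB/s"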

The first time I noticed that, yes, I cursed and nearly vomited. At 2560x1440 the only way to tackle the real-time writes without clogging up system IO on the recording PC is to get multiple SATA3 SSDs set up in RAID stripe mode. That will still create some CPU load, though. So there is an easier solution.

We contacted OCZ and asked them to send out the RevoDrive 3 X2. These PCIe 4x based products have their own hardware SSD and RAID controllers, thus removing a lot of overhead from the PC. They can write a sustained 500 MB/sec quite easily. And with 450 MB/sec writes (nearly a full GB for every 2 seconds of recording), you'll need some storage volume space as well. So we got the 700 EUR 480 GB version, which in theory will record 4-5 minutes before it's full.

But that's sufficient for our purposes. While doing all this high-end capturing we see a low CPU overhead of only 3-4%. Why am I so keen on low CPU utilization, you might ask? Because this is precise measuring and analyzing: we want to prevent accidentally recording dropped frames at all times. But yeah, at this point we have spent something like 3500 EUR alone on the frame grabber PC and the switch.


Once we have set up all the hardware, we install the framegrabber. With the supporting software we can now tap into the framegrabber and record the frames fired at us from the game PC.

Recording an AVI, and then what?

Good question. We have the ability to grab, and thus record, all frames fired at the framegrabber PC. We record them in an AVI file. But that alone is not enough, as how are you going to analyze data from an AVI file?
So here the science starts. We leave the framegrabber PC to rest for a minute and go back to the game PC that is rendering our game, benchmark, or whatever.

On the game PC we have installed a small overlay utility with extremely low CPU overhead. Here's what it does: each frame that the GPU renders gets a colored bar assigned. For example:

Frame 1 gets a yellow colored bar at the left.
Frame 2 gets a green colored bar at the left.
Frame 3 gets a red colored bar at the left.
Frame 4 gets a purple colored bar at the left.
Frame 5 gets a blue colored bar at the left.

And so on... so each rendered frame will have a color tag; that's simple enough to understand, right? Now we go back to the frame grabber PC and record our gameplay. The output of the game PC, including the per-frame color tags from the overlay application, is now recorded. Once we look at the AVI file, we can indeed see that each frame we pass has a colored tag on the left side of the frame.
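
A minimal sketch of that tagging scheme in Python (the real overlay draws the bar during rendering, and its sequence may be longer; the 5-color cycle here simply mirrors the example above):

Code:
from itertools import cycle

# Repeating color sequence; one tag per rendered frame.
TAG_COLORS = cycle(["yellow", "green", "red", "purple", "blue"])

def next_tag():
    # Called once per rendered frame, just before it is presented:
    # the overlay draws a bar of this color at the left edge.
    return next(TAG_COLORS)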

Going deeper
But that is still not enough, right? So here's where I'll simplify the explanation a little bit. We now fire off a Perl script at the AVI. The Perl script will analyze the AVI file: each frame has a certain latency and each frame has a certain color, and that way it can differentiate and distinguish frames, and thus values, from each other. It will output the data in an XML file. And once the data is in an XML file, we can chart it.

We fire off another Perl script to make a nice chart out of the XML data and boom... we have output that represents the frame experience you see and observe on your monitor.
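
The core of that analysis step, sketched in Python rather than Perl (the actual scripts aren't public; the per-frame tag colors are assumed to come from a hypothetical helper that samples the overlay bar in each captured frame, e.g. with OpenCV): since the capture runs at the display's refresh rate, every captured frame spans one refresh interval, so a rendered frame's on-screen time is the length of the run of captured frames carrying its color tag.

Code:
REFRESH_INTERVAL_MS = 1000.0 / 60  # capturing at 60 Hz

def frame_times_ms(tag_colors):
    """Per-rendered-frame display times, from the sequence of
    per-captured-frame overlay tag colors."""
    if not tag_colors:
        return []
    times, run = [], 1
    for prev, cur in zip(tag_colors, tag_colors[1:]):
        if cur == prev:
            run += 1  # same rendered frame still on screen
        else:
            times.append(run * REFRESH_INTERVAL_MS)
            run = 1
    times.append(run * REFRESH_INTERVAL_MS)
    return times

# ['yellow','yellow','green','red','red','red'] -> [33.3, 16.7, 50.0]
# Note: a full FCAT-style analysis also measures partial bars within a
# single captured frame, to catch tearing and runt frames with vsync off.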


So the above is just a quick part of that article. Unfortunately, we are running into many software- and hardware-related issues. See, if we want to catch frame experience / stutter issues, then the above method is the only valid one. Unfortunately, there is so much hardware and software involved that I currently see anomalies in the charts that should not be there. Even the RevoDrive 3 X2 is not fast enough, as we see IO issues causing frame drops... and that is the one thing that may not happen.

It will take a while (months) to get this refined. However, I am fighting another problem: my work week is already 60+ hours, and the methodology described above seriously EATS time in tremendous amounts. So we're not sure how, if, and when this new method will become effective. It would be a 2-3 page addition to (on top of) our current reviews, next to average framerates, for the sole purpose of hunting down graphics anomalies.

Okay... this post is too long... but that said, we are working on a method that is accurate: measuring at the DVI output is literally what you'll see on the monitor, and thus on the charts. It is, however, 10x more complicated and very time-consuming.

Is it worth all this money for frame capturing? Is it 100% accurate? What kind of anomalies can an IO issue create? Thank you

He's using a 3770K according to that post, along with a PCIe-based frame-grabber and a PCIe-based SSD... I want to know if that's going to affect FPS etc. badly due to using extra PCIe bandwidth. I know AMD cards require more PCIe bandwidth than nVidia cards in a multi-GPU setup, as the SLI connector has double the bandwidth (1GB/s vs 500MB/s, IIRC), and even with a PLX bridge you've only got so many lanes to the CPU. If the frame-grabber and SSD need to talk to it enough, that could definitely negatively affect the results. While I'm grateful as all hell that they've spent that coin on good testing measures, it'd be much better if they had a Socket 2011 system IMO: stick the SSD and frame-grabber in 8x slots and leave two 16x slots for the GPUs. Of course, I may be wrong... but I do know that I noticed a decent improvement in game playability when I put a second CFX bridge on my HD4890 CFX setup, rather than just having the single one. (Too old to really try to interpret with newer cards, IMO, but it shouldn't be ignored... Someone needs to test HD7970 CFX with two bridges vs one and see if that changes anything.)

Brutuz is offline  
post #70 of 420 Old 03-27-2013, 12:05 PM
Commodore 64
 
SKYMTL's Avatar
 
Join Date: Jun 2008
Location: Montreal
Posts: 641
Liked: 84
Quote:
Originally Posted by sugarhell View Post

Actually, I wanted an opinion about the anomalies he refers to, like the IO issues or the software/hardware problems that create anomalies. Is this because the whole frame-capturing approach is immature, or is it just not 100% accurate about the user experience? I know that you will not use frame capturing. What, for you, is the optimal way to do a stuttering review?

Ah. Well, I think it raises another, relatively large question about any benchmarking process: at some point, some piece of hardware will become a bottleneck. In the case of frame capturing, it could be the writing device or part of the system it's installed into: the CPU, RAM, or OS install of the host PC, etc. The same goes for the FRAPS method, which eliminates the capture PC but introduces its own variables, as discussed by Ryan.

Simply put, it's impossible to eliminate every variable. Rather, reviewers can do their best to reduce outside factors from becoming an influence upon the final result.

On the main test system, that means running a fast CPU, enough memory, a decently fast storage solution, and a resolution that's appropriate for the card being benchmarked. Running a stock CPU and a $1000 GPU solution at 1080P is a damn joke, if nothing else. In addition, as has recently been evident in Bioshock Infinite and some other titles, stutter can be caused by MANY outside factors (hard drive / SSD, CPU, memory bottleneck, etc.), and there's just no way to know for certain what an issue's true root cause is.

Some may argue that frame capture adds more variables into the equation, since certain tangible latency increases can insert themselves into it. That's true, but there's currently no other way to accurately capture output-side information for GPU reviews.
SKYMTL is offline  