Overclock.net › Forums › Industry News › Hardware News › [PCper] Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing

[PCper] Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing - Page 19

post #181 of 420
Quote:
Originally Posted by Kuivamaa View Post

When I say he has a point, I mean that what we see isn't apples to apples. I don't know what is at play here, but the setups in question don't render the same thing. It is very evident in Sleeping Dogs and less evident (but still valid) in Crysis 3. In Sleeping Dogs, CFX stuttered more even indoors, where the image seemed similar. Nevertheless, I don't know who, where, or what is responsible, but the graphical difference is enough to botch these particular tests.

Not the first time there have been image differences between the two companies, so I wouldn't jump to the Nvidia-cheating conclusion any more than the AMD-cheating assumption about runt frames. I'd like to see some comparison videos from TR or AnandTech, though. It's also unclear how those differences would affect latency, as opposed to just performance.
post #182 of 420
I mean, from my experience back in the day, Nvidia was known to skimp a little on graphical quality in exchange for better FPS and frame-time delivery, whereas ATi rendered full scenes with full graphical detail at the expense of frame rate. So, all in all: Nvidia = skimp a little on the edges for a smoother experience; ATi = skimp a little on frame rates for graphics quality. Again, this is my perspective from both companies' pasts, and it's kind of what the graphs are saying regardless.
post #183 of 420
Quote:
Originally Posted by Sm0keydaBear View Post

I mean, from my experience back in the day, Nvidia was known to skimp a little on graphical quality in exchange for better FPS and frame-time delivery, whereas ATi rendered full scenes with full graphical detail at the expense of frame rate. So, all in all: Nvidia = skimp a little on the edges for a smoother experience; ATi = skimp a little on frame rates for graphics quality. Again, this is my perspective from both companies' pasts, and it's kind of what the graphs are saying regardless.

Crysis 3 is hard to notice, but if you are looking for it you will find it. As a matter of fact, I couldn't care less; in motion it is not that noticeable, except when you put both side by side. Knowing how demanding Crysis 3 is on shadows and LoD/DoF, this is cheating nonetheless...



Not as noticeable as in Sleeping Dogs, though...
post #184 of 420
Are you sure the YouTube compression isn't affecting it? Even if you look at the HD stream, it is still compressed, and it just looks like a blurry mess to me. And again, just because there is a difference doesn't necessarily mean anyone is cheating: the graphics output is never identical, even with the same settings, because both sides optimize for their own pipelines.

Edit: I just watched the C3 video again, and I think the out-of-sync factor is huge. It is very hard to compare since it is never the exact same scene on both sides, and the textures resolve at different times. If you look at the portion around 0:12 where he's standing in the same spot holding the gun, both sides look pretty much the same (as far as I can tell). Anywhere there is movement, the out-of-sync-ness messes up the comparison. It looks the same at the top of the waterfall near the end, too.
Edited by Forceman - 3/28/13 at 4:46pm
post #185 of 420
Quote:
Originally Posted by Forceman View Post

Are you sure the YouTube compression isn't affecting it? Even if you look at the HD stream, it is still compressed. And again, just because there is a difference doesn't necessarily mean anyone is cheating: the graphics output is never identical, even with the same settings, because both sides optimize for their own pipelines.

Look at the Sleeping Dogs video; don't full-screen it and you'll catch the differences more quickly.

Light sources, textures: look at how the light reflects on the floor when the camera is panning on CFX. Nvidia looks all blurry in all the games tested, for some weird reason.

I even noticed roaches on the floor, lol.
post #186 of 420
Yes, Sleeping Dogs is obvious with the missing light sources. I wonder if that's consistent across all the cards (does the 680/7970 video show the same thing, for example?) or whether it's a dynamic game quirk.
post #187 of 420
Quote:
Originally Posted by Forceman View Post

Yes, Sleeping Dogs is obvious with the missing light sources. I wonder if that's consistent across all the cards (does the 680/7970 video show the same thing, for example?) or whether it's a dynamic game quirk.

Even if it's a dynamic game quirk, why are the two videos out of sync? You can't really compare. In one, the character just walks forward; in the other, the character turns the camera.
post #188 of 420
In Crysis 3, it's just a matter of replaying the video over and over to notice the differences; I just pointed out where to look in my pic.
post #189 of 420
I know, that's my point. You can't use the videos as posted to compare the quality differences. You'd need static screenshots for that. But the videos aren't supposed to be showing quality differences, they are supposed to be showing the "smoothness" differences, so we shouldn't get too wrapped up in it.
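On the screenshot point: if you did have static, frame-synced captures from both cards, quantifying the quality gap is straightforward. Below is a minimal sketch in Python/NumPy of a per-pixel comparison; `frame_diff` is a hypothetical helper, not anything PCPer or the frame-rating tools actually use, and it assumes both screenshots are the same resolution and capture the exact same scene.

```python
import numpy as np

def frame_diff(a: np.ndarray, b: np.ndarray) -> float:
    """Mean absolute per-pixel difference between two same-sized RGB frames (0-255).

    0.0 means the screenshots are identical; larger values mean a bigger
    average difference per channel. Cast to int16 first so subtraction of
    uint8 values can't wrap around.
    """
    if a.shape != b.shape:
        raise ValueError("screenshots must be the same resolution")
    return float(np.mean(np.abs(a.astype(np.int16) - b.astype(np.int16))))

# Hypothetical example: two tiny 4x4 "screenshots" of a flat gray scene.
ref = np.full((4, 4, 3), 100, dtype=np.uint8)
test = ref.copy()
test[0, 0] = 160  # one brighter pixel, e.g. a light source rendered on only one card

print(frame_diff(ref, ref))   # identical frames -> 0.0
print(frame_diff(ref, test))  # small nonzero score from the single changed pixel
```

In practice you'd also want a perceptual metric (e.g. SSIM) rather than raw pixel differences, since compression noise alone would push a raw diff above zero; the point is only that screenshots make the comparison measurable, where out-of-sync video does not.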
post #190 of 420
Quote:
Originally Posted by Forceman View Post

I know, that's my point. You can't use the videos as posted to compare the quality differences. You'd need static screenshots for that. But the videos aren't supposed to be showing quality differences, they are supposed to be showing the "smoothness" differences, so we shouldn't get too wrapped up in it.

How can you show the smoothness when one card is showing more stuff on screen than the other? That's my point...

I mean, if I turn shadows from Very High to High on my rig, the difference in performance is night and day.
Do I notice a visual difference? It's very hard to tell if you're not really looking for it.


Am I saying AMD/Nvidia don't have problems with multi-GPU setups? No...