Overclock.net › Forums › Industry News › Hardware News › [PCGHW]Fallout 4 in art test with benchmarks for release

[PCGHW]Fallout 4 in art test with benchmarks for release - Page 28

post #271 of 701
Hopefully the drivers are fixed before December...
post #272 of 701
Quote:
Originally Posted by MadRabbit View Post

Or you know, reviewers could make two different pages then? One with AMD vs. Nvidia without GW, and then one for GW on Nvidia to show what you "win" by turning those options on in the first place. As for this game, you win absolutely nothing turning god rays to ultra, but it sure as heck gimps AMD (and even Nvidia to some extent).

And that is something reviewers need to consider. But they will also need to include image quality comparisons with and without. Since I'm not a reviewer, I just need to take everything into account.
And regarding god rays, I don't know. Maybe it is very subtle, but from gameplay I have seen, in some cases god rays add a bit of depth indoors. It is very subtle, and I'm sure most can skip it. It is like going from 8xAA to 4xAA: the difference is almost nonexistent, especially on high-DPI monitors, yet people will swear on their dead grandma that they see every single pixel and can't play without 8xAA, else they will get a stroke or something.
Main system (16 items): CPU: E5-1680v2 · Graphics: 2× AMD FirePro D700 · RAM: 64GB 1866MHz · Hard Drive: 1TB PCIe SSD · OS: OS X 10.10.x · Monitor: Dell U2713H · Case: Mac Pro
Editing PC (8 items)
post #273 of 701
Quote:
Originally Posted by Klocek001 View Post

That's the whole problem: most reviews are just sloppy. First of all, the right way to do it is to find the quality settings at which the card will run at 60 fps, not turning everything on and then comparing. Nvidia publishes whole pages of interactive comparisons with the estimated framerate toll on their website; even they tell you to use only the GW features you want and need, not to turn on too many and ruin performance. GW features should not be treated as obligatory, but high-end cards should have benches with both GW on and GW off. I remember running 1440p/60fps on my 290 was all about testing the settings, and in the end it was possible to achieve great visual quality and smooth gameplay; just don't turn everything on blindly. The only game it couldn't handle was TW3; even medium settings with GW off couldn't hold 60 fps.

Or the dev should do what R* did with GTA V: add two pipes, one for Nvidia and one for AMD. See, problem solved.

@Defoler Now we're getting somewhere. Basically it all comes down to the devs and the reviewers. Both are either too lazy to do it or just don't want to. Nvidia offers the options; beyond that it's the dev's choice how to implement them.
Edited by MadRabbit - 11/10/15 at 12:46am
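The "two pipes" idea boils down to picking a vendor-specific effects path at load time instead of forcing one vendor's middleware on everyone. A minimal sketch of what that dispatch could look like — the PCI vendor IDs are real, but the path names and function are made up for illustration:

```python
# Hypothetical sketch of the "two pipes" idea: select a vendor-tuned
# volumetric-lighting path at startup. 0x10DE and 0x1002 are the real
# PCI vendor IDs for Nvidia and AMD; the path names are invented.

NVIDIA_PCI_ID = 0x10DE
AMD_PCI_ID = 0x1002

def select_effects_path(vendor_id: int) -> str:
    """Return which god-ray implementation to load for this GPU."""
    if vendor_id == NVIDIA_PCI_ID:
        return "gameworks_godrays"   # tessellation-heavy Nvidia path
    if vendor_id == AMD_PCI_ID:
        return "gcn_godrays"         # compute-shader path tuned for GCN
    return "generic_godrays"         # safe fallback for anything else

print(select_effects_path(0x1002))  # → gcn_godrays
```

The cost, of course, is that the dev now maintains and QAs two render paths instead of one, which is exactly the work most studios skip.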
AMD (13 items): CPU: AMD FX-8350 · Motherboard: Asus M5A99FX Pro · Graphics: 2× MSI Radeon R9 280X · RAM: Crucial Ballistix 8GB DDR3 · Hard Drive: Crucial MX100 128GB SSD · Optical Drive: Samsung DVD-RW · Cooling: Cooler Master Hyper 212 Evo · OS: Windows 10 Technical Preview · Monitor: Philips 55PFS6909/12 · Keyboard: Logitech MX3200 · Power: Chieftec 750W · Mouse: Khaos Limited Edition
Intel (7 items): CPU: Intel Core i5-4200U · Motherboard: Acer BA50 · Graphics: AMD HD 8750M · RAM: 4GB DDR3 · Hard Drive: 750GB HDD · OS: Windows 10 TP · Monitor: 15.6"
Home Server (11 items): CPU: 2× AMD Opteron 2373EE · Motherboard: Dell Socket Fr5 · Graphics: XGI Z9s with 32MB DDR2 VRAM · RAM: 32GB DDR2 ECC · Hard Drive: 1TB HDD 7200rpm · Cooling: Passive · OS: Ubuntu Server · Power: 600W · Case: Dell PowerEdge CS24-NV7
post #274 of 701
Quote:
Originally Posted by Defoler View Post

And that is something reviewers need to consider. But they will also need to include image quality comparisons with and without. Since I'm not a reviewer, I just need to take everything into account.
And regarding god rays, I don't know. Maybe it is very subtle, but from gameplay I have seen, in some cases god rays add a bit of depth indoors. It is very subtle, and I'm sure most can skip it. It is like going from 8xAA to 4xAA: the difference is almost nonexistent, especially on high-DPI monitors, yet people will swear on their dead grandma that they see every single pixel and can't play without 8xAA, else they will get a stroke or something.

A lot of times it's just the placebo/nocebo effect at play.
post #275 of 701
Quote:
Originally Posted by Klocek001 View Post

That's the whole problem: most reviews are just sloppy. First of all, the right way to do it is to find the quality settings at which the card will run at 60 fps, not turning everything on and then comparing. Nvidia publishes whole pages of interactive comparisons with the estimated framerate toll on their website; even they tell you to use only the GW features you want and need, not to turn on too many and ruin performance. GW features should not be treated as obligatory, but high-end cards should have benches with both GW on and GW off. I remember running 1440p/60fps on my 290 was all about testing the settings, and in the end it was possible to achieve great visual quality and smooth gameplay; just don't turn everything on blindly. The only game it couldn't handle was TW3; even medium settings with GW off couldn't hold 60 fps.

On one hand, I agree. Just turning on GameWorks settings blindly, or any settings for that matter, is silly.
But on the other hand, you can't compare overall performance with different settings. It will just be apples vs. oranges, because image quality and the environmental look will be vastly different. So comparing apples to apples (same image quality, same environment) is more accurate.
If I want to run everything maxed, I want to know which card will be the best for me (apples vs. apples). And if that is too much, I want to know which card is right for me with everything at medium (oranges vs. oranges).

You can turn all settings off or to low quality and run at 120 fps on a 7970. That doesn't mean anything in comparison to a Fury X or a 980 Ti based on FPS alone, because image quality will be vastly different.
But also, toggling every single setting on and off for every card can take several days to test, if not weeks. And I doubt reviewers have that much time to spend testing every card to give 100% accurate results. Especially if a new driver suddenly comes out and you need to start all over again; at that rate, by the end of the year a reviewer might not even finish. So they have to be semi-sloppy to get the review out.
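To put rough numbers on that testing burden, here's a back-of-envelope sketch. Every count below (cards on the bench, settings worth isolating, run lengths) is an assumption for illustration, not a real lab figure:

```python
# Rough back-of-envelope for a reviewer's testing burden.
# All counts are assumptions for illustration, not real lab numbers.

cards = 10               # GPUs on the test bench
settings = 8             # toggles worth isolating (god rays, HBAO+, ...)
levels_per_setting = 3   # off / medium / ultra
runs_per_config = 3      # repeat runs to average out variance
minutes_per_run = 5      # one benchmark pass plus setup

# Isolating each setting one at a time while holding the rest fixed:
configs = cards * settings * levels_per_setting
total_minutes = configs * runs_per_config * minutes_per_run
print(f"{configs} configs, {total_minutes / 60:.0f} hours of benching")
# → 240 configs, 60 hours of benching
```

And that's one game on one driver; a driver update resets the whole matrix, which is why per-setting breakdowns for every card rarely happen.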
post #276 of 701
Quote:
Originally Posted by Defoler View Post

On one hand, I agree. Just turning on GameWorks settings blindly, or any settings for that matter, is silly.
But on the other hand, you can't compare overall performance with different settings. It will just be apples vs. oranges, because image quality and the environmental look will be vastly different. So comparing apples to apples (same image quality, same environment) is more accurate.
If I want to run everything maxed, I want to know which card will be the best for me (apples vs. apples). And if that is too much, I want to know which card is right for me with everything at medium (oranges vs. oranges).

You can turn all settings off or to low quality and run at 120 fps on a 7970. That doesn't mean anything in comparison to a Fury X or a 980 Ti based on FPS alone, because image quality will be vastly different.
But also, toggling every single setting on and off for every card can take several days to test, if not weeks. And I doubt reviewers have that much time to spend testing every card to give 100% accurate results. Especially if a new driver suddenly comes out and you need to start all over again; at that rate, by the end of the year a reviewer might not even finish. So they have to be semi-sloppy to get the review out.

Tell that to Kyle/Brent over at HardOCP lol

To be fair, they do include an "apples to apples" section, but for the most part it's apples vs. oranges.

(In case it's not clear, I'm agreeing with the part in bold, but clearly two editors at a major hardware review site think otherwise.)
post #277 of 701
Runs fine for me on my R9 290, 130 fps at 1440p.
CPU: i7-7820HK Kaby Lake @ 4.2GHz · Graphics: GTX 1070 @ 2GHz boost, 9008MHz VRAM · RAM: 16GB (2×8) DDR4-2133 · Hard Drive: Samsung 960 EVO 500GB NVMe · Hard Drive: 5TB Seagate 2.5" 5400RPM · Cooling: Grizzly Conductonaut on GPU and CPU · OS: Win 10 Bloatware Free Edition · Monitor: 17.3" 1080p IPS OC'd to 100Hz, G-Sync
post #278 of 701
Quote:
Originally Posted by huzzug View Post

Wow. Guess AMD will catch up as usual. Would like to know how the game is. Also, since when did people start referring to the GTX 970 as a 3.5G + 0.5G card?

Since it was revealed that it has a 224-bit + 32-bit memory bus and you can only read or write from one segment at a time (so reading from the 32-bit segment would lock out the 224-bit segment for that cycle, making it almost never worth using).
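The bandwidth math behind the "3.5G + 0.5G" label is straightforward. A quick sketch, assuming the GTX 970's stock 7 Gbps effective memory data rate:

```python
# Peak-bandwidth math behind the GTX 970's "3.5G + 0.5G" label.
# 7 Gbps effective per pin is the card's stock memory data rate.

DATA_RATE_GBPS = 7.0  # effective transfer rate per pin

def seg_bandwidth(bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s for a memory segment of the given bus width."""
    return bus_width_bits / 8 * DATA_RATE_GBPS

fast = seg_bandwidth(224)  # 3.5 GB segment
slow = seg_bandwidth(32)   # 0.5 GB segment
print(fast, slow)  # → 196.0 28.0
```

So the last 0.5 GB tops out around 28 GB/s versus 196 GB/s for the main partition — and since accessing it stalls the fast segment for that cycle, drivers try to avoid touching it at all.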
Insert Name Here (14 items): CPU: 6700K · Motherboard: Asus Maximus VIII Hero · Graphics: Gigabyte Aorus Xtreme 1080 Ti · RAM: Corsair LPX 2×8GB 3200C16 · Hard Drive: Old Seagate HDD · Hard Drive: Samsung 850 EVO · Cooling: Thermalright Silver Arrow SB-E SE · OS: Windows 7 Home Premium 64-bit · Monitor: Asus PG258Q (240Hz + G-Sync) · Keyboard: WASDKeyboards.com v1 semi-custom w/ MX Browns, ... · Power: Super Flower Golden Green HX550 · Case: Air 540 · Mouse: Logitech G Pro · Mouse Pad: QcK+
post #279 of 701
Quote:
Originally Posted by MadRabbit View Post

@Defoler Now we're getting somewhere. Basically it all comes down to the devs and the reviewers. Both are either too lazy to do it or just don't want to. Nvidia offers the options; beyond that it's the dev's choice how to implement them.

I definitely agree.
Implementation of the tech, how it's used, bugs, and many performance issues are most of the time problems coming from the game's development, and less from either AMD or Nvidia.
If through the config files you can tweak settings that are not part of the in-game menus, and those settings cause major havoc, then most of them were just implemented badly.
post #280 of 701
Quote:
Originally Posted by magnek View Post

Tell that to Kyle/Brent over at HardOCP lol

To be fair they do include an "apples to apples" section but for the most part it's apples vs oranges.

(in case it's not clear I'm agreeing with the part in bold, but clearly 2 editors at a major hardware review site think otherwise)

My main problem with them is that they only show pure numbers. FPS alone means nothing if you reduce image quality or visual add-ons for performance without being very clear and visual about it.
If going apples vs. oranges, I want them to add image quality comparisons. If the images are 99.99% identical, that makes a valid point. When they are not, the apples vs. oranges comparison is completely useless.