The concept of a benchmark is well understood: run a predefined set of tests on various hardware under various conditions and compare the results.

When it comes to benchmarking a GPU... there are a lot of variables. The 3DMark series seems to do a good job of removing most of these, and we all generally know that AA and AF should be 'off' for these tests.

So... what is the accepted LOD setting for benching, in the context of the one found under the driver tweaks in ATITool?

If I set mine to 10, the resulting FPS and bench results are significantly higher due to the reduced quality. If I set it to -10, it runs like a dog. For gaming, I have found -1 to give the best balance between FPS and quality.
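For anyone wondering why that one number swings the framerate so much: the LOD slider is essentially a mipmap LOD bias. Here's a rough sketch of the idea (my own illustration with made-up numbers, not the actual driver code): a positive bias pushes sampling toward the tiny low-resolution mips (less texture bandwidth, higher FPS, blurry output), while a negative bias forces the full-resolution mips everywhere (sharp, but slow).

```c
/*
 * Illustrative sketch only -- not ATITool's or the driver's actual code.
 * Shows how a LOD bias shifts which mipmap level gets sampled.
 */
#include <math.h>
#include <stdio.h>

/* base_lod: level the hardware would normally pick from the on-screen
 * texture footprint; bias: the user/driver tweak; num_levels: mip count. */
static int select_mip_level(float base_lod, float bias, int num_levels)
{
    float lod = base_lod + bias;            /* bias shifts the whole curve     */
    if (lod < 0.0f) lod = 0.0f;             /* clamp to sharpest mip (level 0) */
    if (lod > (float)(num_levels - 1))      /* ...and to the smallest mip      */
        lod = (float)(num_levels - 1);
    return (int)floorf(lod + 0.5f);         /* nearest mip, for simplicity     */
}

int main(void)
{
    const int levels = 11;   /* e.g. a 1024x1024 texture: mips 1024 down to 1 */
    float base = 2.0f;       /* say the hardware would normally pick mip 2    */

    /* +10 lands on the tiny mips (fast, ugly); -10 stays at full res (slow). */
    printf("bias +10 -> mip %d\n", select_mip_level(base,  10.0f, levels));
    printf("bias   0 -> mip %d\n", select_mip_level(base,   0.0f, levels));
    printf("bias -10 -> mip %d\n", select_mip_level(base, -10.0f, levels));
    return 0;
}
```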

But I see people with similar equipment getting much higher bench results... as a result, I have spent a lot of time investigating and tweaking my system.

Are we really comparing apples with apples? Should we not be disclosing LOD settings... or do we all assume a default value of 10? Considering the dramatic impact on performance, I would put it out there that benchmark scores and comparisons are completely useless without knowing what settings are in place.

I could provide all my benches with LOD=10, but it'd be a bit of a farce, as games look like absolute rubbish at that setting.
edit: woops, please move to ATI