Originally Posted by Carniflex
I can already imagine the mental gymnastics involved when rationalizing why a nVidia GPU should be combined with AMD CPU
Anyway, on topic - I'm pretty sure that if nVidia comes out to call out Intel over a specific benchmark, they have triple-checked their argument. This is not a claim to be made off-hand; it needs something backing it up that third parties can reproduce reliably (and those affected WILL most likely check). This is not exactly a field where you can get away with just a catchy headline that leaves an impression while no one looks up afterwards what the end result of the dispute was, like in average "consumer" stuff.

None of the relevant companies (AMD, Intel, or nVidia) have a squeaky clean background as far as fairness in benchmarks goes.
Yeah... you're right... ATI (now part of AMD) once used a texture filtering tweak, back in 2004, which was a mixture of bilinear and trilinear filtering (though closer to trilinear filtering), dubbed "Brilinear" filtering by the tech media.
Ironically... this "brilinear" filtering is now an industry standard. Unless you set texture filtering quality to "High Quality" in your nVIDIA control panel, you're using brilinear. Same goes for AMD: unless you switch Texture Filtering Quality to "High", you're using brilinear.
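For anyone curious what the tweak actually does under the hood, here's a rough Python sketch of the idea. Trilinear blends the two nearest mip levels across the whole fraction between them; brilinear snaps to pure bilinear near each mip level and only blends in a narrow band in between (the 0.25 band width here is made up for illustration - real drivers pick their own thresholds):

```python
def trilinear_weight(lod):
    # Trilinear: blend factor between the two nearest mip levels
    # is simply the fractional part of the LOD.
    return lod - int(lod)

def brilinear_weight(lod, snap=0.25):
    # "Brilinear": snap toward pure bilinear near each mip level,
    # blending only in a narrow band between them. Fewer texels
    # fetched on average, and the transition is hard to spot.
    f = lod - int(lod)
    if f < snap:
        return 0.0                      # pure bilinear, lower mip
    if f > 1.0 - snap:
        return 1.0                      # pure bilinear, upper mip
    # Rescale the remaining band to a full 0..1 blend.
    return (f - snap) / (1.0 - 2.0 * snap)
```

In the snapped regions the sampler only touches one mip level (4 texel fetches instead of 8), which is where the speedup comes from.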
Another optimization AMD has used is FP16 demotion. Basically... FP16 demotion means using a 32-bit render target instead of a 64-bit render target in HDR rendering. You cannot tell the difference in image quality. Good luck trying. But the difference is there in that the GPU is doing less work. Once again... nVIDIA has implemented this same feature in its drivers. The difference? You can disable it in AMD's drivers (it used to live under Catalyst AI and is now called "Surface Format Optimization"), while you cannot disable it on nVIDIA hardware.
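To put numbers on why demotion helps, here's some back-of-the-envelope Python. A 64-bit HDR target is typically RGBA16F (4 channels x 16-bit float); a common 32-bit substitute is a packed format like R11G11B10F. The formats here are my own illustrative picks, not a claim about which specific formats any driver swaps in:

```python
def target_bytes(width, height, bits_per_pixel):
    # Raw memory footprint of a render target (no compression).
    return width * height * bits_per_pixel // 8

full    = target_bytes(1920, 1080, 64)  # RGBA16F at 1080p
demoted = target_bytes(1920, 1080, 32)  # e.g. packed R11G11B10F
# Half the footprint, so roughly half the bandwidth every time
# the GPU reads or writes that target during HDR post-processing.
```

At 1080p that's about 15.8 MiB vs 7.9 MiB per target, multiplied across every pass that touches it - which is why the speedup shows up in benchmarks even though the image looks the same.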
Now, if people consider these tweaks/optimizations tantamount to cheating, then what will they say about nVIDIA's entire Maxwell/Pascal architectures using tile-based rendering? Meaning that they only render the portions of the scene that can be seen. Of course, like the other aforementioned tweaks, the nVIDIA GPUs are doing less work, yet you cannot tell the difference IQ-wise.

Edited by Mahigan - 8/18/16 at 10:10pm
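For a rough picture of the tile-based idea, here's a minimal Python sketch of the binning step: the screen is split into tiles and each triangle is assigned only to the tiles it might touch, so tiles a triangle doesn't cover get no work for it at all. The 16x16 tile size and the axis-aligned bounding-box test are simplifications for illustration, not what the actual hardware does:

```python
TILE = 16  # illustrative tile size in pixels

def bin_triangles(tris, width, height):
    # tris: list of triangles, each ((x0,y0),(x1,y1),(x2,y2))
    # in pixel coordinates. Returns {tile coord: [triangle ids]}.
    bins = {}
    for i, tri in enumerate(tris):
        xs = [p[0] for p in tri]
        ys = [p[1] for p in tri]
        # Conservative bounding-box coverage, clamped to the screen.
        tx0 = max(int(min(xs)) // TILE, 0)
        tx1 = min(int(max(xs)) // TILE, (width - 1) // TILE)
        ty0 = max(int(min(ys)) // TILE, 0)
        ty1 = min(int(max(ys)) // TILE, (height - 1) // TILE)
        for ty in range(ty0, ty1 + 1):
            for tx in range(tx0, tx1 + 1):
                bins.setdefault((tx, ty), []).append(i)
    return bins  # only touched tiles ever get rasterization work
```

The payoff is that each tile's framebuffer traffic can stay in on-chip cache while that tile is processed, instead of hammering VRAM - less work and less bandwidth, with an identical final image.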