Yup, don't try to defend without arms.
Originally Posted by Sideburns
Uh...sorry bud, but no.
More or less, ATI has had the lead in performance per dollar for some time. On top of that, the only reason our lovely Nvidia cards do so well in some benchmarks is that Nvidia cheats and sets the texture slider to Quality instead of High Quality. Here is a good comparison of exactly what you are getting at the same settings versus the eye candy that ATI delivers:
Hello guys, my first time posting on the Xbit forums...
There is a serious image-quality issue here: the settings used for testing Nvidia cards, namely the default 'Quality' image setting, give them anywhere from a 10% to 40% performance boost. That default leaves Trilinear Optimization and Anisotropic Sample Optimization enabled, which degrades image quality to levels that cannot be reproduced on ATI cards: flickering of mipmaps caused by the aggressive trilinear and AF optimizations (heavy texture shimmering that cannot be seen in captured screenshots, only in motion). Turning these optimizations off can cost as much as 40% of performance in UT2004.
Nvidia's default Quality setting offers much worse texture filtering than ATI's default setting (Catalyst AI Normal, filtering at Quality).
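To make it concrete what the trilinear optimization actually does, here is a minimal C sketch (my own illustration, not actual driver code; the band width is a number I picked for the example). Full trilinear blends the two nearest mip levels across the whole transition, while a "brilinear"-style optimization shrinks the blend to a narrow band and uses plain bilinear everywhere else; the hard edges of that band crawl as the camera moves, which is the mip shimmering you only see in motion:

/* Full trilinear: blend the two nearest mip levels across the
   entire fractional-LOD range (weight 0.0 at mip N, 1.0 at mip N+1). */
float trilinear_weight(float lod_frac)
{
    return lod_frac;
}

/* "Brilinear"-style optimization (band width is illustrative only):
   blend only inside a narrow band around the mip transition and fall
   back to plain bilinear elsewhere. Cheaper, but the band edges move
   with the camera, showing up as mip-boundary shimmering. */
float brilinear_weight(float lod_frac, float band) /* e.g. band = 0.25f */
{
    float lo = 0.5f - 0.5f * band;
    float hi = 0.5f + 0.5f * band;
    if (lod_frac <= lo) return 0.0f;  /* pure bilinear on mip N   */
    if (lod_frac >= hi) return 1.0f;  /* pure bilinear on mip N+1 */
    return (lod_frac - lo) / (hi - lo);
}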
For a more apples-to-apples comparison, please use High Quality when testing Nvidia cards while forcing Full Trilinear Filtering on in ATI's drivers (though only for Direct3D games). This is as close to an exact image-quality match between Nvidia and ATI as it gets.
To ensure that Full Trilinear Filtering is on in OpenGL games with ATI cards, simply enable High Quality Anisotropic Filtering. ATI's High Quality AF setting forces full trilinear because of how that maximum-AF algorithm is implemented. If you feel that enabling High Quality in Nvidia's drivers would give ATI an unfair advantage, then simply enable ATI's High Quality AF while keeping Catalyst AI at Normal. There is no way to turn off Nvidia's hidden optimizations for certain games, so, as you already know, ATI's Catalyst AI should not be turned off either. Keep in mind that the AF setting should be left at Quality rather than Performance so that trilinear filtering is allowed.
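For reference, here is what "full trilinear plus AF" means at the API level; the driver control-panel settings being argued about simply override or degrade what an application requests with calls like these. This is only a sketch using the standard EXT_texture_filter_anisotropic extension, and the function name is mine:

#include <GL/gl.h>
#include <GL/glext.h>  /* GL_TEXTURE_MAX_ANISOTROPY_EXT */

/* Ask for full trilinear filtering plus maximum anisotropic filtering
   on one texture. A driver "optimization" quietly substitutes cheaper
   filtering for exactly these states. */
static void request_full_trilinear_af(GLuint texture)
{
    GLfloat max_aniso = 1.0f;

    glBindTexture(GL_TEXTURE_2D, texture);

    /* Trilinear: bilinear within a mip level, linear blend between levels. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* Anisotropic filtering on top, at the hardware maximum. */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max_aniso);
}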
Why give Nvidia a 10-40% advantage with its default Quality setting, which produces shimmering far worse than anything ATI's cards can be made to do, when you could instead give ATI a measly 2-5% advantage by changing Nvidia's setting to High Quality and leaving ATI's at default? If you are really on Nvidia's side, or if Nvidia has "bribed" you, then simply give Nvidia a 5-15% advantage by turning on ATI's High Quality AF while keeping Nvidia's setting at High Quality; that way you can be sure that FULL trilinear is applied to virtually all filtered textures on both ATI and Nvidia. That would be unfair to ATI, though, since ATI would then be offering unprecedented image quality over Nvidia's.
The solution, as you and I both know, is to change Nvidia's setting to High Quality while leaving ATI's at default (and forcing full trilinear on via the registry or the ATITool program for Direct3D games). This removes ATI's well-known advantage of up to 5% in some Direct3D games, BUT removing Nvidia's shimmering optimizations penalizes the GeForces by up to 40% in games that showcase lots of distant textures.
The evidence is here, although it comes from ATI:
It does not mention UT2004, which does indeed show a 40% penalty, as tested by independent users.
For the videos on Nvidia's shimmering, check:
Even enabling the two hidden optimizations in ATI's drivers (Trilinear Optimization and AF Optimization) AND setting Catalyst AI to High still does not produce the shimmering of Nvidia's undersampled AF at the Quality setting.
If you, Xbit Labs, do not agree with me, please rest assured that 3DCenter of Germany has the evidence in videos along with the suggestion: "All benchmarks using the standard setting for NV40 and G70 against the Radeon are invalid, because the Nvidia cards are using general undersampling which can (and does) result in texture shimmering."
That means only benchmarks run with the High Quality setting on Nvidia cards are valid, within a +/-5% margin instead of the 15-40% margin created by such an unfair (and ugly) advantage over ATI.
Here are the charts.
I had a thorough discussion with another person on this board about this, but I would rather not bring that up; in any case it is off-topic, so if he or she is watching, please excuse my friend's host. It deleted the zip twice because its scanner flagged it as a virus, leaving the 40 MB video up. In this link you can see the shimmering I was talking about.