
AMD vs Nvidia Image Quality

3.8K views 21 replies 11 participants last post by  rdr09  
#1 ·
I've never been a fanboy. I will buy whichever card I think is the better deal. However, it's been a long time since I've had an Nvidia card. Currently I'm running a 6850, and I thought it might be fun to get an Nvidia card next...

But I stumbled across a thread where several people were claiming that after they switched to Nvidia, they noticed the cards didn't display rich, vibrant color like AMD cards; that they looked dull and flat even after adjusting things in the control panel.

Is this true? Let's forget processing capabilities for a second: how do the two compare in actual image quality? I haven't seen an Nvidia card running in so long, I just don't know what to expect from one.

For the amount I'm willing to spend, it seems I'm looking at either a 380X 4GB or a 960 4GB. Considering what I read in that thread, with people stating that even the plain 380 compares to the 960, I think I'm likely getting the 380X, but I wanted to hear people's opinions.
 
#4 ·
Quote:
Originally Posted by PontiacGTX View Post

Why would you get a 960 over a 380/X if it's slower?

Also, there are better choices for used cards, like the R9 290s.
I already said I am likely to get the 380X for that very reason. However, I wanted to hear from experience how the cards compare visually. There are also differences in what the cards are capable of, such as that Nvidia AA, game effects, and that resolution thing.

And a used DX 11 card is not better to me than a new DX 12 card.
 
#5 ·
Quote:
Originally Posted by defhed View Post

There are also differences in what the cards are capable of, such as that Nvidia AA,
MFAA isn't better than MSAA or even SMAA.
Quote:
game effects, and that resolution thing.
PhysX is the only thing AMD cards can't do; AMD cards support most GameWorks features (Nvidia did that for a reason), and AMD also supports downsampling.
Quote:
And a used DX 11 card is not better to me than a new DX 12 card.
GCN already supports DX12. It isn't VLIW...
 
#6 ·
The less vibrant color thing with Nvidia is a complete non-issue. That rumor is based on the fact that Nvidia for the longest time defaulted to limited RGB format. Both AMD and Nvidia output the exact same colors when told to use full RGB range.

But there have been some differences in anisotropic filtering quality between the two. AMD used to be behind in that department during the HD 6000 series era, but they improved it when moving to GCN, and some say they made further improvements with Fiji. Personally, I have never seen the AF difference either; it's something you really have to know how to look for.

In summary: IQ is the same for all practical purposes.
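To show why limited-range output looks washed out on a full-range display, here's a minimal sketch (my own illustration, not code from either driver) of expanding an 8-bit limited-range value to full range:

```python
def limited_to_full(v: int) -> int:
    """Expand an 8-bit limited ("TV") range value to full ("PC") range.

    Limited range maps black to 16 and white to 235; full range uses 0-255.
    Values below 16 or above 235 are clamped after scaling.
    """
    scaled = (v - 16) * 255 / (235 - 16)
    return round(min(255.0, max(0.0, scaled)))
```

When the driver sends limited range but the monitor expects full range, no such expansion happens: black is displayed as RGB 16 and white as 235, which is exactly the dull, flat look people attributed to the card itself.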
 
#7 ·
Quote:
Originally Posted by specopsFI View Post

The less vibrant color thing with Nvidia is a complete non-issue. That rumor is based on the fact that Nvidia for the longest time defaulted to limited RGB format. Both AMD and Nvidia output the exact same colors when told to use full RGB range.
The limited range setting for Nvidia got fixed in 347.07, afaik.
Quote:
Originally Posted by specopsFI View Post

But there have been some differences in anisotropic filtering quality between the two. AMD used to be behind in that department during the HD 6000 series era, but they improved it when moving to GCN, and some say they made further improvements with Fiji. Personally, I have never seen the AF difference either; it's something you really have to know how to look for.

In summary: IQ is the same for all practical purposes.
Excuse me? AMD's AF is angle-independent. You cannot say the same for Nvidia (insofar as I've heard). The issues you recall were related to the default Nvidia "LOD clamp" setting that was missing in the HD 6000 series and got added in the GCN series drivers. You could literally change the LOD according to your AA setting and not have those issues. Meanwhile, I have yet to see any topic involving AF end in favour of Nvidia.
 
#8 ·
Quote:
Originally Posted by specopsFI View Post

The less vibrant color thing with Nvidia is a complete non-issue. That rumor is based on the fact that Nvidia for the longest time defaulted to limited RGB format. Both AMD and Nvidia output the exact same colors when told to use full RGB range.

But there have been some differences in anisotropic filtering quality between the two. AMD used to be behind in that department during the HD 6000 series era, but they improved it when moving to GCN, and some say they made further improvements with Fiji. Personally, I have never seen the AF difference either; it's something you really have to know how to look for.

In summary: IQ is the same for all practical purposes.
Limited RGB was only an issue over HDMI, IIRC, and only if you didn't change the default setting.

Using DVI/VGA/DP never had the issue.
 
#9 ·
Quote:
Originally Posted by mtcn77 View Post

The limited range setting for Nvidia got fixed in 347.07, afaik.
Excuse me? AMD's AF is angle-independent. You cannot say the same for Nvidia (insofar as I've heard). The issues you recall were related to the default Nvidia "LOD clamp" setting that was missing in the HD 6000 series and got added in the GCN series drivers. You could literally change the LOD according to your AA setting and not have those issues. Meanwhile, I have yet to see any topic involving AF end in favour of Nvidia.
Please don't quote me; you are happily on my block list, as I have no interest in arguing with you about anything. Even now I'm not following your argument, since I didn't make any claims about AMD's AF not being angle-independent or Nvidia being better. But if you want to see someone saying Nvidia was better compared to AMD's 6000 series, fine, there you go. And again, please don't quote me anymore. I can almost never understand the point you try to make in your posts, beyond always being against Nvidia and pro-AMD.
 
#10 ·
Quote:
Originally Posted by specopsFI View Post

Please don't quote me; you are happily on my block list, as I have no interest in arguing with you about anything. Even now I'm not following your argument, since I didn't make any claims about AMD's AF not being angle-independent or Nvidia being better. But if you want to see someone saying Nvidia was better compared to AMD's 6000 series, fine, there you go. And again, please don't quote me anymore. I can almost never understand the point you try to make in your posts, beyond always being against Nvidia and pro-AMD.
You said AMD was behind, how is that not saying Nvidia was better?
 
#11 ·
Quote:
Originally Posted by dmasteR View Post

Limited RGB was only an issue over HDMI, IIRC, and only if you didn't change the default setting.

Using DVI/VGA/DP never had the issue.
That's true, but the whole urban legend is based on exactly that: defaulting to limited RGB on HDMI. It was a stupid thing for Nvidia to do, and it took them far too long to fix, but it was never a problem for most people.
 
#12 ·
Quote:
Originally Posted by mtcn77 View Post

You said AMD was behind, how is that not saying Nvidia was better?
Read again:

"AMD used to be behind in that department during the 6000 series times but they improved it when moving to GCN and some say they made some further improvements with Fiji".

Oh, and: "In summary: IQ is the same for all practical purposes."

Now please, move along to argue with someone else.
 
#13 ·
Quote:
Originally Posted by specopsFI View Post

Read again:

"AMD used to be behind in that department during the 6000 series times but they improved it when moving to GCN and some say they made some further improvements with Fiji".

Oh, and: "In summary: IQ is the same for all practical purposes."

Now please, move along to argue with someone else.
Do you know the meaning of "Mipmap LOD"?
 
#15 ·
You can change colors to be as vibrant as you like in NVCP. That being said, get the 380X; the 960 is a pointless card IMO, even the 4GB one.
 
#17 ·
To be clear, I never agreed that contemporary IQ levels are equal. Nvidia dropped CSAA on the Maxwell series. Now, what was the tune back when AMD hadn't yet enabled it through the driver?
Quote:
AMD and NVIDIA have traditionally kept parity with AA modes, with both implementing DX9 SSAA with the previous generation of GPUs, and AMD catching up to NVIDIA by implementing Enhanced Quality AA (their version of NVIDIA's CSAA) with Cayman.
So, if AMD caught up and Nvidia has dropped its support for CSAA, that means Nvidia has fallen behind, doesn't it?
 
#18 ·
Quote:
Originally Posted by PontiacGTX View Post

GCN already supports DX12.it isnt VLIW...
I don't know what those acronyms stand for, but even if the 290s support DX12, I am going straight out on Friday when I get my next check and buying a new card to play Tomb Raider. Even though I wouldn't mind buying used for a faster card, I doubt there will be one for sale near me on Friday, and it's fun to buy new. The 380X will already murder my current card.
 
#19 ·
Quote:
Originally Posted by defhed View Post

I've never been a fanboy. I will buy whichever card I think is the better deal. However, it's been a long time since I've had an Nvidia card. Currently I'm running a 6850, and I thought it might be fun to get an Nvidia card next...

But I stumbled across a thread where several people were claiming that after they switched to Nvidia, they noticed the cards didn't display rich, vibrant color like AMD cards; that they looked dull and flat even after adjusting things in the control panel.

Is this true? Let's forget processing capabilities for a second: how do the two compare in actual image quality? I haven't seen an Nvidia card running in so long, I just don't know what to expect from one.

For the amount I'm willing to spend, it seems I'm looking at either a 380X 4GB or a 960 4GB. Considering what I read in that thread, with people stating that even the plain 380 compares to the 960, I think I'm likely getting the 380X, but I wanted to hear people's opinions.
No. There should be minimal difference between the video cards. The D3D and OpenGL specifications define precision requirements; for D3D, it's within a few ULP at 32-bit. Thus, while the images won't be identical, they should differ by less than a visible threshold. Additionally, the sRGB curve is well defined, so the output to the display should also have minimal variance. In short, color standards and API precision standards are not only well established but actively verified by companies like Microsoft.

We've actually used both NV and AMD cards simultaneously alternating frames and the delta between the images is so small as to not be noticeable.
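The point about the sRGB curve being well defined can be made concrete. This is a sketch of the standard piecewise sRGB encoding function (the constants come from the sRGB standard, IEC 61966-2-1); since the spec pins the math down, any conformant GPU produces the same encoded value for the same linear input:

```python
def srgb_encode(linear: float) -> float:
    """Encode a linear-light value in [0.0, 1.0] with the sRGB transfer curve.

    The curve is a short linear segment near black joined to a 1/2.4
    power-law segment; both pieces and their crossover point are fixed
    by the standard, leaving vendors no room to diverge.
    """
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055
```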
 
#21 ·
Quote:
Originally Posted by specopsFI View Post

That's true, but the whole urban legend is based on exactly that: defaulting to limited RGB on HDMI. It was a stupid thing for Nvidia to do, and it took them far too long to fix, but it was never a problem for most people.
There was even a bit of time after they "fixed it" that it didn't give full RGB. I forget who, but an artist I follow was complaining that he had to do a registry edit to fix it. I don't know the situation these days, but I assume it's been fixed completely (I hope).