Overclock.net banner

1 - 20 of 23 Posts

·
Registered
Joined
·
1,002 Posts
Discussion Starter #1 (Edited)
I'm getting sick of dealing with this GTX 760 2GB. It's a terrible clocker and it's VRAM limited in almost everything. I'd like to spend at most $300 (+/- $25 for shipping etc.) on a decent GPU. It's going in my sig rig, so it'll be bottlenecked a little for the time being, but I plan on upgrading the MOBO/CPU/RAM by the end of the year.

What would OCN recommend? I'm not biased on branding; AMD or NVIDIA, my rig will run either.

Rig:

i7 2600 (Non K, +4 bin OC)
4x4GB 1866MHz
P8P67 Pro Rev 3.1
GTX 760 2GB
Windows 10
Corsair CX650M PSU
 

·
9 Cans of Ravioli
Joined
·
19,788 Posts
It's going in my sig rig
you see, the new fantastic OCN doesn't have that :rolleyes:

GTX 1060 6GB is a good buy. RX 580 is actually starting to come down in price (but still not at MSRP) and depending on the game you look at, is either a bit slower or a bit faster than the 1060.
 

·
Registered
Joined
·
1,002 Posts
Discussion Starter #3
you see, the new fantastic OCN doesn't have that :rolleyes:

GTX 1060 6GB is a good buy. RX 580 is actually starting to come down in price (but still not at MSRP) and depending on the game you look at, is either a bit slower or a bit faster than the 1060.
Yeah, I just saw that... I haven't logged in for a while now. That sucks. I updated.
 

·
Avid Memer
Joined
·
5,953 Posts
The GTX 1060 6GB is going to fit your situation the best. Your i7-2600 also won't be much of a bottleneck.
 

·
The Challenger
Joined
·
2,279 Posts
RX 580, if you're planning to upgrade your monitor anytime soon, considering FreeSync. Or an RX 570/480/470, whichever you can get your hands on at a decent price.
 

·
Avid Memer
Joined
·
5,953 Posts
G-Sync monitors also exist. There was no mention of a monitor upgrade in the original post so I think it's safe to assume there isn't one in the plans, at least not for this year.
 

·
The Challenger
Joined
·
2,279 Posts
G-Sync monitors also exist. There was no mention of a monitor upgrade in the original post so I think it's safe to assume there isn't one in the plans, at least not for this year.
Considering that G-sync monitors are on average ~$200 more expensive, you're better off buying a $500 GPU rather than a $300 one and skipping the variable framerate monitor.

Say what you will, but even TVs are adopting FreeSync now. There is nothing wrong with putting this forward as an option, and let the thread starter decide. The RX 580 and GTX 1060 6GB are practically equal in performance on average anyway. Also, considering the age of his card, he likely tends to keep his cards around for longer, which is another argument for going for AMD rather than nVidia, considering AMD is obviously superior at the newer APIs like Vulkan and DX12.

What is it with people trying to censor AMD options? The mind share is real.
 

·
Avid Memer
Joined
·
5,953 Posts
Considering that G-sync monitors are on average ~$200 more expensive, you're better off buying a $500 GPU rather than a $300 one and skipping the variable framerate monitor.

Say what you will, but even TVs are adopting FreeSync now. There is nothing wrong with putting this forward as an option, and let the thread starter decide. The RX 580 and GTX 1060 6GB are practically equal in performance on average anyway. Also, considering the age of his card, he likely tends to keep his cards around for longer, which is another argument for going for AMD rather than nVidia, considering AMD is obviously superior at the newer APIs like Vulkan and DX12.

What is it with people trying to censor AMD options? The mind share is real.
I'm not trying to censor anything. You brought up AMD's adaptive refresh rate technology and casually omitted the Nvidia equivalent. Better performance with Vulkan or DX12 only matters if the games you play use those APIs. Currently those are the exception and not the rule. I will almost always advocate for the video card that provides the best performance per watt.
 

·
The Challenger
Joined
·
2,279 Posts
Performance per watt is not the be-all and end-all. The decision should rest on the thread starter's priorities, not your own.

If all he cares about is performance per watt, the GTX 1060 6GB is the better choice. Considering his PSU, a 50W+ higher power consumption means nothing. So unless price is an issue (the RX 580 has been priced a lot higher), there really shouldn't be a bias between one or the other.

I'll let this video do the talking...

 

·
Avid Memer
Joined
·
5,953 Posts
I'm sorry if you think I have a prejudice against AMD/ATI. I do not. I guess I will have to reiterate what I said and break it down for you since you didn't understand me the first time.

I will almost always advocate for the video card that provides the best performance per watt.

In case you missed it, I don't always go with the most efficient card. If someone already owns a FreeSync monitor, I would be more inclined to suggest an AMD card. If someone's monitor lacks adaptive refresh rate technology, then I don't see much reason to go with a card that's going to consume more power and produce more heat while delivering equivalent performance. If someone plays games where one card consistently outperforms the other at a given resolution, I would recommend that card.

Now revisit the original post. There is no talk of a monitor with adaptive refresh rate technology. There's no talk of primarily playing games that utilize Vulkan or DX12. I personally don't think it's safe to assume someone is playing games utilizing these APIs when they represent such a small portion of the games available on the market today.

As you said yourself, the RX 580 has been priced higher than the GTX 1060 6GB. If the two were similar in price, then flip a coin. I will still have a preference for the one that has better performance per watt.
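To be concrete about what I mean by "performance per watt": it's just average FPS divided by board power. A quick sketch below, with completely made-up, purely illustrative numbers (not real benchmark data for either card):

```python
# "Performance per watt" boiled down: average FPS divided by board power.
# These numbers are hypothetical, for illustration only.
cards = {
    "Card A": {"avg_fps": 60.0, "board_power_w": 120.0},
    "Card B": {"avg_fps": 62.0, "board_power_w": 185.0},
}

def perf_per_watt(avg_fps, board_power_w):
    """Frames per second delivered per watt drawn by the board."""
    return avg_fps / board_power_w

# Pick the more efficient card, ignoring absolute performance.
most_efficient = max(cards, key=lambda name: perf_per_watt(**cards[name]))
```

With numbers like these, the slightly faster card can still lose badly on efficiency, which is the whole point of the metric.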
 

·
Registered
Joined
·
1,002 Posts
Discussion Starter #13
Thanks everyone!

I should have added this, but I'm not too concerned about power usage. I don't pay much per kWh, so it's not a deciding factor. Don't get me wrong though, if I can save power (and heat) I'm all for it. But I would take performance over power usage in this case.
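For anyone curious how little the power difference actually costs: it's just extra watts times hours times your rate. A rough sketch with hypothetical inputs (50W extra draw, 4 hours of gaming a day, $0.12/kWh):

```python
# Rough annual cost of a GPU's extra power draw.
# All inputs below are hypothetical examples, not measured figures.
def annual_power_cost_usd(extra_watts, hours_per_day, usd_per_kwh):
    """Extra dollars per year for drawing `extra_watts` more while gaming."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

cost = annual_power_cost_usd(50, 4, 0.12)  # under $10/year at these rates
```

At those assumed rates it comes out to about $8.76 a year, so "not a deciding factor" checks out.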

The RX 580 having 8GB does tempt me, but I have a feeling it's going to be like the old days, when you would run out of GPU oomph before you ever used all the VRAM. But it does seem like a lot of newer titles love the VRAM.

As far as the monitor goes (or TV in my case, lol), I don't have any plans to upgrade yet. I'm pretty much at GPU > CPU/MOBO/RAM > Monitor as of right now.
 

·
Registered
Joined
·
2,302 Posts
Are you open to used GPUs? If you're okay with used you'll have a lot more options, as nice 6GB 1060s are still $300+.

A used 570/580 would be faster, but you'll need to be sure you buy it from a reputable source, as it may have been horsed previously.
 

·
The Challenger
Joined
·
2,279 Posts
I'm sorry if you think I have a prejudice against AMD/ATI. I do not. I guess I will have to reiterate what I said and break it down for you since you didn't understand me the first time.

I will almost always advocate for the video card that provides the best performance per watt.

In case you missed it, I don't always go with the most efficient card. If someone already owns a FreeSync monitor, I would be more inclined to suggest an AMD card. If someone's monitor lacks adaptive refresh rate technology, then I don't see much reason to go with a card that's going to consume more power and produce more heat while delivering equivalent performance. If someone plays games where one card consistently outperforms the other at a given resolution, I would recommend that card.

Now revisit the original post. There is no talk of a monitor with adaptive refresh rate technology. There's no talk of primarily playing games that utilize Vulkan or DX12. I personally don't think it's safe to assume someone is playing games utilizing these APIs when they represent such a small portion of the games available on the market today.

As you said yourself, the RX 580 has been priced higher than the GTX 1060 6GB. If the two were similar in price, then flip a coin. I will still have a preference for the one that has better performance per watt.
Fair enough.
Regarding the newer APIs, the video I posted already shows the gap between the GTX 1060 and the RX 480 narrowing. It's not only the newer APIs, but also the way games are programmed. The AMD cards are more likely to have a longer life in them than the nVidia cards in terms of long-term performance. If the thread starter upgrades every 2-3 years, this doesn't matter, but if it's every 5+ years, the AMD card is the safer bet performance-wise. Branding will determine reliability, but that's a whole other story.

As for there being no talk in the original post about a monitor with adaptive refresh rate, there doesn't need to be, because not everyone is aware the technology exists. That is why I viewed your post as censoring: refraining from discussing something just because it wasn't mentioned in the first place doesn't make sense. We can't assume that everyone who asks a question knows the whole current market. In fact, if they did, they likely wouldn't be asking what to go for in the first place. So if they don't mention something, that's more reason to bring it up, because it might be something they would be interested in if they knew about it.

Thanks everyone!

I should have added this, but I'm not too concerned about power usage. I don't pay much per kWh, so it's not a deciding factor. Don't get me wrong though, if I can save power (and heat) I'm all for it. But I would take performance over power usage in this case.

The RX580 having 8GB does tempt me, but I have a feeling it's going to be like the old days, where you would run out of GPU oomph before you ever used all the vram. But it does seem like a lot of newer titles love the vram.

As far as the monitor (or TV in my case lol). I don't have any plans to upgrade yet. I'm pretty much at GPU > CPU/MOBO/RAM > Monitor as of right now.
6GB is generally enough at this point in time. Additional VRAM helps with minimum framerates, but you won't really see a difference between 6GB and 8GB. Are you planning to use the card you buy now with the monitor you eventually upgrade to? What timeframe is that? The 'slower' your card becomes over time in modern games, the more important a feature like FreeSync becomes.
 

·
Avid Memer
Joined
·
5,953 Posts
The only thing that would concern me with the lifespan of an Nvidia card is driver support. AMD keeps rebranding their GPUs, but they typically have slower driver revisions.
 

·
Registered
Joined
·
1,233 Posts
As someone who just switched from an Asus ROG Strix 1060 6GB to an Asus ROG Strix RX 580 8GB, AMD is better hands down. The out-of-the-box experience is night and day. The color is better and the text is much crisper. Yeah, you could calibrate the GTX for the same result, but just like G-Sync, it's costing you more out the door if you don't have the equipment. I believe after driver updates and refinements the RX 580 may have the GTX beat now. Also, I will add that if cooling is a top priority, the 1060 6GB is a monster. I don't know if other AIBs have this feature, but under 50C or so the fans are off. This card was passively cooled most of its life and rarely did the fans kick in even during gaming. Cool as an iceberg.
 


·
Robotic Chemist
Joined
·
3,006 Posts
The color is better and the text is much crisper.
I hear this from time to time. It doesn't make any sense to me; does AMD do some color processing and text sharpening in drivers or something? Are you sure it isn't placebo?

Nvidia used to render games using lower bit-depth color internally, so ATI did look better, but it has been many generations since that was true. I think the Internet is keeping this past truth alive as a myth about differences between their current GPUs.

None of my tests using sharpness test patterns and objective measurements (i1 Pro 2 and i1 Display Pro with Calman) have shown sharpness or color differences.
 

·
Registered
Joined
·
1,233 Posts
Before you touch any settings in the GPU menu, monitor, or Windows: AMD uses a brighter, more saturated default template. It makes everything look amazing and fresh out of the Willy Wonka candy factory. Nvidia uses an unbrightened, muted color default. I can't explain the text. To me, even after adjusting the font and scaling in Windows on Nvidia, it was never very pleasant to the eye; AMD's text is just very crisp and very readable at default.
 

·
Robotic Chemist
Joined
·
3,006 Posts
You mean when Photoshop asks an AMD GPU to output the RGB values 255-160-122 the monitor does not receive 255-160-122? That would not be good.

Also, this does not appear to be true based on my testing but I do turn off all driver video "enhancements" as a matter of course, with any GPU.

About text, have you ever tried to do a more rigorous comparison, like with a video sharpness test pattern? Do you think AMD is running a sharpening shader on the GPU output or something? Text rendering is done by Windows (ClearType), which generates real, exact RGB values for every pixel. Using AMD or Nvidia GPUs does not change how ClearType works.

Video on a computer is not a nebulous thing. Each pixel has a specific set of values that should be sent via the HDMI or DP cable. You can plug the cable into a capture card and measure what they are. Arbitrarily messing with the pixels' RGB values is not acceptable for anyone who does any professional work with images. AMD is not going to do something that breaks a Photoshop workflow on their GPUs, and neither is Nvidia.
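If anyone wants to try this at home without a capture card: dump a lossless screenshot of the same static test pattern from each GPU, decode both to flat lists of (R, G, B) tuples however you like, and compare them pixel by pixel. The helper below is a hypothetical sketch of that last comparison step, not any particular tool:

```python
def max_channel_delta(frame_a, frame_b):
    """Largest per-channel difference between two frames, each given as a
    list of (R, G, B) tuples of equal length. A result of 0 means the two
    sources produced bit-identical output."""
    if len(frame_a) != len(frame_b):
        raise ValueError("frames differ in resolution")
    return max(
        abs(ch_a - ch_b)
        for px_a, px_b in zip(frame_a, frame_b)
        for ch_a, ch_b in zip(px_a, px_b)
    )
```

If the two GPUs really sent different colors for the same content, this number would be nonzero; in my testing it isn't.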

There is a robust HTPC community that is very particular and studies different options very closely, sort of like audiophiles but for digital video, and this AMD > Nvidia for image quality is not a thing in those circles. AMD is not worse either, they both generate the same output when the video playback chain is well controlled.

Picking an AMD GPU because they default the driver "saturation" adjustment a bit higher seems rather pointless to me (especially since you should really turn all that crap off anyway).
 