Originally Posted by Alwrath
So I did some testing at 4K with my Radeon R9 290 overclocked to 1100 MHz core / 6 GHz effective GDDR5 memory, and was getting mid-50s to 60 FPS with drops into the 40s, while using only a little over 3GB of VRAM on the card. How, you ask? Vulkan, with all settings on ultra except: shadows low, virtual texturing page size high, depth of field off, depth of field anti-aliasing off, motion blur on low (who uses this crap anyway?), self shadow off, film grain off.
The biggest difference is shadows, every time. Shadows are the most demanding effect to render in most titles; turn them off (or down to low) and watch your old GPU become a superhero. Who needs a GeForce GTX 1080 anyway? (Haha, I'm joking.)
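Summarizing the quoted profile in one place (the key names below are just illustrative labels, not Doom's actual settings-menu strings or cvars):

```python
# Hypothetical summary of the quoted Doom (Vulkan) 4K profile.
# Keys are illustrative labels, not the game's real setting names.
doom_4k_profile = {
    "renderer": "Vulkan",
    "overall_quality": "ultra",       # baseline for everything not listed
    "shadows": "low",                 # the biggest single FPS win
    "virtual_texturing_page_size": "high",
    "depth_of_field": "off",
    "depth_of_field_aa": "off",
    "motion_blur": "low",
    "self_shadowing": "off",
    "film_grain": "off",
}
```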
This is great for you! A lot of users don't realize that many modern graphics effects simply consume resources while offering little to no noticeable improvement in image quality. Maybe it's because I'm getting older and my vision is bad, but I can't tell the difference between having many of these options on or off, other than a lower framerate and more heat. I don't run 4K, but I run 5760x1080 Eyefinity, which is only about 2 million pixels fewer. I have two R9 Furys, which is quite a bit more powerful than a single R9 290, but Doom doesn't support Crossfire and neither do most other recent games. Even in games that DO support Crossfire, I still turn quite a few options off, which improves frame rate (particularly the minimums), puts less stress on the cards, and in turn produces less heat.
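For reference, the pixel math behind that comparison (a quick sketch):

```python
# Pixel counts for the two resolutions discussed above.
resolutions = {
    "4K UHD":        (3840, 2160),
    "Eyefinity 3x1": (5760, 1080),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.2f} megapixels")

# 4K UHD:        8.29 megapixels
# Eyefinity 3x1: 6.22 megapixels  -> roughly 2 million fewer per frame
```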
I usually turn all post-processing off, set shadows to medium or low (depending on the game), turn off SSAO, turn off motion blur, and use FXAA. In addition, I disable tessellation, turn down AF, and turn on V-Sync both in the drivers and in the game. To me, the games still look great, and they run much more fluidly with less heat.
Try playing around with V-Sync, Power Efficiency mode (a.k.a. Radeon Chill), Frame Rate Target Control, and Frame Pacing. I turn all of these on and still maintain 60fps in every game I have, with the cards running very cool at low fan speeds. In both recent Tomb Raider titles I hold 60fps with no stutter, and my cards top out at 45C and 40C respectively (and they're in Crossfire!). Do note, however, that in some games you will need Power Efficiency OFF or you'll get bad stutter; Dragon Age: Inquisition seems to be the worst offender. Hope these tips help.
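For anyone wondering what a frame rate cap actually buys you, here's a toy sketch of the idea (illustrative only, not AMD's actual implementation):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS   # ~16.7 ms per frame

def run_limited(render_frame, frames=300):
    """Toy frame limiter: render, then sleep off the unused budget.

    Conceptually this is what a driver-level frame rate cap does --
    instead of letting the GPU race to 100+ fps, it idles between
    frames, which is where the lower temperatures and fan speeds
    described above come from.
    """
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        spare = FRAME_BUDGET - (time.perf_counter() - start)
        if spare > 0:
            time.sleep(spare)   # the GPU (and CPU) rest here

# Example: a stand-in "frame" that takes ~5 ms to render.
run_limited(lambda: time.sleep(0.005), frames=60)
```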
Originally Posted by unseen0
Lately there has been a lot of stir on the forums about Nvidia's shenanigans with releasing slightly faster cards in a short period of time. Some users feel as if they are being milked.
At the same time, AMD is getting roasted for "only" having the RX 480/580 as their best option.
In reality, if you turn down a few settings, an RX 580 is gonna do great, even at 4K, and you won't even notice the difference in graphical quality.
When you are immersed in gameplay, you're not really gonna be paying attention to how amazing the AA filter looks on the shadows in the distance. If you do, that kinda says more about you, or about how boring the game is.
So do we really need that 1080 Ti or the upcoming 2080/Ti? No, we actually don't. We (not me, us in general) are just so spoiled with 4K, 144Hz, and high frame rates that we can't bear to go back.
Personally, I'm glad I opted out of getting the high-end stuff. I simply don't see the point in dropping thousands of dollars on hardware that will add 5% quality to my gaming experience, as opposed to a $700 build.
Sorry if this was a bit off topic. Just wanted to share my thoughts, since your initial post shows that you don't need to throw tons of money at your system to game properly.
EDIT - Just wanted to add, I am in no way meaning to criticize those who do spend a lot on their systems, nor am I saying it's bad/stupid or similar. To each their own!
Actually, I would have to disagree: right now a used R9 Fury is the best option if you want to go AMD at high resolutions. You can get one of my cards (with a supremely excellent cooler) for about $225 USD on eBay. It is still much faster than the RX 480 or 580, particularly at resolutions above 1440p.
I paid $600 for my Crossfire pair at the end of last year (they were new in box), when the GTX 1080 was $700 at the time. And certainly, anyone who isn't a braindead, rabid Nvidia fan would tell you that a pair of Furys smokes a GTX 1080, at least in games that support Crossfire.
With the recent driver improvements from AMD squeezing yet more performance out of these cards, I am anywhere from 80% to 95% of the way to an overclocked GTX 1080 Ti in many benchmarks, and in rare games and cases actually ahead. For far less money.
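To put the numbers from this thread side by side (just the arithmetic, using only the prices quoted above):

```python
# Cost comparison using only the prices quoted in this thread.
used_fury_each = 225      # used R9 Fury on eBay (quoted above)
new_pair_paid = 600       # paid for the new-in-box Crossfire pair
gtx_1080_then = 700       # GTX 1080 price at the time of purchase

used_pair = 2 * used_fury_each
print(f"Used Fury pair:   ${used_pair}")                      # $450
print(f"New Fury pair:    ${new_pair_paid}")                  # $600
print(f"GTX 1080 (then):  ${gtx_1080_then}")                  # $700
print(f"Saved vs 1080:    ${gtx_1080_then - new_pair_paid}")  # $100
```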
I'm happy with my setup now and don't plan on upgrading until 2019; besides, I have a long backlog of older titles to play, all of which run flawlessly at 60fps ("maxed out" if I so choose) on this setup. So until this thing can't play some recent game I really want to play (which is rare, because I'm more or less a hipster and don't play many popular games or play online), I am satisfied. I am pretty certain I will run into VRAM issues with my Furys (4GB) long before the actual chips become irrelevant due to a lack of processing resources. If the Fury had 8GB of GDDR5 instead of 4GB of HBM, it would remain relevant for years to come.
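To see why 4GB gets tight at high resolutions, here's a rough back-of-the-envelope sketch; the render-target counts and formats are illustrative assumptions, not Doom's actual pipeline:

```python
# Raw render-target sizes alone add up fast at 4K, before a single
# texture is counted. Target counts/formats below are assumptions.
width, height = 3840, 2160
bytes_per_pixel = 4                      # e.g. one 32-bit RGBA target

targets = {
    "back buffer (double-buffered)": 2,
    "HDR scene color (64-bit)": 2,       # counts as two 32-bit targets
    "depth/stencil": 1,
    "intermediate/G-buffer targets": 4,
}

total = sum(targets.values())
mb = width * height * bytes_per_pixel * total / 2**20
print(f"{total} render targets at 4K ≈ {mb:.0f} MB")   # ~285 MB

# Textures, shadow maps, and geometry fill the rest -- which is how
# the 3+ GB figure earlier in the thread happens even with several
# settings turned down.
```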