Originally Posted by Particle
If you have infinite money, you're an exception and that's fine. I don't generally assume that to be the case when giving advice to people though since for most of us there is some degree of significance to the benefit we're receiving in return for our expenditure.
All I know is that I never had any real success with four different generations of CrossFire. I had multiple GPUs from the 4000 (quad 4850), 5000 (dual 5850), 6000 (quad 6970), and 7000 (dual 270X) families. Never were the gains consistent and beneficial in the games I played. In those seemingly few situations where my framerate did actually increase versus a single GPU, frame timing issues meant there was no perceivable improvement to fluidity. It was odd getting 50-60 fps in Battlefield games, for instance, where the video was still choppy (pacing I mean--I don't care about screen tearing) and laggy (i.e., mouse movements are frustrating) enough to feel the same as when I was getting 25-30 fps on a single card. I was just throwing away money both on hardware and power for no real benefit. I saw enough other people report similar experiences throughout the years to eventually give up on multi-GPU. I had held out hope for half a decade with the ever-present carrot of "it'll work perfectly on the next generation of hardware or the next major driver update, just you wait" never coming to pass.
That's my take on multi-GPU at least. Your experiences have been different from the sound of it, and I'm glad it has worked well for you in the past. I'm not sure how common that is. It's certainly not something I'd recommend to people who don't fully appreciate the potential pain they're signing up for. That is especially the case now where developer support has been on the decline for years and the future looks to be one where traditional multi-GPU on both sides will eventually cease to exist. Things may change once the major engine developers implement agnostic multi-GPU utilization on DX12 and Vulkan code paths, but I've been burned too many times by the promises of future technologies to make recommendations based on it.
I dealt with a lot of the problems you're referring to, or at least the ones I believe you're referring to.
The first step was making sure I had plenty of power.
The second issue was tearing, which vsync solved, but that in turn kept the cards from hitting full clocks (an artificial throttle), which resulted in lower frame rates.
At that point I grabbed a 144 Hz FreeSync panel, which let me get the cards back to full clocks while also solving the tearing issue, but that created yet another problem.
Thermal throttling, which in turn lowered clocks and reduced frame rates. So I water-cooled both 290X cards with full-cover blocks, which gave great GPU/VRM temps.
At that point all issues were solved, except that I was up against a 4 GB memory limitation and could only ask so much of the cards texture-wise.
The Fury X is no different and responds the same way, though its HBM is more forgiving in the eye-candy department: in my experience, 4 GB of HBM behaves like roughly 6 GB of GDDR5.
As far as drivers go, I have had zero issues, but then I am not an early adopter; as with operating systems, I mostly stick to hardware that has matured.
For what it's worth, I ran two 270Xs on air. The memory limitation vastly degrades the CrossFire experience and is a rather significant bottleneck.
On a side note, even though overall score is all that matters to most people, I'd point out that despite the full dual x16 lanes, the higher core count, and a rather well-tuned system, my game test results on this platform have yet to catch my X370 results.
Put simply: if you ditched X370 solely to game on X399, you made a poor decision.
If you grabbed X399 to build a workstation that is "good enough" to game on and can serve as an all-in-one system, you made the right decision.

Edited by chew* - 9/20/17 at 1:48pm