Originally Posted by pioneerisloud;14416770
A 3GB GTX580 wouldn't hurt a single thing for future proofing. Who's to say that another big name game next year won't utilize that much VRAM? Who's to say the OP won't buy 2 more monitors while he has that card, and SLI's it?
For a single monitor, you obviously have gone for a pair of AMD cards. That's great. However, for a problem FREE experience (micro stutter, driver issues, heat issues, games not using Crossfire / SLI, etc.), I think it's ALWAYS a smart idea to grab the strongest single GPU card you can buy, and then dual it up later if you feel you need to.
I agree, the choice is quite clear. Buy 2 6950's now, and have absolutely no upgrade path, and tons of issues. Or go with 1x 580, and game to your heart's content.
And have no upgrade path? With due respect, that's absurd. If anything, opting for a pair of 6950s starts you higher up the "upgrade path". It's as easy to verify as a quick Google search (or the links already posted throughout this thread). Strictly on performance, it's an irrefutable win for the CrossFire Cayman XTs.
As for the "problem FREE" declaration about the single GPU experience: that's an assertion refuted by recent releases from both camps. Dragon Age 2 ran disastrously on Nvidia hardware at launch, after all. Was that not a problem? Or was the driver released to address that abysmal performance not a fix for one? Right behind that fallacious claim (single GPU = perfect, dual GPU = fail) comes the oft-repeated, ad nauseam claim that Crossfire is buggy, plagued with glitches, poorly supported, and sluggishly updated. Absolute rubbish. The transgressions of previous generations have apparently marched right along into the Cayman architecture, in spite of widespread & consistent reporting to the contrary! Just ask a GTX 580 user.
Other sources, however (corroborating my extensive hands-on experience), testify that the Crossfire system was extensively redesigned back in Catalyst 10.2 & was darn near perfected by Cayman's release.
Originally Posted by Dave Baumann
Crossfire was rearchitected some time ago (in fact, the pretty cool scaling you see now on high end GPUs owes a lot to that rearchitecture):
http://www.pcper.com/reviews/Graphic...-display-users
http://news.softpedia.com/news/AMD-C...w-135289.shtml
Another critical update that will be enabled in the Catalyst 10.2 is the rearchitecture of CrossFire. According to the chip maker, the new driver will see some of the CrossFire code moved from the 3D driver to a separate driver component, in an attempt to prepare the Catalyst Software Suite for future AMD products, namely the much-anticipated Fusion products, such as the 2011-bound 'Llano' APU. In addition, the separate driver will also enable users to mix and match ATI graphics cards from different generations.
I own over 100 titles, & virtually every "C"-or-higher release from the last two years. I run games in 3x1 & 5x1 portrait Eyefinity with four Cayman Pro GPUs (on two PCBs). Last generation, I ran dual 5970s for weeks in freshly minted ATI Eyefinity... before selling them in disgust at a hefty loss. Replacing them with triple 5870s helped salvage the filthy taste of excrement from my palate; improved as that setup was, micro-stutter, incompatibility, poor scaling, et al. were side effects with which I was, unfortunately, quite familiar. One of every four (or five) titles suffered from some such problem, though time (with AMD's adoption of CAPs and various other optimizations) brought gradual improvement.
This generation is a new generation. I've encountered micro-stutter twice, both times in canned benchmark testing, and both instances were alleviated within months via CAP & driver updates. I've had game issues twice as well, the first being Crysis 2. The glitchy behavior was met head-on by the AMD driver department with a flurry of CAPs that, after a few tries, sorted out the flicker & the poor scaling in DX9; those CAPs started landing within days of release, & everything was fixed in entirety within weeks. Granted, with the 1.9 Crysis 2 "patch", performance has been harder to sort out so hastily. I'd put the blame squarely on Crytek, really, considering the engine's bizarre "tessellate the ocean in every frame even when no water is visible" rendering choice. With the current driver, nevertheless, I can run everything @ Ultra detail at 6.8 megapixels.
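For anyone wanting to sanity-check the megapixel figures thrown around for Eyefinity setups, the arithmetic is simple (panel resolutions below are my assumption for illustration; the post doesn't state which panels are in use):

```python
# Quick sketch: total megapixels for a grid of identical panels.
# Panel resolutions here are assumptions, not stated in the post.
def megapixels(cols, rows, width, height):
    """Total megapixels for a cols x rows grid of width x height panels."""
    return cols * rows * width * height / 1e6

# 3x1 landscape of assumed 1920x1200 panels (5760x1200 total)
print(round(megapixels(3, 1, 1920, 1200), 1))  # 6.9

# 5x1 portrait of assumed 1920x1200 panels rotated to 1200x1920
print(round(megapixels(5, 1, 1200, 1920), 1))  # 11.5
```

A 3x1 spread of 1920x1200 panels lands right around the quoted ~6.8 megapixel ballpark once bezel compensation is factored in.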
The other title, The Witcher 2, saw its problem remedied within weeks, & it was absolutely freaking worth it. It takes some real brass to crank that game up maximally (no single GPU will do, really), & waiting a week or two for the driver to catch up is a small sacrifice considering the result.
I play virtually everything at maximum IQ, glitch free, enjoying brilliant scaling @ all resolutions (single, 3x1, & 5x1 alike), with an average of 3 or 4 CAPs released every single month. It's not the same as ye olde 4870X2, folks. The scaling, compatibility, & support are freaking unreal... whether with 2 GPUs or all the way up to 4. I mention that last bit to address the "no upgrade path" notion once more. Of course there is one. And unlike previous, imperfect iterations, further upgrades are anything but a waste.