Originally Posted by zealord
Is there honestly anyone thinking of sidegrading from a 780/Titan to a 290X?
Originally Posted by malmental
Price wise it's a Titan killer, performance wise it is not. And compared to the 780 it's a cheaper competitor.. That is all.
Sidegrade is correct term as per performance..
Whether you decide to accept the facts in front of you or not, the 290X flat out beats the 780. All things considered, one could argue going from a Titan to 290X is overall a downgrade because despite pumping out the same FPS in 3D games/applications, you do lose 2GB of VRAM and double precision compute performance. When talking gaming performance, a Titan to a 290X is a true sidegrade. But versus the 780, the 290X is a pure upgrade in every way, even if only a slight one.
The 290X is not simply a cheaper competitor. It objectively beats the GTX 780, without Mantle, for $100 less, and that's the end of the story. I'm sorry if that gives you buyer's remorse over your 780, but the reality is that AMD just laid a smackdown on Nvidia.
Nvidia fanboys keep reaching for arguing points, so they point to noise, temps, and power consumption. First off, power consumption means literally nothing to me (and it usually means nothing to the people who bring it up either; they're just looking for ammunition). It's the weakest arguing point. Temps and noise? Okay, AMD's reference blower cooler is complete garbage, no denying that, and Nvidia's 700-series reference cooler is awesome. But when custom 290Xs with aftermarket coolers are released, they will most likely run at the same temperatures and noise levels as non-reference 780s. Arguing about the temps of reference cards is pointless anyway, since the people who buy reference cards are a minority, and most of them intend to water cool the card. The same goes for OC headroom: custom/watercooled 290Xs will likely overclock just as well as custom/watercooled 780s.
Oh yeah, and then there's Mantle. Don't forget there are currently 16 Frostbite 3 games in development, and all of them will use Mantle. It's not just a BF4 card.
Now, for the 780 Ti... I'm excited for Nvidia's response. If they can make the 780 Ti beat both the Titan and the 290X in games for $650, I'll probably go for it, simply for G-Sync. Since I'm staying put at 1080p, I don't exactly need the 4GB + 512-bit bus, and I'd rather have a card whose production cost went more toward cores and architecture than the frame buffer; 3GB + 384-bit will be more than enough at 1080p. For me, frame rate + high settings > high resolution.