Originally Posted by 47 Knucklehead
My GTX 580's were the 3GB version. They were (and are; I recently moved them over to my wife's rig, "Switched Switch")
the EVGA Hydro Copper 2 cards ... they came with built-in water blocks, so yes, they run cool. They were very powerful; just one of them pretty much outperformed the two 4870x2's it replaced. The 4870x2 used 350 watts vs. 244 for the GTX 580, and the GTX 580 has a pixel rate of 37,056 Mpixels/s vs. the 4870x2's 24,000 ... roughly half again as much. So going from two HD 4870x2's that sucked up 700 watts of power ... whether you are under water or not ... did 48,000 Mpixels/s, and took up four slots, to one card that took up only one slot, consumed only 244 watts, and did 37,000 Mpixels/s was pretty logical. Not only did my electric meter spin more slowly, my room was much cooler, and I got nearly the same performance with ONE GTX 580. 18 months later, I added a second card. Not to mention, at the time (and still to this day), the GTX 580 absolutely SMOKED the 4870x2 at Folding@Home.
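For anyone who wants to check the math, here's a quick sketch using the figures quoted above (per-card wattage and fill rate as stated in the post, not independently re-measured):

```python
# Efficiency comparison using the numbers quoted in the post.
def mpixels_per_watt(mpixels: float, watts: float) -> float:
    """Fill-rate efficiency: Mpixels/s delivered per watt consumed."""
    return mpixels / watts

# Old setup: two HD 4870x2 cards (quoted at 350 W and 24,000 Mpixels/s each)
old_watts = 2 * 350       # 700 W total
old_fill = 2 * 24_000     # 48,000 Mpixels/s total

# New setup: one GTX 580 (quoted at 244 W and 37,056 Mpixels/s)
new_watts = 244
new_fill = 37_056

print(round(mpixels_per_watt(old_fill, old_watts), 1))  # ≈ 68.6
print(round(mpixels_per_watt(new_fill, new_watts), 1))  # ≈ 151.9
```

So by those figures the single GTX 580 delivered a bit over twice the fill rate per watt of the 4870x2 setup, at about 77% of the old setup's total throughput for 35% of its power draw.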
Yes, I know about the HD 6870. I used to own one but sold it when I converted my "The Betty"
build back over to air cooling for my daughter, then gave the build to her.
Even after my recent "downsizing" of computers, etc., I still own an AMD 6670 video card, and it's in 24/7 operation in my "Bit Bucket"
build. That is a very good card, even overkill for what I use it for, but unlike many recent AMD cards it has a low TDP (wattage), so the heat it does generate can be dumped without ANY fan ... which is exactly what I wanted for a totally silent system.
As far as me saying ...
I have no idea where they got the idea that I said that in this thread.
Prior to today, the only two posts I made in this thread are ...
So anyone who says I made a post that even used the words "These amd dual gpu cards suck." is a bald-faced LIAR.
End of story.