Originally Posted by jellis142
160 SP's on 27w. I'm sold.
I'd take it back and pay the restocking fee.
The 5550 with GDDR5 is one h-e-double-hockey-sticks of a card at 35 watts.
That's not to say these cards aren't a step in the right direction. 160 SPs is the bottom floor. At least they've given up on the 80-SP trash.
But the 5550, 5570, and 5670 display video, run most web browser stuff, and do 90 percent of their tasks at their 2D clocks. If AMD would remove their overclocking limits in Catalyst, these cards would chew through 95 percent of modern games at max settings instead of only 93 percent of them.
And a G92 9800 GT can bring twice the texture fill rate to the party, but you only need that at 2560x1600, or possibly 1920x1200.
Honestly, the low-end 4 series, except for the 4670, was total trash. Now Nvidia's low-end series is total trash. The GT430 has way, way too much shader and NO PIPELINE; it can't move textures to save its life. The GT220 and 6450 are still fairly stupid and would make more sense at 28nm.
If you could set voltages and clock speeds without being overridden by the driver, you could get a 5550 or 5570 to beat it in power consumption by a mile.
I just see it as a three-card lineup: the 5550 for 1600x900 and 1440x900 monitors, the 5670 for 1920x1080 monitors, and the 5770 for 1920x1200, 2560x1440, and 2560x1600.
Now the 6 series uncores and runs shaders much faster. The shaders wipe DX9 and DX10 stuff without any trouble at all. But even uncored and running nearly twice as fast, they aren't big enough for anything but DX10.1-style minimal DX11 tessellation. So in a full-blown, mature DX11 world, the 6450 will be just as bad as trying to use a 5450 for games right now.
Until they raise the entry level to 240 SPs, it's still just a cut-the-cake game. 240/480/960/1920 should be the cake cuts, not 160/320/400/720/800/1120/1440/1600/3200.