Originally Posted by Choggs396
+1 Rep 4u... I don't care what anyone else says, for the most part the above statement IS true. I have experienced this myself...
I have an AGP system: XP 2500+ / A7N8X-E Deluxe mobo / 9600XT / 1GB DDR400 Corsair XMS. I received a VisionTek Radeon X1950 Pro 256MB AGP card as a Christmas gift and said to myself "sweeeet!". Well, it turns out it didn't do much compared to my 3+ year old Radeon 9600XT.
The card was horribly bottlenecked... my MAX frame rate went up a little in some games like HL2/CSS, Doom 3, Quake 4, etc. to around 60-70 FPS at fairly high detail, but the frame rates were typically around 10-20 most of the time - just like with the 9600XT (also at max detail). So basically I had to set the detail to med-low @ low resolutions, just like with the old card, to make the games playable.
I ended up selling the card on eBay, and obviously took a price hit since it had been bought at Best Buy (rip-off city). I felt bad selling a gift from my parents, but it was seriously a waste of money.
Long story short - If you are into newer games, AGP is simply not cost effective. It may cost more to do a complete upgrade or buy a new system, but at least the money is not being wasted.
From my experience this isn't true at all. I went from a 9600XT to a 7800 and have seen a HUGE difference in frame rates. Now, it isn't as big of a jump from the XT to an X1950, but it's still enough to make a difference.
The GPU is still rendering the same image on a 16" monitor, just smaller, so it uses less power, but it doesn't shift any of the rendering to the CPU. So saying a 16" monitor uses less GPU and more CPU is wrong.
The size of the monitor doesn't make a bit of difference; it's the resolution that matters. Typically the 20" would run at a much higher resolution and therefore use more GPU, while CPU usage would stay about the same.
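To put rough numbers on it (assuming typical native resolutions of the era, say 1280x1024 for the smaller screen and 1600x1200 for the 20"): 1280x1024 is about 1.31 million pixels per frame, while 1600x1200 is about 1.92 million - roughly 45% more pixels for the GPU to shade and fill every frame. The CPU-side work per frame (game logic, AI, physics, draw call submission) doesn't scale with resolution, which is why CPU usage stays about the same while GPU load goes up.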