Originally Posted by thegreatsquare
I did pretty well with selecting my desktop in 2008; I was going for a ~5 year build (except for the GPU) and managed it for the most part. IMO, 5 years is about the maximum for a CPU. At almost 5 years old, my Q9450 shows its age now, but the extended shelf life of the current-gen consoles lets it remain effective enough. I'm keeping it until Broadwell in 2014 [...or maybe Haswell-E], but I may swap out the current HD 5970 for an HD 88xx [or the Nvidia equivalent] for more VRAM and some energy savings. I usually plan GPU replacements every 2-3 years anyway.
Almost everything looks good to last you 5 years with GPU upgrades, save maybe the PSU. If I were buying today, I would choose the 3570K as well.
This is why people (myself included) currently have such long-lasting rigs. A lot of us are either forgetting, or are not old enough to have ever known, that this is not normal. As soon as the "next gen" consoles are released, expect any rig built more than 6 months earlier to become instantly obsolete (at least for gaming).
Based on rumors, that is within the next 2 years, at most.
AFTER that point... well, if that generation of consoles lasts another 10 freaking years, then by the time we're halfway through it, a newly built rig could probably expect a lifespan similar to the ones we have today.
Then of course, there is the second reason rigs aren't getting outdated as quickly these days: display technology completely, 100%, ground to a halt about 6 years ago. I'm not quite sure WHY consumers are so happy with crappy 1080p 60 Hz TN panels, other than perhaps the fact that we all call it "FULL HD" as if there were something either "high" or "full" about it. Even today's best graphics cards struggle to push modern games at actual high resolutions.
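For a rough sense of scale, here's a quick back-of-the-envelope sketch in Python. The resolutions listed are just common panel sizes, and treating GPU workload as roughly proportional to pixel count is a simplification, not a benchmark:

    # Illustrative pixel math: how much more work "actual high resolutions"
    # ask of a GPU compared to 1080p "FULL HD".
    # Assumption: rendering cost scales roughly linearly with pixel count.
    resolutions = {
        "1920x1080 (Full HD)": (1920, 1080),
        "2560x1440": (2560, 1440),
        "2560x1600": (2560, 1600),
        "3840x2160 (4K)": (3840, 2160),
    }

    base = 1920 * 1080  # 1080p pixel count
    for name, (w, h) in resolutions.items():
        pixels = w * h
        print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / base:.2f}x the pixels of 1080p")

Even a 2560x1600 panel is nearly 2x the pixels of 1080p, and 4K is a full 4x, which is why cards that breeze through "FULL HD" choke the moment you move up.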