Originally Posted by MXjunk127
Wow @ this thread.
I am one of those "3D designers", and I should point out that at this time video cards have absolutely nothing to do with the rendering process (although the new version of 3ds Max does in fact use CUDA for real-time previews). They are simply there to display the polygons and let you navigate your viewports (which at times is as intensive as a game, or more so). But I digress.
By the nature of what we do, we are RAM crazy. I had 8 gigs on my last machine (which is now part of my render farm), and I can honestly say it was not enough for what I do (we're talking 20-million-polygon scenes and heavy mental ray rendering at high resolutions). MOST 3D artists would be perfectly happy with 8GB, because it really is HARD to even reach 8GB of RAM. 8GB is a lot, I will be honest, and 12 is insane. However, with (pro) multitasking, multiple instances of 3D programs, rendering, etc., 12 also starts to seem small.
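To put those scene sizes in perspective, here's a rough back-of-envelope sketch in Python. The ~100 bytes per polygon figure is my own assumption for geometry plus overhead, not a 3ds Max spec; actual usage varies wildly with the modifier stack and textures:

```python
# Rough RAM estimate for a heavy 3D scene.
# Assumption: ~100 bytes per polygon (verts, normals, UVs, edit-history
# overhead). Real 3ds Max usage varies a lot per scene.
BYTES_PER_POLY = 100

def scene_geometry_gb(polys: int) -> float:
    """Return an order-of-magnitude geometry footprint in GB."""
    return polys * BYTES_PER_POLY / 1024**3

geo = scene_geometry_gb(20_000_000)
print(f"geometry alone: {geo:.1f} GB")  # roughly 1.9 GB
# Textures, render buffers, the OS, and a second 3ds Max instance
# all stack on top of that, which is how 8 GB disappears.
```

Even with generous assumptions, geometry is only part of the bill; the renderer's own buffers at high resolutions are what finally push you over.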
I have built a few i7 12GB machines for coworkers, and I have individually tested 1066, 1600, and now 1333 on 920's (non-overclocked). Unfortunately, overclocking does not fare well with rendering, so I usually play around with it but run my machine at stock clocks when I have a job (for stability). What I have found is: 1066 RAM is amazing at stock clocks, quick all around, but only at stock clocks; 1333 is great for rendering, though it lacks a little when there is a lot of math to be done; and 1600 absolutely shines in stuff like particle simulation, which I assume is because of the bandwidth. So for my personal build I went with 1333, and I bet when I do OC my machine to 1:1 it will be awesome. Regardless, 1333 is the RAM I chose.
As a footnote, the i7 920 renders somewhere around 1.8x faster than my Q6600 did, both at stock clocks, in my real-world benchmark 3ds Max file.
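For anyone wondering what 1.8x actually buys you on a long job, here's a quick calc. The 60-minute frame time is an illustrative number I picked, not my actual bench result:

```python
# What a 1.8x render-speed ratio means in practice.
# The 60 min/frame baseline is hypothetical, for illustration only.
speedup = 1.8
q6600_minutes = 60.0

i7_minutes = q6600_minutes / speedup
saved_hours_per_100 = (q6600_minutes - i7_minutes) * 100 / 60

print(f"i7 920: {i7_minutes:.1f} min/frame")            # 33.3 min/frame
print(f"saved over 100 frames: {saved_hours_per_100:.0f} hours")  # ~44 hours
```

Over a 100-frame animation that ratio is the difference of nearly two days of render-farm time, which is why the upgrade paid for itself quickly.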
Blah, back to work.