Originally Posted by RussianGrimmReaper
We will eventually. The yields are better than TSMC's 55nm process, so why wouldn't we? Intel has the fabs, and AMD is working on theirs last I heard.
Explain to me then, why is it that with every process shrink, CPUs start producing less heat? I'm specifically talking about Conroe -> Penryn and R600 -> RV670.
First off, what does the fact that Intel currently produces its processors on its own proprietary 45nm Hi-K process tell you about the reduction in power consumption of future GPUs built on TSMC's 45nm process? Nothing! How can you claim "dramatic" reductions in power consumption (I think that is what you mean when you say "heat") when you have never even seen the process demoed (does it exist yet?)?
Second, Intel's 45nm processors have a higher transistor density than any GPU, so how exactly do Nvidia / ATI "pack" more stuff into a GPU?
You are missing a lot of important information, and you are inferring causal relationships from what are really just trends.
There is a lot more to a processor's power consumption than its process node (45nm, 65nm, etc.). This should be pretty obvious: not all 65nm processors produce the same amount of heat, and you can even buy identically performing processors with different power ratings.
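To see why the node alone doesn't determine power, here's a rough back-of-the-envelope sketch (my own illustration, not from anyone's datasheet) using the textbook dynamic switching power approximation P ≈ α·C·V²·f. All the numbers (activity factor, capacitance, voltages, clocks) are made up for illustration:

```python
# Dynamic switching power of a CMOS chip is roughly:
#   P = alpha * C * V^2 * f
# (alpha = activity factor, C = switched capacitance,
#  V = supply voltage, f = clock frequency)
# So two chips on the SAME process node can draw very different
# power if they run at different voltages or clocks.

def dynamic_power(alpha, capacitance, voltage, frequency):
    """Approximate dynamic power in watts: alpha * C * V^2 * f."""
    return alpha * capacitance * voltage ** 2 * frequency

# Two hypothetical same-node parts with identical switched capacitance:
fast = dynamic_power(alpha=0.2, capacitance=6e-8, voltage=1.35, frequency=3.0e9)
slow = dynamic_power(alpha=0.2, capacitance=6e-8, voltage=1.10, frequency=2.4e9)

print(f"fast part: {fast:.1f} W, slow part: {slow:.1f} W")
# The lower-voltage, lower-clocked part draws roughly half the power,
# despite being on the same process.
```

Because voltage enters squared, even a modest voltage drop cuts power disproportionately, which is a big part of why a shrink that allows lower Vcore reduces heat, and why it's the voltage/frequency point, not the node label, doing the work. (This also ignores leakage, which matters a lot at 65nm and below.)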