Shouldn't it just not matter? I always thought CPU speed was lower-level than the OS and independent of it. I don't doubt the real-world performance difference, but theoretically shouldn't it be exactly the same? Unless I'm missing something here, like memory addressing or some concept that somehow affects it.
Executing 64-bit instructions would logically exercise more of the CPU's circuitry, and thus draw more power and generate more heat. So it would be logical that a CPU could overclock higher when it has been "dampened" by only having to execute 32-bit instructions.
I think the more important question would be: is a modern 3.5 GHz machine limited to 32-bit instructions going to perform better than a 3.4 GHz 64-bit-enabled machine?
Being able to take advantage of 64-bit instructions will generally translate to better computing performance than a slightly higher clock.
Having access to more RAM can provide far better computing performance, in workloads that can use it, than a slightly higher clock.