Originally Posted by bobbyt2012
... The first variable being temperature and the second being clock speed. Hopefully I can keep the voltage at a constant (is that a bad idea? what should I be doing voltage wise?) With the second test, I will be determining how much of a performance increase that temperature can cause.
At a specific voltage, a CPU will max out at the same speed no matter what the temperature is. The whole reason for reducing the temperature is to eliminate the heat caused by running more current through the processor, so the processor doesn't throttle (Intel CPUs) or just shut down (AMD CPUs).
You may want to look into the "cold bug" phenomenon. Certain CPUs will not boot (though they will run) at sub-zero temperatures. There are many, many other roadblocks (that you are not aware of) in getting your experiment to work, much less in being able to prove your hypotheses.
Your experiments are well designed, it's just that the outcomes are very predictable. But then again, that's the scientific method. I know, because I was a research chemist for 15 years (I have some patents to show for it too!).
May I suggest that you can eliminate a lot of expense and problems by doing your experiment on a piece of wire. Put a current through the wire and see if the current changes at different temps. Then, for part 2, measure how much current it takes to melt the wire at different temperatures. Surely not as "elegant" as what you propose, but it demonstrates the same principles at much less cost and effort, with a much greater chance of success.

Edited by billbartuska - 10/27/09 at 6:42am
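If it helps with planning the wire version, the expected trend can be sketched numerically with the standard linear resistivity model R(T) = R0 * (1 + alpha * (T - T0)) plus Ohm's law. The copper coefficient below is the usual textbook value; the wire resistance and applied voltage are made-up example numbers, not measurements:

```python
# Sketch only: predicted current through a fixed-voltage wire at
# different temperatures, using R(T) = R0 * (1 + alpha * (T - T0)).
# ALPHA_COPPER is the textbook coefficient for copper; R0 and VOLTS
# are hypothetical example values chosen for illustration.

ALPHA_COPPER = 0.00393  # per degree C, temperature coefficient for copper
T0 = 20.0               # reference temperature, deg C
R0 = 1.0                # ohms at T0 (hypothetical wire)
VOLTS = 5.0             # constant applied voltage (hypothetical)

def current_at(temp_c):
    """Current in amps through the wire at temp_c, via I = V / R(T)."""
    resistance = R0 * (1.0 + ALPHA_COPPER * (temp_c - T0))
    return VOLTS / resistance

for t in (-20, 0, 20, 40, 60):
    print(f"{t:+4d} C -> {current_at(t):.3f} A")
```

The point the numbers make is the same one as the CPU argument: colder wire means lower resistance, so more current flows at the same voltage, and the trend is predictable before you ever run the experiment.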