If you take a large enough sample (even one second's worth of data), you will get an average that very closely approximates the average you would get from any other reasonable data set.
While the average will not equal every individual data sample (obviously), it tells you much more about the CPU's capabilities than saying that it has 12 units.
Saying that changing data makes an average useless is ludicrous. That is the whole purpose of an average: to let you view a set of varied data without having to deal with each individual case.
I have no idea where you're coming from. Of course individual data will vary from the mean; that's why it's the mean and not the only possible value.
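To illustrate the point, here is a minimal sketch using made-up per-cycle instruction counts (the values are hypothetical, not measured): two large independent samples of the same varied workload produce nearly identical averages, even though no single sample is required to equal that average.

```python
import random
from statistics import mean

random.seed(0)

# Hypothetical per-cycle instruction counts: varied data, not a constant.
sample_a = [random.choice([8, 10, 12, 14, 16]) for _ in range(100_000)]
sample_b = [random.choice([8, 10, 12, 14, 16]) for _ in range(100_000)]

avg_a = mean(sample_a)
avg_b = mean(sample_b)

# With enough data, the two means agree closely, even though most
# individual cycles differ from the mean.
print(avg_a, avg_b)
```

The averages land within a small fraction of each other, which is exactly why a mean is useful for summarizing varied data.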
The reason why an average can never be determined is the huge number of operations, relative to the clock frequency, that occur per cycle. On my processor, 3.21×10⁹ is equivalent to the processing power per logic cycle, so for each cycle 3.21×10⁹ "averages" will occur. As there are far too many instructions being processed, any average that could be determined would be too fast for the human mind to understand and contemplate. The average that could be used would probably occur in something like 1×10⁻⁴⁰ of all clock cycles in one period. Therefore, logically, you can never arrive at an average, because it will never occur as an integer.
You may not understand this; I've been taught about averages with regard to processor computation.
Because the C2D can do 12 ops and has different instruction sets (SSE4?), it is a much better processor. I'm trying to 'make up' for this with a faster clock rate.
Core 2 Duo does not contain SSE4; it contains Supplemental SSE3 (SSSE3).
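If you want to verify this yourself on Linux, a quick sketch is to read the CPU flags from /proc/cpuinfo (this assumes a Linux system; the flag names `ssse3`, `sse4_1`, and `sse4_2` are how the kernel reports Supplemental SSE3 and the SSE4 additions that Core 2 Duo lacks):

```python
# Check which SSE variants this CPU reports via /proc/cpuinfo (Linux only).
with open("/proc/cpuinfo") as f:
    flags = set()
    for line in f:
        if line.startswith("flags"):
            # "flags : fpu vme ... ssse3 ..." -> set of feature names
            flags = set(line.split(":", 1)[1].split())
            break

for feature in ("sse3", "ssse3", "sse4_1", "sse4_2"):
    print(feature, "yes" if feature in flags else "no")
```

On an actual Core 2 Duo this prints "yes" for `ssse3` but "no" for `sse4_1` and `sse4_2`; the 45 nm "Penryn" refresh is the generation that added SSE4.1.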