I have a hypothetical question. Assume there are two CPUs with the same microarchitecture, the same number of transistors per core, the same L1/L2/L3 cache sizes, and so on. The only difference is that the first CPU has cores twice as fast, while the second CPU has twice as many cores. For example, 4x 4GHz cores versus 8x 2GHz cores.

Is it correct to assume that the CPU with more cores performs better than the CPU with fewer cores in a heavily multithreaded environment, even though the sum of the clock speeds of all the cores is the same? If so, what are the reasons?

I guess one of the reasons is context switching: the CPU with fewer cores has to time-slice more threads per core, and the scheduling itself takes time. Am I correct? In addition, modern multi-core CPUs tend to have an L2 cache per core, so in total there is more fast SRAM available. Are there any other reasons?
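To make my context-switching guess concrete, here is a toy model I put together. The numbers are pure assumptions for illustration (a 5 µs switch cost and a 10 ms scheduler quantum, both made up), not measurements of any real CPU:

```python
def useful_ghz(cores, ghz_per_core, threads,
               switch_cost_us=5.0, quantum_ms=10.0):
    """Toy estimate of useful aggregate clock (GHz) for a workload
    of `threads` runnable threads. Assumes perfect scaling except
    for context-switch overhead when cores are oversubscribed."""
    total = cores * ghz_per_core
    if threads <= cores:
        # Every thread gets its own core: no time-slicing needed.
        return total
    # Oversubscribed: each core pays one switch per scheduler quantum.
    switches_per_sec = 1000.0 / quantum_ms
    overhead_frac = (switch_cost_us * 1e-6) * switches_per_sec
    return total * (1.0 - overhead_frac)

# 8 runnable threads on each hypothetical CPU:
print(useful_ghz(8, 2.0, 8))  # 8x 2GHz: threads fit, full 16.0 GHz
print(useful_ghz(4, 4.0, 8))  # 4x 4GHz: oversubscribed, slightly under 16
```

With these made-up constants the overhead is tiny, which matches my intuition that context switching alone would only be a small part of the answer; the point of the sketch is just that the fewer-core CPU starts paying it sooner (as soon as threads exceed 4 instead of 8).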
post #1 of 5
10/4/15 at 6:14pm