Originally Posted by Manyak
It has nothing to do with optimization, other than being able to run multiple threads.
Like you said, secondary "virtual" cores are activated when the primary core is sitting idle. Primary cores sit idle whenever the CPU has to wait on something - an instruction that takes more than 1 clock cycle to complete, a cache miss, more data from memory, etc. This happens most frequently with data-intensive workloads, since those require the most cache and memory accesses. The things you listed, like video encoding and decompressing files, all have that in common.
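To make that concrete, here's a toy simulation of the idea (a rough sketch I made up for illustration - real SMT hardware shares pipeline resources in far more complicated ways than this simple round-robin model). Two instruction streams run on one simulated core; whenever one stream stalls (e.g. on a cache miss), the core issues work from the other stream instead of sitting idle:

```python
# Toy model of SMT latency hiding. Each instruction is a pair
# (issue_cycles, stall_cycles): the core is busy for issue_cycles,
# then that stream must wait stall_cycles before its next instruction.

def run_serial(streams):
    """Cycles to run the streams one after another (no SMT)."""
    return sum(issue + stall for s in streams for issue, stall in s)

def run_smt(streams):
    """Cycles with SMT: stall cycles in one stream are filled
    with work from another stream when one is ready."""
    idx = [0] * len(streams)     # next instruction per stream
    ready = [0] * len(streams)   # cycle at which each stream is ready again
    cycle = 0
    while any(i < len(s) for i, s in zip(idx, streams)):
        issued = False
        for t, s in enumerate(streams):
            if idx[t] < len(s) and ready[t] <= cycle:
                issue, stall = s[idx[t]]
                idx[t] += 1
                cycle += issue            # issuing occupies the core
                ready[t] = cycle + stall  # stream then waits out its stall
                issued = True
                break
        if not issued:
            cycle += 1                    # nothing ready: core truly idle
    return max(cycle, max(ready))

# Two memory-bound streams: cheap to issue (1 cycle) but with long
# stalls (10-cycle "cache misses") that leave plenty of idle time.
a = [(1, 10)] * 8
b = [(1, 10)] * 8
serial = run_serial([a, b])  # 2 * 8 * (1 + 10) = 176 cycles
smt = run_smt([a, b])
print(serial, smt)           # SMT finishes in far fewer cycles
```

With compute-bound streams (long issue times, short stalls) the two numbers come out nearly the same, which is exactly why data-intensive workloads are where Hyper-Threading shows the biggest gains.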
As I understand it, an efficiently coded programme ought to leave the CPU idle less often than a poorly coded one, but that's third-hand information, so you may well be right.
Originally Posted by Hatfieco
Sandy Bridge is to Ivy Bridge what the P55 chipset was to the X58. I would do the 1366 just for the sheer fact you get so much more memory bandwidth through it. When it comes to encoding and actually using the processor, that huge memory bandwidth comes into play. Not only that, but if you plan to SLI, I think Sandy only has 16 PCIe lanes where the 1366 has 32 and allows a full x16/x16 dual-card setup. Can't go wrong with either. Just depends on whether you want to hop on the Sandy bandwagon like everyone else lol. Just remember Ivy is right around the corner, as is Bulldozer from AMD.
Not true. Ivy Bridge is a die shrink of the current architecture, not a new one - you have it confused. The new Intel-Extreme-only socket due out (LGA 1356) will take Ivy Bridge chips, but so will the existing LGA 1155 socket.
The thing about LGA 1356 is that it will cost a fortune to purchase, being Intel Extreme only (whose chips have historically been far worse value for money than chips even one step down the price OR performance ladder).
The bandwidth might be useful, but benchmarks show Sandy Bridge beating the older architectures the majority of the time. You cannot refute this. Even the "old" i7 980X, with its six cores, struggles to keep up with the newer i7 2600K at times.
The average person will not be looking at buying a 980X anyway, so you're wrong here.
Also, it has been shown time and again that the drop from x16 to x8 bandwidth does not play a significant role with a single-GPU card. I have not seen anything comparing them with the new GTX 590 or the 6990, but previously it gave a 0.5-3% performance decrease in some (but not all) games.
Typically, the decrease is either non-existent, or approaching 1%. If you don't believe me, I can go and find the benchmarks, but it was my belief that this was common knowledge.
I'm not saying that the 1366 socket is useless, only that the 1155 socket is better for what this guy is after. According to recent sources, Ivy Bridge is expected to be up to 20% more efficient, up to 20% more powerful, or some trade-off between the two. Bulldozer is an almost completely unknown factor.
Why wait when he can get this now? Those are still a while away, and whatever happens, the i5 2500K or the i7 2600K will end up being among the best processors in the world for the next several years, even if they end up relegated to the bottom end of the top ten after the next architectural release.