Originally Posted by UnrulyCactus
I think it's sort of amazing that we're getting to the point that we're having some trouble getting software to fully utilize and take advantage of CPUs. We're doing a good job with GPUs, especially now that 4K TVs are out, but with CPUs, I feel like there's a slight software stagnation. Am I right? I'd like to hear thoughts.
Absolutely not. Most everyday consumer software simply isn't that intensive. It waits for user input, has a burst of activity, and then waits again. Even most games follow this usage pattern. A rough sketch of that pattern:
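Here's a minimal Python sketch of that wait-burst-wait loop (the `handle` function and its work are hypothetical, just stand-ins for whatever the app actually does):

```python
import time

def handle(command: str) -> str:
    # Hypothetical per-request work: a brief burst of computation.
    return command.upper()

while True:
    cmd = input("> ")  # blocks here; the OS parks the thread at ~0% CPU
    if cmd == "quit":
        break
    start = time.perf_counter()
    print(handle(cmd))  # short burst of CPU activity
    print(f"burst took {time.perf_counter() - start:.6f}s")
```

While it sits at `input()`, the process uses essentially no CPU at all, no matter how many cores you throw at it.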
Then there's genuinely intensive software like weather modeling, video editing, hashing, oil exploration, HFT, nuclear modeling, etc. that will eat CPU cores indefinitely. See the sketch below.
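For contrast, here's a hedged sketch of the hashing case: a CPU-bound worker pinned to every core via `multiprocessing`. The chain length (5,000,000) is an arbitrary number picked for illustration; real workloads just never stop.

```python
import hashlib
import os
from multiprocessing import Pool

def hash_chunk(seed: int) -> str:
    # Chained SHA-256: each iteration depends on the last, so this is
    # pure, sustained CPU work with no waiting on I/O.
    digest = seed.to_bytes(8, "little")
    for _ in range(5_000_000):
        digest = hashlib.sha256(digest).digest()
    return digest.hex()

if __name__ == "__main__":
    cores = os.cpu_count()
    with Pool(cores) as pool:  # one worker per core
        results = pool.map(hash_chunk, range(cores))
    print(f"finished {len(results)} chunks across {cores} cores")
```

Run that and every core sits at 100% until it finishes. That's the class of software that actually "fully utilizes" a CPU, and it always has.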