The reason many games show no difference between an i5 and an i7 with HT is that many games don't even utilize more than four cores, let alone fully load the four they can already use. Very few games use more than four nowadays. Some engines, like Frostbite 2 and 3, can use up to eight threads: in a 64-player MP match on an FX-8350, all eight threads were shown to be utilized constantly by the game, though only one or two of them were truly being taxed. HT doesn't scale as well as "real" cores do, at least not on Intel chips (IBM's implementation of SMT, on the other hand, shows nice gains in many cases), but for games that support more than four threads, any extra threads would help. This is why i3 chips have shown actual improvements in games that use four cores, while you don't see such improvements going from an i5 to an i7, because an i5 already has four physical cores.
Also, not all MP games are CPU monsters. Even BF3 only gets crazy around the 40+ player mark, and not many games even go beyond 16 or 24 players, so it's not that big of a concern.
I really wish that the myth regarding CPU performance would stop being repeated. CPU does matter, especially with 120Hz gaming / SLI / CF.
It's mostly a counter-myth. The usual myth is that you can only get acceptable framerates in any game at any resolution with some BS $1000 Intel Extreme CPU. That's nonsense, so in response people repeat the opposite myth: that the CPU doesn't matter at all.
The truth is that strong CPUs sometimes matter, but most of the time they don't. People with insane triple-monitor, 9000k-resolution, 3D setups make up a very, very, VERY tiny portion of the overall gaming populace. Even the Steam survey shows that most people are on old hardware, and the most-used hardware is far from top-of-the-line.
For most people, who game at around 1080p, any modern multi-core (preferably a quad-core) with a decent clockspeed, combined with a decent mid-range GPU, will run any game out there at high or max settings at 60fps. Another myth is that a crapload of games rely heavily on single-threaded performance. There are some, indeed, but they are outnumbered by the games that either don't require monster CPU specs or are properly multi-threaded.
One example is Planetside 2, which is known for slowing down in heavier gunfights even on Intel-powered rigs with beastly GPUs. A recent interview explained that the reason is that the game mainly runs on one dominant thread, and the devs admitted it was coded to work best on Intel CPUs because of their stronger single-threaded throughput. Since SOE is bringing the game to the PlayStation 4, which uses a custom APU with eight x86 Jaguar cores clocked at 1.6 GHz, they decided they had to re-code the game to make it truly multi-threaded. They said this work will come to the PC version in a later update, so everyone, regardless of CPU, will get a very noticeable speed boost.
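To illustrate the difference (this is a toy sketch of my own, not SOE's actual code): the single-dominant-thread design funnels every entity update through one thread, so per-frame cost grows with the size of the fight, while a multi-threaded design partitions the same independent updates across a worker pool so extra cores can shorten the frame. A real engine would do this with native threads in C++; Python's `ThreadPoolExecutor` is used here purely to show the structure, and `update_entity` is a made-up stand-in for per-entity simulation work.

```python
from concurrent.futures import ThreadPoolExecutor

def update_entity(state):
    # hypothetical stand-in for per-entity work (physics, AI, etc.)
    return state * 2 + 1

def tick_single_threaded(entities):
    # everything runs on one dominant thread: a 96-player battle
    # means one thread grinding through every entity each frame
    return [update_entity(e) for e in entities]

def tick_multi_threaded(entities, workers=4):
    # independent updates are spread across a pool, so additional
    # cores (or hardware threads) can share the per-frame load
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(update_entity, entities))

entities = list(range(1000))
# both designs compute the same frame; only the scheduling differs
assert tick_single_threaded(entities) == tick_multi_threaded(entities)
```

The catch, as the devs found, is that this only works when the per-entity updates really are independent; untangling shared state is exactly the re-coding effort they described.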
I'm actually glad that both Sony and MS went with x86-64 based architectures for the consoles, because it will force devs to stop doing such lazy, sloppy coding all the time and finally make more games and engines that support multi-threading. I'm not saying it's easy, because many times it isn't, and some programs wouldn't benefit from multi-threading even if it were implemented. However, PC games have long been saddled with sloppy port jobs and unoptimized code, even for games that didn't originate on a console. It needs to stop, and I'm glad there will be a greater push towards parallelization in the near future.