SSD caching is arguably a less tangible benefit. Intel relies heavily on "smart" caching algorithms that deliberately ignore large sequential data streams and the access patterns typical of anti-virus scans, for example. Anything the software guesses will only be touched once doesn't get moved to the SSD. The emphasis is placed on application, boot, and user data, and the cache is non-volatile, meaning its contents carry over between reboots. Unfortunately, between our Z68 preview and this piece, the only clear gain was game level-loading. Even with the caching-optimized Intel SSD 311, we have a hard time making a strong case for caching. I'd still rather jump from hard drives to SSDs with a more manually controlled storage hierarchy: performance-sensitive data lives exclusively on a large-enough SSD, while less sensitive data is housed on the hard drive.
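To make the admission behavior concrete, here is a toy sketch in the spirit of what's described above. This is not Intel's actual Smart Response algorithm (which is proprietary); the threshold and the promote-on-second-touch rule are illustrative assumptions only.

```python
# Toy cache admission policy: skip large sequential streams, and only
# promote a block to the SSD cache once it shows reuse (a second touch).
# NOT Intel's real algorithm -- purely an illustration of the idea.

SEQUENTIAL_THRESHOLD = 64  # hypothetical cutoff, in blocks, for "large sequential"

class ToyCache:
    def __init__(self):
        self.seen_once = set()  # blocks touched exactly once so far
        self.cached = set()     # blocks promoted to the SSD cache

    def access(self, start_block, length):
        # Ignore big sequential streams (media files, AV scans, etc.).
        if length >= SEQUENTIAL_THRESHOLD:
            return
        for block in range(start_block, start_block + length):
            if block in self.seen_once:
                # Second touch: the block shows reuse, so promote it.
                self.cached.add(block)
            else:
                self.seen_once.add(block)

cache = ToyCache()
cache.access(0, 128)   # large sequential read: never considered for caching
cache.access(1000, 4)  # first touch: remembered, but not yet cached
cache.access(1000, 4)  # second touch: promoted to the cache
print(sorted(cache.cached))  # [1000, 1001, 1002, 1003]
```

The point of the filter is exactly what the paragraph describes: a movie streamed once never displaces the boot and application data that benefits from repeat hits.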
See, most SSDs offer better read and write performance than magnetic storage. But in this caching scheme, when you write to the hard drive you're writing to the SSD at the same time, so you're still limited to the disk's write speed. The benefit of caching is really one of convenience: you can set up a small drive like the SSD 311 and use your system as if it weren't even there, enjoying a benchmarkable speed-up in certain read-oriented workloads. So long as you don't handicap your storage subsystem with a cache that writes slower than your hard drive, performance is either a wash or slightly better.
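A back-of-the-envelope model shows why writes can't get faster in this arrangement, and can even get slower with a sluggish cache drive. The throughput numbers below are illustrative assumptions, not measurements of any particular drive.

```python
# Write-through caching in miniature: every write must land on both
# devices, so write throughput is gated by the slower of the two.
# Reads that hit the cache run at SSD speed. Numbers are made up.

HDD_WRITE_MBPS = 120  # hypothetical hard-drive sequential write speed
SSD_WRITE_MBPS = 100  # a slow cache SSD would actually drag writes down
SSD_READ_MBPS = 250
HDD_READ_MBPS = 120

def write_throughput():
    # Data goes to both devices in tandem; the slower one sets the pace.
    return min(HDD_WRITE_MBPS, SSD_WRITE_MBPS)

def read_throughput(cache_hit):
    # Cache hits are served at SSD speed; misses fall back to the disk.
    return SSD_READ_MBPS if cache_hit else HDD_READ_MBPS

print(write_throughput())     # 100 -- slower than the bare hard drive
print(read_throughput(True))  # 250 -- the benefit shows up on cached reads
```

With these particular numbers the cache SSD is the write bottleneck, which is the "handicap" scenario the paragraph warns about; swap in a faster cache drive and writes become a wash rather than a loss.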
On the other hand, if you're able to manage your own data intelligently, it's far better to put your operating system and apps on the solid-state drive, then move the movies and music onto disk. That data wouldn't get cached by Intel's technology anyway, given its size, so you're not losing any performance by going the "boot drive" route.