I actually disagree for both games. Crysis 1 was fun and allowed more freedom and openness in how you went about your job. Metro 2033 was VERY closed and had small maps but it told a story and set the atmosphere better than any game I can think of.
But what Crytek obviously did in Crysis 2 was take out some of that openness (although not NEARLY as closed-in as Metro or Half-Life 2) in favor of trying to tell a more tailored story.
Your primary issue is not a problem for developers. Their first, second and third concerns are sales of GAMES, not GPUs. Nvidia or AMD selling more or fewer GPUs won't help them in any way; it will only force them to increase their budgets as they add MORE staff to accommodate MORE tech.
I've already articulated this 2 or 3 times now, so pay attention. If a heavily tessellated benchmark like Unigine Heaven, with no AI, no enemies running around, or ANYTHING happening other than you looking at the scenery, causes almost every single GPU to max out in the 20-fps range, what do you think would happen with 5-10 soldiers shooting at you as you shoot back and run for cover? Slideshow. So no, a simple cheap 5770 or 450 won't do, because tessellation demands far more from a GPU.
Even with my 5850 O/C'd, Metro 2033 (super enclosed and small maps) turns into a slideshow when I throw a grenade and the particles spread out from the blast.
And as we've seen with Crysis 1, games that are pretty may look nice in screenshots, but slideshows detract from actually playing them.
And who cares about tessellation? All it does is smooth out models (it's a lot less intensive just to throw on some AA), and it's not the only thing that DX11 brings to the table.