Originally Posted by Zero4549
I'd love to hear your logic on that one.
The whole point of moving to a new DirectX API version is to allow for greater efficiency and to standardize support for new features. It should never be "harder" for a properly coded game to run visually identical graphics in DX11 than in DX10 or DX9 - it should be "easier" due to increased efficiency.
Furthermore, if a game is not written to use DX11, the graphics card can't force it to. A DX11 card will simply default to whatever version of the API the game is using. If the game is a DX9 game, it will be running in DX9.
Now, if your claim is that writing the exact same code in DX11 could make a game harder to run than that same code in DX9, OK, MAYBE. That was certainly true for (MUCH) older versions of the API, owing to the significant overhead of new instructions. I doubt it would make any meaningful difference in this situation, however. Besides, any programmer who actually did that in the first place would have to be an idiot.
The only thing I can possibly imagine being somewhat true is that you were referring to a situation like Crysis 1, where running the same graphics settings in DX10 was slower than in DX9. Even that is a moot point, as they didn't ACTUALLY have the same settings. Switching to DX10 automatically toggled on several additional graphical features that could never be turned on in DX9, regardless of the settings you picked. It's not that DX10 ran slower; it's that "high" in DX10 was doing enough extra work to count as "ultra ultra high" by DX9 standards, even if you didn't necessarily notice all of it visually.
I know this, but generally the games you want to run look a lot better in DX11 - Dirt 2, for example.
BF3 is a lot easier on my cards if I run it in DX10, but then I'm forced to have everything on the lowest settings.
EDIT: Yes, the Crysis example was what I meant.