Originally Posted by Aparition
So I finished the 2033 playthrough and loaded up LL. Defaulted settings to Very High, no V-Sync, no SSAA. The first playable scene ran at 90+ FPS (as high as 110); for comparison, 2033 managed 48 FPS on High settings.
I did not run the benchmark, but I expect pretty good FPS for Last Light.
Probably has to do with this:
Ordinarily, this level of detail would require gigabytes of system memory and GPU VRAM, but thanks to a highly efficient streaming system, Last Light’s world uses less than 4GB of memory and less than 2GB of VRAM, even at 2560x1440 with every setting enabled and maxed out. More impressive still, there are no streaming texture errors or instances of textures visibly popping from low to high quality as the player moves through the world.
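The article does not describe how the streaming system works internally, but the core idea of keeping a detailed world under a fixed VRAM budget is usually some form of budgeted cache. A minimal sketch, assuming a simple LRU eviction policy (the class and texture names here are hypothetical, not 4A's actual code):

```python
from collections import OrderedDict

class TextureStreamingCache:
    """Hypothetical sketch: an LRU cache with a byte budget that evicts
    the least-recently-used textures when a new load would exceed it."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.cache = OrderedDict()  # texture_id -> size in bytes

    def request(self, texture_id, size_bytes):
        if texture_id in self.cache:
            self.cache.move_to_end(texture_id)  # mark as recently used
            return
        # Evict the oldest entries until the new texture fits the budget.
        while self.used + size_bytes > self.budget and self.cache:
            _, evicted_size = self.cache.popitem(last=False)
            self.used -= evicted_size
        self.cache[texture_id] = size_bytes
        self.used += size_bytes

cache = TextureStreamingCache(budget_bytes=2 * 1024**3)  # 2 GB VRAM budget
cache.request("tunnel_wall_diffuse", 64 * 1024**2)
cache.request("tunnel_wall_normal", 64 * 1024**2)
print(cache.used)  # 134217728 (128 MB in use, well under budget)
```

A real engine would stream per-mip-level and prefetch based on the player's position to avoid visible pop-in, but the budget-and-evict loop is the part that keeps VRAM usage bounded.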
Similarly, CPU usage is fine-tuned for maximum performance: tasks such as physics simulation and audio playback are dispatched to any available CPU thread, rather than being assigned in advance (physics to thread two, sound to thread three, and so on). This ensures that every task is completed as quickly and efficiently as possible, and that every ounce of CPU power is used in the most demanding moments, improving performance by a considerable margin compared to traditionally-threaded games and engines.
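The scheduling described above is the standard task-pool pattern: work items go into a shared queue and whichever worker thread is idle picks up the next one. A minimal sketch using Python's standard thread pool (the task names are made up for illustration; 4A's engine is C++ and far more sophisticated):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-frame tasks; the point is that neither is pinned
# to a particular thread.
def simulate_physics(frame):
    return ("physics", frame)

def mix_audio(frame):
    return ("audio", frame)

with ThreadPoolExecutor(max_workers=4) as pool:
    # Any idle worker picks up whichever task is next in the queue,
    # so no thread sits empty while another is overloaded.
    futures = [pool.submit(task, frame)
               for frame in range(3)
               for task in (simulate_physics, mix_audio)]
    results = [f.result() for f in futures]

print(sorted(results))
```

Compared with hard-wiring "physics on thread two, sound on thread three", this keeps all cores busy even when one kind of work dominates a frame.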
Of course, it would be remiss of us to ignore 2033’s performance, which could certainly be classified as “unoptimized” when maxing out each of the game’s settings. As stated earlier, 2033 was 4A’s first game, and was built on a limited budget. Furthermore, one could argue that 2033 was ahead of its time, featuring settings and technology that were more than the then-fastest GPU, the GeForce GTX 480, could handle. For Last Light, 4A focused on optimization from the outset, rewriting code to be as efficient as possible, enabling Last Light to run significantly faster than 2033, even with its new, more advanced features.
Underlining this drive for optimization is the removal of DirectX 9 and DirectX 10 from the in-game menus; Last Light renders via DirectX 11 automatically. Extensive testing found the older APIs to be up to 15% slower than DirectX 11, even though the DirectX 11 render path automatically enables extra features. As gamers without an interest in tech will likely never learn of these render speed improvements, 4A defaults the engine to DirectX 11 to ensure that players receive the maximum level of performance at all times.