(In reply to the previous news thread; TBH, the ~20% improvement in Apex alone is newsworthy...)
I was reading on AnandTech that the difference works like this:
Previously the minimum pre-rendered frames setting was 1, but if you go back a decade or so the drivers did at one point offer a '0' option for max pre-rendered frames. Not sure whether there were compatibility issues or NVIDIA just wanted something to keep in their back pocket for marketing purposes, but in any case the new 'Ultra' setting = the old '0', 'On' = 1, and 'Off' = 1-3 depending on the developer/game.
It would be nice for someone to do some in-depth testing on how this setting ultimately affects frametimes and framepacing. Most people would assume frametime matters more than framepacing, but if it introduces other variances such as inconsistent mouse movement, perceivable stutters, etc., it may not be worth it at all.
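To put rough numbers on why the queue depth matters, here is a toy back-of-envelope sketch. The latency model (each queued frame adds about one frame-time of input lag when the GPU is the bottleneck) and the fps figures are my own assumptions for illustration, not anything from the AnandTech article or NVIDIA:

```python
# Toy model: when the GPU is the bottleneck, each frame sitting in the
# pre-rendered queue adds roughly one frame-time of input-to-display latency.

def queued_latency_ms(fps: float, queue_depth: int) -> float:
    """Approximate extra latency contributed by the pre-rendered frame queue."""
    frame_time_ms = 1000.0 / fps
    return queue_depth * frame_time_ms

# 0 = 'Ultra' (old '0'), 1 = 'On', 3 = upper end of the old 'Off' range
for depth in (0, 1, 3):
    print(f"queue depth {depth}: "
          f"+{queued_latency_ms(60, depth):.1f} ms at 60 fps, "
          f"+{queued_latency_ms(144, depth):.1f} ms at 144 fps")
```

At 60 fps a full 3-frame queue works out to roughly 50 ms of added lag, which is the kind of difference that should show up in the sort of testing described above.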
I think to get the most out of something like this you would want to cap your framerate at a level your GPU can comfortably sustain with something like RivaTuner, and then set max pre-rendered frames to 0 ('Ultra'). That way you know you are being served frames within a predictable window, and also that none of them are being held back for pacing (which you don't need, since the framerate is already being artificially capped). If you are pinning your GPU at 100%, I would imagine this setting could become an issue.
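Here is the same reasoning as a minimal sketch; the GPU frame-time and cap values are made up for illustration, not measurements from any particular system:

```python
# Rough sketch of the "cap below what the GPU can sustain" reasoning.
# Both numbers are hypothetical; plug in your own frametimes from RivaTuner.

gpu_frame_time_ms = 5.5            # assumed GPU render time (~180 fps uncapped)
cap_fps = 141                      # assumed limiter, just under a 144 Hz refresh
cap_frame_time_ms = 1000.0 / cap_fps

if gpu_frame_time_ms < cap_frame_time_ms:
    # The GPU finishes each frame before the limiter releases the next one,
    # so the pre-rendered queue never fills and 'Ultra' costs nothing.
    print("GPU has headroom: queue stays empty, latency stays low")
else:
    # The GPU can't keep up with the cap; frames start to queue and you're
    # back to fighting a full pipeline (the 100%-GPU case mentioned above).
    print("GPU is the bottleneck: frames queue up and pacing/latency suffer")
```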