Well that is not exactly what they are doing.
Nvidia runs the game at a supersampled, higher-than-native resolution, and based on that reference data it learns where and when it doesn't need to take full-resolution samples.
It isn't just rendering something at 1080p and upscaling it to 1440p: it first renders at higher than 1440p, checks where and when it can scale down, uses its "AI" to predict the anti-aliasing requirements, and then runs AA across both the upscaled and downscaled textures.
The more times it runs the same scenarios, the better the profile becomes and the better DLSS performs, until it's roughly as good as regular AA.
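As a rough illustration of that idea (a toy sketch only, not Nvidia's actual pipeline: the data, the simple linear model, and every name here are made up for the example), you can think of it as fitting an upscaler to pairs of low-res inputs and supersampled references, so that the learned mapping does better than naively stretching the low-res image:

```python
import numpy as np

rng = np.random.default_rng(0)

def downsample(x):
    # 2x downsample by averaging adjacent sample pairs
    # (a stand-in for rendering at a lower resolution)
    return x.reshape(-1, 2).mean(axis=1)

# Training pairs: low-res inputs -> high-res targets, loosely analogous
# to learning from supersampled reference frames offline.
n_train, hi = 500, 8
targets = rng.normal(size=(n_train, hi)).cumsum(axis=1)  # smooth toy "frames"
inputs = np.array([downsample(t) for t in targets])

# Fit a linear upscaler W (4 samples -> 8 samples) by least squares.
W, *_ = np.linalg.lstsq(inputs, targets, rcond=None)

# Evaluate on frames never seen during training.
test_targets = rng.normal(size=(200, hi)).cumsum(axis=1)
test_inputs = np.array([downsample(t) for t in test_targets])

mse_learned = np.mean((test_inputs @ W - test_targets) ** 2)
mse_naive = np.mean((np.repeat(test_inputs, 2, axis=1) - test_targets) ** 2)
# The learned upscaler typically beats plain sample repetition.
print(mse_learned, mse_naive)
```

The real thing uses a deep network trained on vast sets of supersampled frames rather than a least-squares filter, but the principle is the same: the reconstruction quality comes from the reference data the model was trained on, not from the low-res frame alone.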
You can read a bit about it here
(one minute of Google searching...).
And that happens with every game or benchmark.
Once the system has run enough times to predict the DLSS usage, it knows when to apply it even in a dynamic scenario.
Of course, on a benchmark, where you run the same thing over and over and over again, you get the best results. But do the same with a game and you might also see better results.