Originally Posted by rv8000
It doesn't work that way.
It absolutely works this way. Performance increases from GPU to GPU are measured relative to the previous GPU, not against some arbitrary resolution or framerate target. We could be talking about 640x480 or 8k; the trend is that doubling from 60 to 120fps at a given resolution, in the demanding titles of the day, takes 6-8 years.
Trends can change. But I think this trend will continue. And if the trend changes, it will have nothing to do with the fact that 4k is a higher resolution than 1440p.
--EDIT-- Also, I don't think you are understanding the jumps in GPU architecture performance correctly...
GTX 285 (big Tesla) --> GTX 580 (big Fermi): about a 65% performance increase
GTX 580 --> GTX 780 Ti (big Kepler): about a 65% performance increase
GTX 780 Ti --> GTX 980 Ti (big Maxwell): about a 43% performance increase
GTX 980 Ti --> GTX 1080 Ti (big Pascal): about an 85% performance increase
GTX 1080 Ti --> RTX 2080 Ti (big Turing): about a 38% performance increase
This is reasonably consistent over time. Maxwell's jump was slightly smaller because it stayed on the same process node as Kepler; Turing's was smaller because of all the die area spent on ray tracing hardware, and because it was only sort of on a new process (12nm was a refined 16nm).
If we average this all out, that's roughly a 60% improvement per generation. Two to three generations after Ampere - my stated timeframe for when I expect 4k120 to be a thing - means three to four generations of compounding past Turing, so those GPUs should land at roughly 4-7x the 2080 Ti in relative gaming performance (1.6^3 ≈ 4.1, 1.6^4 ≈ 6.6). It's very reasonable to assume that GPUs several times more powerful than what's available in consoles will be able to hit the standard target resolution of the generation (4k) at high framerate (120fps).
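If anyone wants to check the arithmetic, here's a quick back-of-the-envelope sketch in Python. The per-generation gains are just the numbers from the list above; the 3-4 generation range is my assumption about the timeframe, not anything official:

```python
# Average the per-generation gains listed above, then compound
# that rate forward from the 2080 Ti.

# Per-generation performance increases from the list above (as fractions).
gains = {
    "GTX 285 -> GTX 580":        0.65,  # Tesla  -> Fermi
    "GTX 580 -> GTX 780 Ti":     0.65,  # Fermi  -> Kepler
    "GTX 780 Ti -> GTX 980 Ti":  0.43,  # Kepler -> Maxwell
    "GTX 980 Ti -> GTX 1080 Ti": 0.85,  # Maxwell -> Pascal
    "GTX 1080 Ti -> RTX 2080 Ti": 0.38, # Pascal -> Turing
}

avg = sum(gains.values()) / len(gains)
print(f"average per-generation gain: {avg:.0%}")  # ~59%

# 2-3 generations after Ampere = 3-4 generations after Turing (2080 Ti).
for gens in (3, 4):
    multiplier = (1 + avg) ** gens
    print(f"{gens} generations past the 2080 Ti: ~{multiplier:.1f}x")
# Prints ~4.0x and ~6.4x, i.e. the 4-7x range above.
```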