I'll try to explain it more clearly, step by step, so even the mathematically challenged can follow.
We want to compare the requirements to drive two screens at their full resolution and full refresh rate.
1. 3440x1440, 100Hz refresh rate. We want to feed this 100fps.
2. 2560x1440, 144Hz refresh rate. We want to feed this 144fps.
Screen 1 has 3440x1440 = 4,953,600 pixels. Screen 2 has 2560x1440 = 3,686,400 pixels.
Does that tell us anything useful? Not really. If they had the exact same refresh rate, this would be enough to compare, but they don't.
This is where a tiny bit of simple physics comes in. https://en.wikipedia.org/wiki/Hertz
Frequency, measured in Hz, describes how often something occurs each second.
This means that screen 1 accepts a new frame at most 100 times per second, while screen 2 accepts a new frame at most 144 times per second.
One frame on screen 1 is 4,953,600 pixels, and we need to feed it a new frame 100 times per second to max it out (running at 100 frames per second, 100Hz).
Pushing 4,953,600 pixels 100 times per second gives a total of 495,360,000 pixels/second.
One frame on screen 2 is 3,686,400 pixels, and we need to feed it a new frame 144 times per second to max it out (running at 144 frames per second, 144Hz).
Pushing 3,686,400 pixels 144 times per second gives a total of 530,841,600 pixels/second.
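If you want to sanity-check the arithmetic, here is a throwaway Python snippet redoing the same multiplications (the variable names are just mine):

```python
# Pixels per second needed to feed each screen at its maximum refresh rate.
screen1_pps = 3440 * 1440 * 100   # 3440x1440 @ 100Hz
screen2_pps = 2560 * 1440 * 144   # 2560x1440 @ 144Hz

print(screen1_pps)  # 495360000
print(screen2_pps)  # 530841600
```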
Now we have finally found common ground, and the values can be compared:
To drive these screens at max fps/Hz:
1. 3440x1440, 100Hz refresh rate, 100fps requires 495,360,000 pixels/second.
2. 2560x1440, 144Hz refresh rate, 144fps requires 530,841,600 pixels/second. (Roughly 7% more, no big difference, which makes perfect sense since both are designed near the bandwidth limit of DP1.2; a quick check of that figure follows below.)
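The "roughly 7%" is just the ratio of the two totals, which you can verify the same way (again a quick Python check, nothing official):

```python
# How much more pixel throughput screen 2 needs compared to screen 1.
ratio = 530_841_600 / 495_360_000
print(f"{(ratio - 1) * 100:.1f}% more")  # 7.2% more
```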
And again, as mentioned above, 3440x1440@100Hz is probably still a bit harder to drive due to the extra CPU load caused by larger scenes, despite the lower pixel count.
If someone still does not understand, it is beyond my ability to help them, sorry.