That would only work on monitors that support Adaptive-Sync, which at this point is basically just the FreeSync monitors announced at CES. It might be possible on other monitors, but not without a firmware update on the monitor side too.
Not all. I'll just leave a word from an AMD rep here; he can explain it better than I can. But yeah, you're right: monitor vendors aren't going to be motivated to provide free firmware updates when they could sell new monitors instead: http://linustechtips.com/main/topic/300718-amd-caught-on-lying-as-well-falsely-presenting-a-working-variable-refresh-rate-monitor/?p=4090681
G-SYNC modules aren't just DRM, because they actually do stuff. If you don't have a G-SYNC module, then you need some other way of supporting variable refresh. Laptops can have support through eDP, which has a standard for variable refresh, but desktop monitors don't use eDP. DisplayPort 1.2a can optionally support it, and NVIDIA could release software to support variable refresh via Adaptive-Sync on supporting monitors (which is how I suspect mobile G-SYNC works), but it...
From what I've gathered so far, it's possible the guy somehow created software for variable refresh over VESA Adaptive-Sync, the same way FreeSync works. In theory, all you'd need is software support on both sides (display and GPU) and a supporting connection (eDP/DP 1.2a).
I've tried this G-SYNC hack personally with my 780 Ti and three different DisplayPort monitors, a Dell U2414H and U2415 (both DP 1.2a, although without the optional Adaptive-Sync...
4:8 would mean one dimension is double the other; for every 4 pixels in one direction, there would be 8 pixels in the other. That's not the case here, since 5760 is clearly much more than double 1200.
1920x1200 is 16:10, so three of those side by side would have triple the width but the same height: 5760x1200. That gives 48:10, which reduces to 24:5.
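For anyone who wants to double-check the arithmetic, here's a quick sketch in Python (the `aspect_ratio` helper is just something I made up for illustration):

```python
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    """Reduce width:height to lowest terms."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

# One 16:10 monitor
print(aspect_ratio(1920, 1200))      # 8:5, i.e. 16:10
# Three side by side: triple the width, same height
print(aspect_ratio(3 * 1920, 1200))  # 24:5, i.e. 48:10
```

Same result: 5760x1200 reduces to 24:5, not anything close to 4:8.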
When you hook up a PC to a TV via HDMI, the TV will usually overscan (the edges of the picture go off the screen) unless you switch it to "PC mode", 1:1 pixel mapping, or something like that. This is just down to differences in how PCs talk to monitors versus how home theater devices talk to TVs. So when you use HDMI, AMD's software applies underscan by default to compensate for TVs overscanning, even if it's a monitor on the other end.