Originally Posted by Defoler
When you post a chart that is this inaccurate, you lose your point. So lost that it went right out the window, crashed on the pavement, and oozed into the sewers. Never to be seen again.
That is funny, because the "freesync" implementation is... exactly like G-Sync. Except instead of AMD making the hardware and software part, they make the monitor manufacturers do it.
The FreeSync implementation is NOT "exactly like g-sync". You obviously know nothing about G-Sync, so let me educate people on one key difference. G-Sync, because it is dedicated hardware, has onboard memory and the ability to do frame multiplication. When the FPS dips below the monitor's low-end range, the module can send the same frame out again, doubling (or more) the number of refreshes and extending the VRR range below what the panel itself supports. That is why, unlike the LIES that AMD is telling, G-Sync has a range of 2-240 Hz, not the 30-144 Hz that AMD claims (oh, and since there is now a 165 Hz G-Sync monitor, I guess that proves AMD was a liar as well). You can't do that with FreeSync, and no, monitor companies don't build a frame buffer into their monitors. So as you can see, FreeSync is NOT "exactly like g-sync".
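To make that frame-multiplication trick concrete, here is a rough sketch of the idea in Python. This is only an illustration of the arithmetic as I understand it, not NVIDIA's actual firmware logic; the 30-144 Hz window and the `refresh_plan` name are my own assumptions.

```python
import math

# Illustrative sketch of frame multiplication at low frame rates.
# Assumed panel VRR window (hypothetical numbers, not from any spec):
MONITOR_MIN_HZ = 30.0
MONITOR_MAX_HZ = 144.0

def refresh_plan(fps):
    """Return (repeats per frame, effective panel refresh rate).

    If the game's frame rate falls below the panel's minimum VRR
    rate, the buffered frame is scanned out again enough times to
    keep the panel driven inside its supported window.
    """
    if fps >= MONITOR_MIN_HZ:
        return 1, min(fps, MONITOR_MAX_HZ)
    repeats = math.ceil(MONITOR_MIN_HZ / fps)
    return repeats, fps * repeats

for fps in (120, 45, 25, 10, 2):
    n, hz = refresh_plan(fps)
    print(f"{fps:>5.0f} FPS -> each frame shown {n}x, panel runs at {hz:.0f} Hz")
```

At 2 FPS the same frame gets scanned out 15 times and the panel stays at 30 Hz, which is how a "30 Hz minimum" panel can behave like a 2 Hz one. And that resend is only possible because the module holds its own copy of the frame, which is the whole point about the memory above.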
nVidia looked at the eDP standard and elected NOT to go that route, because it was a worse solution and would take a lot longer to get implemented anyway (because of standards bodies... they never move fast, and as everyone sees, it is now only an OPTIONAL standard). The eDP implementation was not designed for gaming; it was for power savings. It COULD be modified into something else for gaming (and was, by both AMD and nVidia... aka 'Mobile G-Sync'), but G-Sync is superior to FreeSync on the low end because of the hardware. The low end is where things are important anyway. Once you are getting 120+ FPS, it really isn't going to matter much whether it's on or off, but at 25 or 30 FPS, yeah, it REALLY makes a difference. Trust me, I know; I have two G-Sync monitors and use them every day, and have for a while. If you are like me and like turning up your settings at 1440p (and very soon 4K), then the low end is what you need. If you don't care about looks and run a 1080p TN panel, then it isn't so much.
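Some quick napkin math on why the low end is where this pays off. Assume plain double-buffered vsync on a fixed 60 Hz panel (my simplified model, not any vendor's actual pipeline): every frame has to be held on screen for a whole number of refreshes, so the delivered rate snaps down hard at low FPS, while VRR just shows each frame the moment it is ready.

```python
import math

# Simplified model: double-buffered vsync on a fixed 60 Hz panel.
# Each rendered frame is held for a whole number of refreshes, so
# the delivered rate quantizes to 60 / ceil(60 / fps). VRR has no
# such quantization (down to the panel's VRR floor).
REFRESH_HZ = 60

for fps in (25, 30, 45, 59, 120):
    held = math.ceil(REFRESH_HZ / fps)    # refreshes each frame is held
    vsync_fps = REFRESH_HZ / held         # what fixed-refresh vsync delivers
    vrr_fps = min(fps, REFRESH_HZ)        # what VRR delivers
    print(f"render {fps:>3} FPS -> vsync: {vsync_fps:4.1f} FPS, VRR: {vrr_fps} FPS")
```

At 120 FPS both columns read 60 and the tech is doing nothing for you; at 25-59 FPS vsync throws away a third to half of your frames, and that gap is exactly the judder VRR removes.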
That's why, when nVidia released G-Sync, AMD got caught flat-footed and had to scramble to counter it. The only way they could was to use what nVidia threw away: the eDP solution. That is why, when AMD first showed it, it was a cobbled-together laptop screen with lots of issues. Even then, it took another full year to get it somewhat working on the desktop, and even then there were issues. Entire "released" monitors had to be sent back after being certified because of overdrive issues. It is only now, about two years later, that FreeSync is STARTING to take off. And if it does, nVidia will support it... as their LOW-END solution. But they will continue to offer G-Sync for those who want a superior solution that does what A-Sync/FreeSync can't. Oh, and speaking of which, let's not forget that A-Sync (Adaptive-Sync) and FreeSync are NOT the same. FreeSync is a PROPRIETARY version of A-Sync. Intel is adopting A-Sync, not FreeSync. Odds are that when nVidia has to, they too will be like Intel and adopt A-Sync.
People love to say that AMD is all about "Open Standards", and yes, Adaptive-Sync is an OPTIONAL open standard, which is basically a formalization of something that already existed in the eDP standard before AMD even touched it. But the fact of the matter is, FreeSync is NOT an open standard. It is a proprietary extension of Adaptive-Sync.
Originally Posted by Defoler
And just to put the nail in the coffin, the Ashes alpha benchmark shows the 980 Ti and the Fury X almost neck and neck, before the 980 Ti even supported async calls. I wonder what will happen when they do...