Posts by TranquilTempest

They're not doing work for nothing. They're trying to stay relevant. Getting the monitor manufacturers to implement it is the hard part.
A-sync and g-sync should probably be considered separate markets for the purpose of competition. A-sync monitors could be a hundred dollars more expensive than g-sync monitors and still sell, or vice versa, because that's still less expensive than buying a new graphics card too. AMD and Nvidia don't set prices; that's up to the monitor manufacturers, and prices will be set by availability and demand.
Well, I have a list of questions, but I've been sitting in an account activation queue for a day now. If someone wants to post my questions:
I'm more inclined to believe management underestimated the problem and made impossible schedule/budget demands of their engineers. Say you're presented with a problem that will take a year of cooperation with display manufacturers to solve, and you're told you have one month to make a demo for CES, what do you do?
There are two possibilities: either the demos can't do variable refresh, or AMD twice failed to make a demo that proves their capability. So of those two possibilities, which is more likely?
That's not right. Regardless of whether the display is g-sync or freesync, it needs a TCON to drive the panel; sometimes it's on a separate board, sometimes it's on the same board as the scaler, and sometimes there is no scaler at all. No scaler means there's only one supported resolution.

That's a counterexample, and it's not a question of shutting off the screen; it's a question of time. A few milliseconds to switch refresh rate is too much for g-sync/freesync.
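To put a number on "a few milliseconds is too much," here's a quick back-of-the-envelope sketch (my own arithmetic, not from the post): a mode switch that takes a few milliseconds eats a large fraction of the per-frame budget at high refresh rates, which is why per-frame rate changes can't go through a full mode switch.

```python
def frame_time_ms(refresh_hz):
    """Time available to scan out one frame at a given refresh rate, in ms."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz} Hz -> {frame_time_ms(hz):.2f} ms per frame")

# A hypothetical 3 ms switch cost would consume over 40% of the 144 Hz
# frame budget every time the refresh rate changed.
switch_cost_ms = 3.0
print(f"switch cost as fraction of a 144 Hz frame: "
      f"{switch_cost_ms / frame_time_ms(144):.0%}")
```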
Where was this latency data from, and what was the methodology? The only latency tests I've seen regarding g-sync were done with a modified mouse and a high speed camera, which isn't synchronized to frame completion, and therefore won't show that kind of step, if it exists.
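For context on why that methodology matters, here's a toy sketch (my own illustration, not the actual test rig) of how the modified-mouse-plus-camera method computes latency. The resolution is limited by the camera frame rate, and because the camera isn't synchronized to frame completion, a latency step smaller than the display's frame interval averages out across samples instead of showing up.

```python
def click_to_photon_ms(click_frame, photon_frame, camera_fps):
    """Latency from mouse click to on-screen change, measured by counting
    high-speed camera frames between the click LED and the pixel update."""
    return (photon_frame - click_frame) / camera_fps * 1000.0

# Hypothetical numbers: a 1000 fps camera gives 1 ms resolution per frame.
print(click_to_photon_ms(120, 165, 1000))  # 45.0
```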
The buffer that holds the previous frame is also used for PSR.
There should be negligible latency for both implementations in normal use; however, the fallback cases at minimum and maximum refresh rate will differ. G-sync's fallbacks are panel self refresh at the minimum refresh rate and double-buffered v-sync at the maximum refresh rate. There's no information on how freesync handles these cases, but freesync has more GPU-side problems to solve because it's not using panel self refresh. There's little, if any, room for freesync to...
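The fallback cases described above can be sketched as a toy model (my own illustration, not AMD or Nvidia code; the 30-144 Hz window is an assumed example panel):

```python
PANEL_MIN_HZ, PANEL_MAX_HZ = 30.0, 144.0
MIN_INTERVAL_MS = 1000.0 / PANEL_MAX_HZ   # ~6.94 ms, fastest the panel scans
MAX_INTERVAL_MS = 1000.0 / PANEL_MIN_HZ   # ~33.3 ms, longest it can hold

def present(frame_interval_ms):
    """How a g-sync-style module might handle a given GPU frame interval."""
    if frame_interval_ms > MAX_INTERVAL_MS:
        # GPU too slow: panel self refresh repeats the stored previous frame
        return "self-refresh (repeat last frame)"
    if frame_interval_ms < MIN_INTERVAL_MS:
        # GPU too fast: fall back to double-buffered v-sync at max refresh
        return "v-sync wait (cap at max refresh)"
    # Inside the variable refresh window: scan out as soon as the frame lands
    return "scan out immediately"

print(present(10.0))  # inside the window
print(present(50.0))  # below 30 Hz
print(present(5.0))   # above 144 Hz
```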
Maybe if you found a motherboard with x16 slots in the second and fourth rows, you could put the sound card in slot 1 and use a 5-slot case. That would drastically limit your case and mobo options, though, if you even HAVE any mobo options.