Originally Posted by Blameless
I didn't really see anything biased in these reviews. The displays tested do indeed appear to be disabling overdrive when FreeSync is enabled. This is going to be relevant for some and is an issue that should not be glossed over.
Quite possibly not an AMD issue.
No, it's definitely an issue that AMD has saddled itself with.
Putting the FreeSync logo on the box implies that the display has been vetted, and that someone has sat in front of the panel for a significant amount of time to make sure it works. Granted, AMD might not be putting as much work into validation as Nvidia does with its partners, but the DP1.2a+ specification does include caveats saying that display vendors need to include enough memory in the scaler to allow some complex optimisations to occur. As soon as I find the paper about it, I'll post the relevant points here. What it points to is that AMD knew about these problems beforehand; the first generation of FreeSync panels may have to have their behaviour fixed in drivers, with the second generation fixing the lingering issues from the launch models.
Originally Posted by FreeElectron
- Open source inspected software should be used in this comparison to ensure that the software does not cause any issues with competitor's products and to ensure that the software will evenly work with both manufacturers products.
- It did not logically make sense: while we can see that Nvidia doubles/multiplies the refresh rate, we can also see that AMD is using the minimum acceptable refresh rate (acceptable/good according to their own claim at the beginning of the video, when they said that FreeSync was working well within the refresh rate window). So the refresh rate and the whole measurement exercise is useless, because the only thing that matters (up to the point I watched) in this review is their own observation, not the measurement, since AMD did not fall below the monitor's minimum refresh rate.
- G-Sync and ULMB do not work at the same time on the latest (and best) G-Sync gaming monitor released yet.
- I have also read somewhere that G-Sync and ULMB can't work together (generally), but I can't find the source right now.
- Nvidia's claims are not to be trusted, especially after the recent 3.5GB VRAM issue "miscommunication".
- So this point did not make sense either, because even if AMD can't have FreeSync and blur reduction working at the same time, neither can Nvidia.
I would appreciate any real effort that will help further educate me.
1) This is a ridiculous assertion: why would Nvidia spend time making a demo that deliberately cripples their competitor's product? They already have the majority market share, and they already have the better VRR solution. If they were doing something like this, it would be revealed soon enough through tools like FCAT, FRAPS, and high-speed video, because it is just a simple demo of a swinging pendulum - there is nothing technically taxing about it. The pendulum demo was also available to download a while ago, IIRC, and no one noticed any glaring issues with AMD cards back then. Do you complain just as much about the use of FCAT in other scenarios?
2) You're trusting what your own eyes see in the video, but you clearly haven't seen this in action yourself. You can actually test it on your existing monitor. Run a game with FRAPS in the corner and set a framerate cap in-game (easily done with TF2 and the in-game console). If you're on a 60Hz monitor, drop the fps cap from 60 to 30 and then to 20fps. The smoothness clearly suffers with each drop, but the animation is more or less okay. If you then create a custom resolution with a 50Hz refresh rate and drop the fps cap to 25, the experience is more or less the same as running a 60Hz monitor with a 30fps cap in-game. That's basically what G-Sync is doing here: it doubles or triples the refresh rate to keep things looking more or less normal and to avoid judder, massive stutters, and tearing. Obviously, running a game at sub-24fps won't look good, but at least they're trying to preserve some fluidity.
My hunch is that the scalers inside the FreeSync monitors don't have enough memory to perform the same trick as the G-Sync module, which could be why overdrive isn't working and why the panel doesn't multiply the refresh rate to improve how the content is displayed. AMD may be able to fix that in drivers, but it will be several months before it is ironed out.
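To illustrate the doubling/tripling trick described above, here's a minimal sketch of how a low-framerate compensation scheme might pick a frame-repeat multiplier. The panel limits and function name are my own assumptions for illustration, not Nvidia's actual implementation:

```python
# Hypothetical sketch of G-Sync-style low-framerate compensation: when the
# game's framerate falls below the panel's minimum VRR rate, repeat each
# frame enough times that the effective refresh rate lands back inside the
# panel's supported range. Limits are illustrative assumptions.

def refresh_multiplier(fps, panel_min_hz=30, panel_max_hz=144):
    """Return how many times each frame should be shown."""
    if fps >= panel_min_hz:
        return 1  # inside the VRR window, show each frame once
    multiplier = 1
    # Keep doubling/tripling while we're still below the window floor
    # and the multiplied rate still fits under the panel's ceiling.
    while fps * multiplier < panel_min_hz and fps * (multiplier + 1) <= panel_max_hz:
        multiplier += 1
    return multiplier

for fps in (20, 25, 40):
    m = refresh_multiplier(fps)
    print(f"{fps}fps -> show each frame {m}x = {fps * m}Hz effective")
```

So a 20fps scene would be scanned out at an effective 40Hz and 25fps at 50Hz, which matches the 50Hz/25fps experiment above, while 40fps needs no repetition at all.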
3) This is partly physics-related and partly related to how the display refreshes itself. You can have the strobe match the refresh rate, but you'd have to implement frametime smoothing to take out any severe dips or spikes in framerate, because otherwise you'll run into situations where the display dims when it hits a low-fps stretch, then suddenly brightens when the framerate improves. Even with smoothing, your display would still have varying brightness levels throughout a gaming session, so it's not ideal. Additionally, you'll run into situations where the game engine hiccups and the strobe falls out of sync with the frames - the backlight lights up while the display is blank, and turns off while a frame is actually being drawn.
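The frametime smoothing mentioned above could be as simple as an exponential moving average, which dampens a sudden hitch so a strobe interval derived from it doesn't jump in brightness from one frame to the next. The smoothing factor and frametimes below are made-up example values, not anything from a real driver:

```python
# Sketch of frametime smoothing for strobing: an exponential moving
# average absorbs a single severe hitch instead of passing it straight
# through to the strobe timing (and thus to perceived brightness).

def smooth_frametimes(frametimes_ms, alpha=0.2):
    """Return an EMA-smoothed copy of a list of frametimes (ms)."""
    smoothed = []
    avg = frametimes_ms[0]
    for ft in frametimes_ms:
        avg = alpha * ft + (1 - alpha) * avg  # EMA update step
        smoothed.append(avg)
    return smoothed

raw = [16.7, 16.7, 33.4, 16.7, 16.7]  # steady 60fps with one severe hitch
print(smooth_frametimes(raw))
```

The 33.4ms spike gets pulled down toward the running average instead of instantly halving the strobe rate, which is exactly the trade-off: less brightness flicker, at the cost of the strobe briefly lagging the true frame cadence.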
4) As for blur reduction not working, I'll try to find the paper that discusses the specifications a display needs to support VRR under the DP1.2a+ specification. One of the requirements is having enough memory for the scaler to perform frame analysis and work out how much it should increase or retard the overdrive to reduce blurring. It is something that can be fixed, but probably only in the second wave of monitors supporting Adaptive-Sync.
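This is also why the scaler needs memory in the first place: overdrive is driven by per-pixel transitions, so the scaler has to hold the previous frame to compare against the new one. A toy illustration, with invented lookup-table values that stand in for a panel's tuned overdrive tables:

```python
# Toy illustration of per-transition overdrive: the scaler stores the
# previous frame, compares each pixel's old value to its new target, and
# looks up an over/undershoot level from a table. Values are invented
# for illustration, not from any real panel.

OVERDRIVE_LUT = {
    # (from_level, to_level): driven_level
    (0, 128): 160,   # overshoot a rising transition to speed it up
    (128, 0): -20,   # undershoot a falling transition (clamped below)
    (0, 255): 255,   # full swing needs no extra overshoot headroom
}

def drive_level(prev, target):
    """Pick the level actually driven to the pixel for this transition."""
    driven = OVERDRIVE_LUT.get((prev, target), target)
    return max(0, min(255, driven))  # clamp to the panel's valid range

print(drive_level(0, 128))   # overdriven rise
print(drive_level(128, 0))   # undershoot clamped to black
print(drive_level(64, 64))   # no transition, no overdrive
```

Without a stored previous frame there is no `prev` to look up, which fits the claim that first-generation Adaptive-Sync scalers short on memory simply disable overdrive in VRR mode.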
5) I'm pretty sure that hooking up an oscilloscope to a monitor, requesting custom software changes in a demo to point out differences in display technology, and taking the time to figure out when the scalers do their thing counts as real effort on PC Perspective's part. I don't see anyone complaining about Allyn pointing out that OCZ introduced something that looks like a hiccup when you write constantly to the Vector 180 - something that hasn't been caught in any other review of the drive that I've seen. These guys do go the extra mile when they have the equipment to do so.
Edited by CataclysmZA - 3/29/15 at 1:53am