Overclock.net › Forums › Components › Monitors and Displays › ACER Predator X34 Owners Club

ACER Predator X34 Owners Club - Page 38

post #371 of 3055
Quote:
Originally Posted by Mikey-

Mountainlifter,

This is going to surprise you - and I honestly don't understand it myself.

But the pendulum demo scan lines results are NOT the same as the results I'm seeing in-game. I'm not joking. You have to try my test... I can't emphasize this enough... with a real game, not the Pendulum demo.

If you have Watch Dogs, then try that. If you have Fallout 4, then try that. Something about that Pendulum demo is off - it's producing scanlines when the overclock is disabled, whereas in actual games when you disable the overclock the lines go away at any frequency. That's what I'm seeing anyhow.

I know... it makes zero sense.

I fully believe your observations, and it does not surprise me at all. If different games really do produce different results, this could point to a driver issue, though that doesn't seem to be the case at the moment (people have tried different driver sets). It could also be that the different games have different color palettes that either expose or obscure the scanlines.

I'm sure you'll agree that in order to conduct tests and study behaviour, we need one game or demo in which results can be reproduced consistently. Hence, using the pendulum demo makes sense as a worst-case test platform.

All of the above does not invalidate the fact that you might see the scanlines if you conduct this experiment in the pendulum demo: with the refresh set to 60Hz in NVCP (and the monitor OC off), simulate 40 fps in the demo and quickly switch between G-sync and V-sync OFF (moving between 40Hz and 60Hz respectively) while looking at a patch on the left side of the monitor. If you see dark grey lines come and go (or intensify in one setting relative to the other), then I'd say there are scanlines at 40Hz.

As always my only intention is to study the behaviour of this panel. Maybe it will help narrow down the cause and get a solution to our problem sooner.

EDIT: I suspect only the color grey or close to that color exposes the lines.
Edited by Mountainlifter - 11/14/15 at 7:45pm
post #372 of 3055
Quote:
Originally Posted by Mountainlifter

I fully believe your observations, and it does not surprise me at all. If different games really do produce different results, this could point to a driver issue, though that doesn't seem to be the case at the moment (people have tried different driver sets). It could also be that the different games have different color palettes that either expose or obscure the scanlines.

I'm sure you'll agree that in order to conduct tests and study behaviour, we need one game or demo in which results can be reproduced consistently. Hence, using the pendulum demo makes sense as a worst-case test platform.

All of the above does not invalidate the fact that you might see the scanlines if you conduct this experiment in the pendulum demo: with the refresh set to 60Hz in NVCP (and the monitor OC off), simulate 40 fps in the demo and quickly switch between G-sync and V-sync OFF (moving between 40Hz and 60Hz respectively) while looking at a patch on the left side of the monitor. If you see dark grey lines come and go (or intensify in one setting relative to the other), then I'd say there are scanlines at 40Hz.

As always my only intention is to study the behaviour of this panel. Maybe it will help narrow down the cause and get a solution to our problem sooner.

EDIT: I suspect only the color grey or close to that color exposes the lines.

The one test we need to try is the one that can't be tried.

You believe this is an issue related to the frequency of the monitor - not Gsync, or the overclock - but the frequency itself. You also believe that the lower the frequency, the more prominent the scan lines.

Okay, if your theory is correct then what we really need to see is this monitor operating at 30Hz without Gsync or the overclock enabled. According to your theory, this panel, at 30Hz, with the overclock disabled, and with Gsync disabled (in other words, just a regular 60Hz panel, downclocked to 30Hz) would produce terrible scanlines.

There's no way to test that. We can't take this monitor down past 50Hz. Acer's own X34 driver is preventing us from doing that.

We can only 'downclock' to 30Hz with Gsync enabled. Clearly, in order to prove your theory, we need to test this monitor straight up at 30Hz without Gsync enabled.

Okay, you know what... I've got a bad feeling about this, because I now believe you're correct, and there's no way to fix it.

This is incredibly sobering.

EDIT:

Am I correct that if you set your monitor to 60Hz - fixed frequency - and then set Vsync as Adaptive (half the refresh rate), the monitor still refreshes at 60Hz even though your frame rate gets sliced in half?
Edited by Mikey- - 11/14/15 at 9:28pm
post #373 of 3055
Quote:
Originally Posted by Mikey-

The one test we need to try is the one that can't be tried.

You believe this is an issue related to the frequency of the monitor - not Gsync, or the overclock - but the frequency itself. You also believe that the lower the frequency, the more prominent the scan lines.

Am I correct that if you set your monitor to 60Hz - fixed frequency - and then set Vsync as Adaptive (half the refresh rate), the monitor still refreshes at 60Hz even though your frame rate gets sliced in half?

Actually, I'm not sure what to believe. I am just throwing out a hypothesis and seeing if it fits the data.

There are three components to this puzzle: the panel (and its row/column drivers, TCON IC, etc.), the G-sync scaler board, and the overclocking (of the panel through the G-sync scaler).

Even if we "disable G-sync" in NVCP, it is still the same G-sync scaler module doing the normal work. So, I just leave G-sync enabled in NVCP, and when I say "40Hz+40fps" it means I'm forcing the pendulum demo to run at 40 fps with G-sync ON. But don't go to 30Hz+30fps, because G-sync behaves differently at 30 fps and below.

The issue with blaming the OC from the OSD is that it goes through the G-sync scaler to the panel. And we already know one of the two is not working properly.

The best we can do is come up with a model that explains the behavior of all panels with scanlines and then figure out the best root cause. If my model of the panel behaviour is true (ie low refresh rates in a range), then it points to the panel being incorrectly tuned.

If it is TCON related (i.e. panel related), it can be fixed fairly easily by Acer replacing the TCON board on the back of the panel. TCON ICs are attached to the back of the panel and should be separate from the G-sync scaler. Most likely, the TCONs are poorly tuned; most of the recent 1440p panels from the last year have had this issue. It should take them just a few hours to replace the TCON ICs. That is assuming we can eliminate the G-sync scaler as the root cause.

It could also be related to the panel in another way. This panel uses FRC, aka temporal dithering, which in combination with the panel's inversion scheme could result in artifacts. But if this were true, then all panels should have scanlines, yet many owners clearly report perfect panels.

If it is the G-sync scaler, it could be something like how the pixel clocks are being managed by Nvidia. This could happen even with "G-sync" disabled; disabling it only turns off the VRR, and the signal still goes through the same scaler.

For reading: http://120hz.net/showthread.php?1628-Disadvantages-to-using-non-auto-timings-for-monitor-overclocking&s=8fe7228d2c8a75d983bf9590011066f6 See post by ToastyX.

"NVIDIA swaps out the display’s scaler for a G-Sync board, leaving the panel and timing controller (TCON) untouched." http://www.anandtech.com/show/7582/nvidia-gsync-review

(If I've made a mistake, people with good knowledge of panels, please correct me.)

EDIT: On the day I got the monitor, I tested it with HDMI input too, which should not support G-sync. Yet I still saw scanlines in the pendulum demo. That is another thing to consider, and others can verify whether they see the same. More and more, I am beginning to think this is a TCON thing, same as the Swift from a year ago. But then, the Freesync version is not seeing this issue, which it should if it were a problem with the panel. So the cause of this problem is tough to nail down.
Edited by Mountainlifter - 11/14/15 at 10:50pm
post #374 of 3055
Mountainlifter,

I've discovered something interesting.

In Fallout 4 I used RTSS to limit my frames to 31. I then disabled my overclock, but left Gsync on (which is how I've happily been playing Fallout 4 - Gsync enabled, overclock off). Even with a disabled overclock, at 31Hz, I saw prominent scanlines. This surprised me, since I was certain that disabling the overclock had gotten rid of the scanlines.

I immediately thought... oh damn, so I was wrong... it doesn't matter if the overclock is on or off.

Actually, it does matter.

I immediately overclocked to 100Hz, and went back into Fallout 4, disabling the RTSS frame limiter. When I loaded up Fallout 4 my frame rate happened to be low - 45 FPS. And guess what. There were prominent scanlines. I then immediately disabled the overclock, thinking, "No, no, no... I played the damned game all last night without seeing these bloody lines, and now here they are again... this can't be right." Happily, when the monitor lit up after the reboot, the scanlines had vanished.

Conclusion: Disabling the overclock means you can display at lower frequencies before the scanlines become an issue. If you use the overclock, on the other hand, then you should expect to see the scanline issue 'earlier', if you understand my meaning. In other words, you can't completely get rid of the scanlines, but you can 'stave them off'.

Essentially, what this means is we're both right. If your game regularly dips down to 45 FPS, then IT IS beneficial to disable the overclock.

EDIT: And, again, I'm okay with this because I won't be playing ANY games at 30 FPS. I will at 45 and above though, so disabling that overclock for troublesome games is going to be vital (if image quality is your thing).
Edited by Mikey- - 11/14/15 at 11:22pm
post #375 of 3055
Quote:
Originally Posted by Mikey-

Mountainlifter,

I've discovered something interesting.

In Fallout 4 I used RTSS to limit my frames to 31. I then disabled my overclock, but left Gsync on (which is how I've happily been playing Fallout 4 - Gsync enabled, overclock off). Even with a disabled overclock, at 31Hz, I saw prominent scanlines. This surprised me, since I was certain that disabling the overclock had gotten rid of the scanlines.

I immediately thought... oh damn, so I was wrong... it doesn't matter if the overclock is on or off.

Actually, it does matter.

I immediately overclocked to 100Hz, and went back into Fallout 4, disabling the RTSS frame limiter. When I loaded up Fallout 4 my frame rate happened to be low - 45 FPS. And guess what. There were prominent scanlines. I then immediately disabled the overclock, thinking, "No, no, no... I played the damned game all last night without seeing these bloody lines, and now here they are again... this can't be right." Happily, when the monitor lit up after the reboot, the scanlines had vanished.

Conclusion: Disabling the overclock means you can display at lower frequencies before the scanlines become an issue. If you use the overclock, on the other hand, then you should expect to see the scanline issue 'earlier', if you understand my meaning. In other words, you can't completely get rid of the scanlines, but you can 'stave them off'.

Essentially, what this means is we're both right. If your game regularly dips down to 45 FPS, then IT IS beneficial to disable the overclock.

Agree with what you have written.

You have re-worded exactly what I was trying to explain in this post using the term "lower end of a range of frequencies" http://www.overclock.net/t/1573121/acer-predator-x34-displays-show-your-images-or-experience-here/360#post_24608674

OC off doesn't mean scanlines vanish. It only means that the scanlines appear at a lower frequency than when using OC ON. So, if a game runs at 45-75 fps, better to have the OC off and have a ceiling at 60. If the game runs at 75 fps or greater, better to keep the OC on.
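The rule above can be pictured as a tiny Python sketch. This is purely illustrative: the list of refresh options and the idea of keying off your typical in-game fps are my assumptions, not anything official from Acer or Nvidia.

```python
# Rough sketch of the rule above: pick the OSD refresh setting (60 = OC off)
# whose ceiling sits just above the fps you actually get, so that in play
# you spend most time near the TOP of the frequency range, where the
# scanlines are faint or invisible. The option list and the "typical fps"
# heuristic are assumptions for illustration, not an official tool.

def pick_osd_refresh(typical_fps, osd_options=(60, 75, 85, 95, 100)):
    """Return the lowest OSD refresh ceiling that covers the typical fps."""
    for ceiling in sorted(osd_options):
        if typical_fps <= ceiling:
            return ceiling
    return max(osd_options)  # fps exceeds every option: use the full OC
```

On this sketch, a game that holds ~100 fps maps to the 100Hz OC, one that averages ~75 maps to 75Hz, and one that spends its time in the 40s-50s maps to OC off (60Hz), which matches the per-game examples discussed in the thread.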
post #376 of 3055
Quote:
Originally Posted by Mountainlifter

Agree with what you have written.

You have re-worded exactly what I was trying to explain in this post using the term "lower end of a range of frequencies" http://www.overclock.net/t/1573121/acer-predator-x34-displays-show-your-images-or-experience-here/360#post_24608674

OC off doesn't mean scanlines vanish. It only means that the scanlines appear at a lower frequency than when using OC ON. So, if a game runs at 45-75 fps, better to have the OC off and have a ceiling at 60. If the game runs at 75 fps or greater, better to keep the OC on.

This is very tricky to explain. Thanks for your input. You've really helped me to understand how my display works.
Edited by Mikey- - 11/14/15 at 11:32pm
post #377 of 3055
Dear all with scanlines (part 1),

Instead of supplying a wall of text, I decided to spend some time making a table with experimental results that describe the intensity of the scanline artifacts at various Refresh Rates (set in OSD + NVCP) in correlation to Gsync Fps+Hz.


direct link: http://i.imgur.com/a1qeTBm.png

In the columns, you see the Refresh Rate that is set both in the OSD OC menu and in the NVCP (This is important. Both must be the same.)

In the rows, you see the pendulum demo's forced FPS using its supplied sliders. I leave G-sync enabled in NVCP at all times and so, X fps also drives the monitor at X Hz. Hence, "XHz+Xfps".

How were the tests conducted?:
I go column by column in the table. I set the Refresh Rate in the OSD (say 85Hz), reboot the monitor, and set the same Refresh Rate in NVCP. Then I run the pendulum demo at 85 fps and check for scanlines. I move down by 5 fps/Hz using the demo's fps sliders and check again. Each time, I switch between G-sync ON and V-sync OFF to make sure my eyes are not seeing things. Each column was repeated at least twice - once from the top and a second time from the bottom. Each entry in the table is the result of a real test; no interpolation or guesswork was done. The only places where error could have crept in are the transitions from FV to V and from V to PV. I have much greater confidence in the transition from NV to FV. I also made sure to sit at the same distance from the monitor during each test run.
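For anyone repeating this, the sweep can be sketched as a simple generator. The settings list and the 35 fps floor mirror the table; the code itself is just my illustration of the procedure, not a tool from the thread.

```python
# Sketch of the sweep described above: for each refresh rate set in the
# OSD + NVCP (a table column), step the pendulum demo down in 5 fps
# increments (the rows) and eyeball the scanlines at each stop. The
# sweep stops at 35 fps because G-sync behaves differently at 30 and below.

def sweep(osd_settings=(100, 95, 85, 75, 60), step=5, floor=35):
    """Yield (osd_hz, demo_fps) test points, top-down within each column."""
    for osd_hz in osd_settings:
        fps = osd_hz
        while fps >= floor:
            yield osd_hz, fps
            fps -= step
```

For example, the 60Hz (OC off) column alone produces six test points: 60, 55, 50, 45, 40 and 35 fps.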

Discussion:
These are the salient points from the table
  • The most important point to notice is that the scanlines are not visible close to the top of a given frequency range. That is, in each column, the first two entries have no scanlines (or the lines are of such low intensity that my eyes can't see them). Therefore, running close to the top of a given frequency range should show no scanlines, and vice versa for the bottom end of the range. The "frequency range" is defined by what is set in the OSD and not by what is set in the NVCP. Note that I cannot use the term "VRR window" in place of "frequency range": for instance, the frequency range of column B is 31-100Hz, while the G-sync VRR window is technically 1-100Hz. Users have reported scanlines even at the max frequency of a given frequency range. I can only explain this by saying that on their panels, the transition zones may have moved so far up that they have no green boxes at all. On the other hand, some users have the transition zones moved so far down that they only have green boxes, and so they report no scanlines at all. (lucky bastards!)
  • The second point to note is that while B7 and G7 (taking an example) are running under the exact same conditions, one has faint scanlines while the other has none at all. The only difference between B7 and G7 is the OSD OC Refresh Rate setting. So, to my mind, this implies that the Refresh Rate OC setting in the OSD is read in by the G-sync scaler, some calculations are done, and parameters are set to allow VRR operation in that range. If so, then there is hope of fixing this artifact via a firmware or driver fix. Essentially, what I am saying is that the scanlines seem artificially introduced based on the OSD OC setting and do not seem to be caused by some inherent panel problem. Who knows, it could be a combination of both.
  • It is strange that 35Hz+35fps registered as V and not PV. But switching quickly between 35 and 40, I could only conclude that 40 was slightly more intense. Also, I did not go to 30Hz+30fps because G-sync works differently at 30 fps and below. Maybe 35 is also affected by the background G-sync calculations in this monitor.
  • If you set the OSD OC to 100Hz and use 75Hz in NVCP, you get the same scanline behaviour as setting 100Hz in both OSD and NVCP, i.e. column B. (Only what you set in the OSD governs the scanline behaviour.) It is also for this reason that I don't have a 50Hz column in the table: I can only set 50Hz in NVCP and cannot set 50Hz in the OSD. If you set 100Hz in the OC OSD and 50Hz in NVCP, you can plainly see scanlines on the Windows desktop, which corroborates the scanline behaviour described in column B.

Anyways, I think this model describes the behaviour of the scanlines in the Acer X34: It appears with increasing intensity towards the lower end of a frequency range (31Hz to XHz) whose max value X is governed only by what is set in the OSD OC menu (OC off is also a setting with X=60Hz). G-sync VRR operation itself has no bearing on scanlines.
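As a sanity check, the model can be written down as a toy function. Be warned that the linear falloff is entirely my assumption; the tests above only establish the trend (more intense toward the bottom of the range), not its exact shape.

```python
def scanline_intensity(current_hz, osd_max_hz, range_min_hz=31):
    """Toy model of the claim above: 0.0 near the top of the frequency
    range (no visible lines), rising toward 1.0 near the bottom. The
    linear shape is an assumption; only the trend comes from the tests."""
    if not range_min_hz <= current_hz <= osd_max_hz:
        raise ValueError("outside the modelled frequency range")
    return (osd_max_hz - current_hz) / (osd_max_hz - range_min_hz)
```

It reproduces the B7-vs-G7 style observation: the same refresh rate scores as more visible under a 100Hz OSD ceiling than under a 75Hz one, because only the OSD setting defines the range.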

PS: I'm attaching the excel sheet I used in case you want to conduct these tests yourself.
ACER_X34_SCANLINES_BEHAVIOUR.xlsx 49k .xlsx file

PS2: thanks Mikey and Smokey the Bear

If there are modifications needed to this model, let me know. I'll wait for some replies and then post this in the ACER forums for greater visibility there.
Edited by Mountainlifter - 11/16/15 at 9:41pm
post #378 of 3055
In other words, it's not enough for us to overclock our X34 to 100Hz and just leave it there. We have to overclock on a game by game basis.

3 examples:

1) In Bioshock Infinite, I can get 100 FPS easily, so I should overclock to 100Hz.
2) In The Witcher 3, I average around 75 FPS, with large dips, so I should overclock to 75Hz, or even 70Hz if the IQ appears muddy.
3) In Fallout 4 my performance is terrible, dropping to the low 40's in big cities, in which case I should disable the overclock completely, and accept 60Hz as my ceiling.

I hate to say this, Mountainlifter, but I think you're going to see two different responses to this chart. One group will say they're not seeing any scanlines at all. And the other will say they're seeing scanlines even at 100 FPS at 100Hz.

The chart seems pretty reasonable to me. I'll take a closer look tomorrow when I wake up. Honestly, with this information, and a fairly decent GPU setup, scanlines can and should be avoided. I think that some people were too quick to send back their monitors. The upcoming Asus monitor, when user-overclocked to 100Hz, is likely going to exhibit the same problems, though I suspect people won't see that, simply because Asus is aiming for a less aggressive overclock.

Me, I'll take the 100Hz monitor over the 75Hz version any day, especially now that I know how to 'control' the scanline issue.


Also, what the hell is happening with G13? Were you smoking something? Or is that accurate?
Edited by Mikey- - 11/15/15 at 3:25am
post #379 of 3055
Quote:
Originally Posted by Mikey-

In other words, it's not enough for us to overclock our X34 to 100Hz and just leave it there. We have to overclock on a game by game basis.

3 examples:

1) In Bioshock Infinite, I can get 100 FPS easily, so I should overclock to 100Hz.
2) In The Witcher 3, I average around 75 FPS, with large dips, so I should overclock to 75Hz, or even 70Hz if the IQ appears muddy.
3) In Fallout 4 my performance is terrible, dropping to the low 40's in big cities, in which case I should disable the overclock completely, and accept 60Hz as my ceiling.

Makes good sense.
Quote:
Originally Posted by Mikey-


I hate to say this, Mountainlifter, but I think you're going to see two different responses to this chart. One group will say they're not seeing any scanlines at all. And the other will say they're seeing scanlines even at 100 FPS at 100Hz.

I tried to address this in the first point under "discussion".

Good night.

EDIT: G13 is a transition box. Could have been V or PV. I decided to use PV at that point. I didn't do the columns in order so that the frequencies I was studying were not close together. I started with column G in fact.

So, the transition points could be one box up or down except for the transitions between NV and FV. Those I am very confident about.
Edited by Mountainlifter - 11/15/15 at 8:32am
post #380 of 3055
Regarding these scan lines, I honestly have a hard time trying to spot them in the pendulum demo... whatever the settings - V-sync, G-sync or not, OC or not, whatever fps - I can't really see anything. Maybe at low framerates, around 40 fps, I might see some faint horizontal lines by sticking my nose to the screen, but so faint that I wouldn't have spotted anything if I hadn't been looking for them...

So either I'm very lucky in the monitor lottery, or we have some other factors to consider.

Can I give my two cents' worth? Could we be having some interference from the mains supply? Out of curiosity, it would be interesting to know what mains frequency people with scanlines are on, and whether they are using a UPS and what type. We have a 50Hz / 60Hz difference between Europe and the USA, and as you know, some UPSes deliver a square wave, which computer PSUs don't really care about, but which could create harmonics and hence interference with the monitor...? Maybe some expert would like to kick in here.

My current input:
- 230 V at 50 Hz (France)
- Line-interactive UPS with pure sine wave (not square wave)

I know the power brick is supposed to provide DC to the monitor; however, it might be affected by what's coming in... not so easy for me to test though, as I don't have access to another UPS or mains frequency...