
[144Hz] [BR] [G-sync] [FreeSync] Whole truth about 144Hz monitors and blur reduction/syncing technologies

#1 ·
A worldwide investigation into 144Hz monitors and blur reduction/syncing technologies.

First, about Blur Reduction (ULMB, LightBoost) and G-Sync/FreeSync.
BLUR REDUCTION (also known as ULMB, LightBoost) is useless for gaming. Absolutely.
I've tested a few types of situations where you need to notice an enemy while turning fast. For example, a fast left-right check (90° left + 180° right) done in under ~0.5s in CS:GO, and also some fast-turning situations in Reflex and CPMA.
And you know what? BLUR REDUCTION IS ABSOLUTELY USELESS AND RUINS MORE THAN IT HELPS. It's 100%, absolutely not for gaming (unless you are playing The Sims 4). In reality there is no blur reduction at all.
BR just doesn't work at high speeds.
Yes, if you are playing at a competitive level and turning fast, THERE IS BLUR at 144Hz with BR ON!
This technology only works when you are looking at slowly moving objects/pictures, like the one in this test: http://www.testufo.com/#height=-1&test=photo&photo=quebec.jpg&pps=960&pursuit=0
It moves at 960 pixels/sec. 960 PER SECOND!!! If your FOV is 106, for example, your 180° turn equals 1829 pixels at full HD. At 960 pixels/sec, a 180 takes 1.9 seconds!
At that turning speed you will have no blur at 144Hz with BR ON, ahahahahahahaha.
BUT WHO DOES A 180 IN ALMOST 2 SECONDS? That's enough time for a few frags.
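For anyone who wants to check that arithmetic, here it is as a small Python sketch (the 1829px figure is the one quoted above for FOV 106 at 1080p):

Code:

# Sanity check of the turn-speed numbers above.
PIXELS_PER_180 = 1829   # pixels a 180-degree turn covers at FOV 106, 1080p
UFO_SPEED = 960         # testufo.com pursuit speed, pixels per second

print(PIXELS_PER_180 / UFO_SPEED)   # ~1.9 s for a 180 at the UFO test's speed
print(PIXELS_PER_180 / 0.5)         # a 0.5 s flick is ~3658 px/s, almost 4x faster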

So, BR is absolutely useless, and adds THESE FANCY BONUSES to your gaming experience:
1) +input lag [up to 1000/refresh rate ms] (for example, at 144Hz it may be up to 6.94ms)
2) +crosstalk/ghosting/overshoot
3) +amplified stutter
4) +flicker
WHAT AN AWESOME TIEKNOLOGEEEE
MUST BUY! MOM, I WANT TO GO PRO!
Kids may also cry about it and annoy you with links to blurbusters.com and testufo.com, but the truth is:
Blur reduction looks great when you're staring at high-contrast content with a clear center of attention (like a red UFO flying across a flat background), but it's pointless when you consider the mixed contrast of your average FPS game scene.

G-Sync/FreeSync are useless for gaming. Absolutely.
Tearing? GET HIGH FPS!
Seriously, if your FPS is 250+ (better, 333-500+) you just won't notice tearing.
Kids will tell you about some fluidity and smoothness. Okay, that's true, but don't forget the extra input lag (tiny, but we don't need it) and that you'll be forced to cap your FPS at 144. For me, 144 is not enough; higher FPS = less input lag (1000/FPS = ?? ms). If you cap at 144fps, you have 6.94ms per frame. At 500FPS it would be just 2ms. Huge difference, isn't it?
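The frame-time arithmetic behind this, as a Python sketch (it's just 1000/FPS; whether every millisecond of frame time is felt as input lag is the claim above, not something the formula proves):

Code:

# Frame time in milliseconds for a given framerate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 120, 144, 250, 333, 500):
    print(f"{fps:>4} FPS -> {frame_time_ms(fps):5.2f} ms per frame")
# 144 FPS -> 6.94 ms, 500 FPS -> 2.00 ms. The same formula gives a
# monitor's per-refresh time (1000/144 = 6.94 ms), the "up to one
# refresh" lag figure quoted for blur reduction above.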

So, all these "TIEKNOLOGEEEZ" are just marketing. Same as 12000 DPI in mice (interpolated).
But we are smarter than their marketing departments.

Now let's talk about monitors and how to choose the right one.
The situation is so bad that I'm afraid it's really difficult to get a decent one.
What do you need to look at? It's easy:
1) Low input lag and response time
2) Little or no ghosting (overshoot %)
3) Picture quality (contrast ratio, color temperature, color accuracy). Contrast ratio matters more than the other parameters, because you can't do anything about poor contrast, for example, while poor colors can be calibrated.

OVERALL:
I don't trust BenQ after 2 defective monitors in a row.
Philips has extremely poor build quality (you can't get it exactly level, for example, because of the weak stand).
AOC has poor contrast, and it's owned by TPV, which also owns part of Philips' display business.
Eizo/ASUS use the same panels but are just more expensive. And there are many defective 2K IPS monitors from ASUS.
Acer has mostly good reviews, but for some reason it's EXTREMELY EXPENSIVE in my city: the Acer XB240HA (144Hz 24" TN) costs the same as the ASUS MG278Q and BenQ XL2730Z (which are 144Hz 27" 2K). That's really strange.
There are also 144Hz Dells, but I know nothing about them and don't think they differ much from the others.

This problem is really huge. And it's not about price segment; as we can see, expensive monitors are bad too.
With the ASUS MG279Q you have a pretty high chance of getting backlight bleed (https://www.youtube.com/watch?v=eB3RlrFCbGk) on a WQHD 2560x1440 IPS 144Hz 4ms panel for $650. What a shame.
Or this fancy new $1150 Acer Z35 Predator at 200Hz that can't actually run at 200Hz because of poor response times. What a shame, again. http://www.tftcentral.co.uk/reviews/acer_predator_z35.htm
And do you think the engineers didn't know their product couldn't work properly at 200Hz? They knew! The Acer Z35 is a child of marketing. MOM, 200HZ! But with response times that slow, anything above 120Hz isn't really practical. So, honestly, this is a 120Hz monitor.

It's so ugly it's almost funny. Pay that much and get Chinese ..... Nice.

My next try is the Iiyama GB2788HS-B1, which promises to be one of the best monitors for competitive gaming; the test results are really good: http://us.hardware.info/productinfo/327127/iiyama-g-master-gb2788hs-b1/testresults

-1200:1 contrast ratio
-awesome response time
-good native colour temperature

The only thing that needs to be done is color calibration, but that's not an issue at all.
Their monitors are also usually adjustable enough to fully get rid of ghosting.

There's also the Iiyama GB2488HSU, and it's great in all respects except contrast (~700:1). A shame: it might have been the best monitor on the market if it had at least 800-850:1.
BUT a new revision has come out.
Allow me to present the GB2488HSU-B2 (the second, improved revision of the GB2488HSU-B1):

1350:1 contrast ratio
top-level responsiveness
+ good color temp/color accuracy out of the box
FOR THE SAME PRICE!
http://us.hardware.info/productinfo/329880/iiyama-g-master-red-eagle-gb2488hsu-b2/testresults

I'll test my 27" Iiyama in a few days and let you know, guys, whether it's any good. Iiyama is our last hope
(along with the LG 24GM77, if you can find it in your city).

All 3 in one table: http://us.hardware.info/comparisontable/1569384/geen_username/monitors-comparison-table-7
 
#3 ·
Now I know you're trolling.

Ever used a CRT before?
It's the exact same blur reduction.

I used a CRT exclusively (even though I had a 120Hz LCD on standby) until 2012, when the CRT broke beyond repair, so I know all about how blur reduction works. Lightboost on the VG248QE gave me the same CRT-quality motion blur reduction I had been using since the '90s, but the color temperature would skyrocket over 10K and the contrast got destroyed. BenQ blur reduction was just like the CRT, except with much better colors than Lightboost, and it's FAR more adjustable: lower input lag (the tradeoff for no added input lag is more crosstalk from a higher strobe phase setting), and blur reduction works at almost every possible refresh rate.

Yes, there is crosstalk. That's why you never use blur reduction at 144Hz: crosstalk covers half the screen.

You only use blur reduction with vertical total (VT) tweaks to lower the crosstalk, and these VT tweaks only work at 128Hz and below (except 76Hz, which throws a random out-of-range error).

At 125Hz refresh rate, use VT 1497
At 120Hz refresh rate, use VT 1500
At 100Hz refresh rate, use VT 1500 or 1502
At 91Hz refresh rate, use VT 1498 (Call of Duty games with a 91 fps cap)
At 85Hz refresh rate (Black Ops 3 @ 85Hz, 85 fps glass-smooth), use VT 1501 (no other VT will work smoothly)

These VT tweaks should apply to any monitor using the Mstar 8556T scaler, including the XL2420Z, XL2411Z, and XL2430T.
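If it helps, here is that list as a lookup table (Python; the values are exactly the ones above, and vt_for_refresh is just my name for the helper):

Code:

# Known-good Vertical Total tweaks for Mstar 8556T scaler monitors
# (XL2720Z, XL2420Z, XL2411Z, XL2430T), per the list above.
VT_TWEAKS = {
    125: [1497],
    120: [1500],
    100: [1500, 1502],
    91:  [1498],   # Call of Duty games with a 91 fps cap
    85:  [1501],   # Black Ops 3 at 85 Hz; no other VT works smoothly
}

def vt_for_refresh(hz: int):
    """Return known-good VT values for a refresh rate, or None."""
    return VT_TWEAKS.get(hz)

print(vt_for_refresh(120))   # [1500]
print(vt_for_refresh(144))   # None -- don't strobe above 128 Hz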

All of this information is in my Blur Busters thread, which you conveniently refused to read before posting this.

On the XL2720Z, *ALL* excessive overdrive ghosting (purple trailing inverse artifacts) can be REMOVED with blur reduction off by using the profile/AMA toggles (this requires a combination of the AMA toggle with blur reduction on, plus a profile switch to blur reduction off). I've already mentioned this various times, here and on the Blur Busters forum you've been spamming lately, and the overdrive can be made near perfect.

On the XL2720Z, most inverse ghosting with blur reduction on can be removed by using an AMA toggle after enabling blur reduction. This works best if contrast is lowered.

I already explained this in multiple threads.

Maybe blur reduction simply isn't for you, but I've gamed on the internet since 1994. Blur reduction (CRT-type strobing tech) has given me the single MOST significant enjoyment boost in my games, out of anything; upgrading to new (much more expensive) hardware comes after.

G-Sync and FreeSync improve gaming quality for players who cannot maintain framerate = refresh rate.
The main problem with ULMB is that it does not increase the voltage to the backlight, which is why 1.0ms persistence is so dark on ULMB monitors. Lightboost and BenQ blur reduction both increase the backlight voltage by about 1.8x when enabled. Not sure why this was removed from ULMB monitors.

I'm not replying to your posts anymore.
 
#5 ·
I like being able to read text while scrolling, so BenQ Blur Reduction is permanently on for me. It works at 144Hz as well. Best of both worlds.

edit: oh and it does remove the blur in fast spins in gaming, unless it's broken for you or something...

Maybe it's some software issue with enabling ULMB? BenQ Blur Reduction is simply a monitor setting; the GPU doesn't even know it's enabled, so it can't mess with the GPU either.
 
#6 ·
Quote:
Originally Posted by MadjinnSayan View Post

I dunno, man, PC games aren't limited to fast twitch shooters; maybe you should try other game genres to see a difference.
This needs to be said to almost every gamer who sets foot in OCN's monitors and displays forum.

With that being said, blur reduction does make a big difference visually. It may not impact your ability to shoot people in games; that will be a debatable topic. But even when moving my 8200 DPI mouse fast, it's clear how much less blur there is with blur reduction on. Will it get you a better K/D than not using it? Maybe, maybe not. Does it look a lot more realistic? Yes. Thus I deem it worthwhile. I can never play Shadow Warrior without blur reduction again; on that note, I do seem to perform better with blur reduction enabled in that game. For immersive gaming it's also incredible.

Also, blur reduction has never amplified any stuttering for me. Flicker from 120Hz or 144Hz strobing won't be noticeable to most people, or even most gamers.

You didn't provide any real arguments against variable refresh rate; your comment there was straight-up trolling. Both of these technologies will impress most gamers of any kind. I've introduced all sorts of gamers to them, and they all want at least one of these technologies now. I can never accept a display without both unless it's OLED, which is worth the sacrifices.
 
#7 ·
Quote:
Originally Posted by boredgunner View Post

It just doesn't impact your ability to shoot people in games.
Oh, it does, if you play twitch shooters. There was this one guy whose KDA went up in CoD or BF or something when he started using Lightboost (an older way to force blur reduction on supported monitors, before it was cool).

Only in strategic games like CS:GO would it not matter too much, since the game isn't so much about twitch mechanics.

But yeah, I doubt many Quake pros will tell you that a CRT (which has built-in blur reduction) is worse than an LCD... or something. I never got to know that scene.

edit: long story short, if your monitor's blur reduction isn't as sharp in rapid motion as your CRT, then the blur reduction implementation got botched. That seems to happen more often than one might expect... (LCD blur reduction can end up highlighting the transitory state between two frames on large parts of the screen if the timing is done poorly. You'd need to read up on the model to know whether that happens. Mine is pretty much perfect in that respect, at least at 120Hz and up. Lower frequencies seem to make this problem more common/severe, for some reason.)
 
#8 ·
Quote:
Originally Posted by Tivan View Post

Oh, it does, if you play twitch shooters. There was this one guy whose KDA went up in CoD or BF or something when he started using Lightboost (an older way to force blur reduction on supported monitors, before it was cool).

Only in strategic games like CS:GO would it not matter too much, since the game isn't so much about twitch mechanics.

But yeah, I doubt many Quake pros will tell you that a CRT/functional blur reduction is worse than an LCD...
I suppose it will vary from person to person. I'll switch that comment to "may not." I definitely perform better in Shadow Warrior with ULMB on.
 
#9 ·
Nightmaster47:

I honestly don't know.

I was one of the last pure CRT holdouts, until I simply couldn't FIND any more CRTs to buy and had to give up.

I used CRTs EXCLUSIVELY for FPS games from 1994 and didn't move to my first LCD until 2012. It wasn't until the Lightboost hack came out for the VG248QE that I had the old CRT feel back, but with worse colors. The only time I ever felt Lightboost was WORSE than my old CRTs was when the framerate dropped BELOW the refresh rate. It seemed as if CRTs had some sort of natural blurring effect (phosphor decay?) that helped soften the sharpness of framerate stutters when the FPS dropped under the refresh rate (not talking about the vsync FPS/2 drop without triple buffering). The stutters were more annoying on the Lightboost monitor, and the same applies to BenQ blur reduction.

If my FPS is equal to the refresh rate at all times with BenQ blur reduction, then it's *EVERY BIT* as smooth as my old CRTs.
And before you question my credibility:

I used:
1) some old Zeos CTX monitor on my first non-Commodore PC (a 486 DX2)
2) some 19" ViewSonic (I think?) starting in 1999, after the CTX died completely
3) I think a 21" ViewSonic until it died, and then a Dell P1130 (a rebadged Sony Trinitron; I forgot the model)
4) a Samsung T220 120Hz after the Dell broke too. COULD NOT STAND IT: the blurriness of 120 fps @ 120Hz on the LCD felt WORSE than 60 fps @ 60Hz did on any of the CRTs. YUCK.
5) the VG248QE with Lightboost (starting in 2012, maybe)
6) BenQ blur reduction

BenQ blur reduction is EVERY BIT AS GOOD as the old CRTs, except for overdrive artifacts and crosstalk (CRTs had zero of either, as their response times were in the nanoseconds).

Crosstalk can be mitigated by using vertical total tweaks (they don't work at 144Hz, so avoid using blur reduction at any refresh rate higher than 128Hz).

Ghosting can be mitigated via several firmware bugs and by enabling AMA High *AFTER* blur reduction is enabled. Ghosting with blur reduction off can be reduced 95%, to an almost flawless level. Ghosting with blur reduction on can be reduced greatly but not fully eliminated (strobing exposes whatever ghosting is left much more clearly, which is obvious).

Please read this thread again and take your time.
Don't expect to understand all of this in a day.
Give it a week and I think you'll be happy with the monitor.

http://forums.blurbusters.com/viewtopic.php?f=13&t=2590

First of all, if you're turning fast, you WILL get blur. CRTs operated the SAME WAY.
This is why some people used 160-180Hz refresh rates at 800x600.
The reason: the higher the refresh rate on a CRT, IF your FPS can keep up with it, the faster you can turn without losing detail to blur.
You can see this simply by comparing 60 fps @ 60Hz with vsync on vs 120 fps @ 120Hz with vsync on.

Second, you probably don't have the strobe duty (persistence) set low enough, and I don't know if you're using VT tweaks properly.
Please read my thread.

For the XL2720Z to function like a CRT--to emulate a *FAST* CRT--you need 1.0ms persistence (strobe duty = 006 IF a VT tweak is used). If a VT tweak is NOT used, persistence is based on the current refresh rate; you can sort of see it in this table:
http://display-corner.epfl.ch/index.php/BenQ_XL2411Z

At a 144Hz refresh rate, 1.0ms persistence equals strobe duty 014-015, because 0.069ms per duty point (the 6.9ms refresh period divided by 100) x 14 or 15 = 0.966 to 1.035ms.

So for the XL2720Z to emulate a fast CRT, you need 1.0ms persistence with a VT tweak active (strobe duty = 006), which may be too dark even at 100 brightness (it's fine for me, though). For the XL2720Z to emulate a slower CRT, you need 2.0ms persistence, which is, as you guessed, strobe duty 012 with a VT tweak.
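Put as code, the rule is (a minimal Python sketch: persistence per duty point is the refresh period divided by 100, or 16.7ms/100 when a VT tweak forces the 60Hz values, as the numbers above imply and a later post spells out):

Code:

# Persistence in ms for a strobe duty setting, per the rule above.
def persistence_ms(strobe_duty: int, refresh_hz: float, vt_tweak: bool) -> float:
    period_ms = 1000.0 / 60 if vt_tweak else 1000.0 / refresh_hz
    return (period_ms / 100.0) * strobe_duty

print(persistence_ms(6, 120, vt_tweak=True))    # ~1.0 ms ("fast CRT")
print(persistence_ms(12, 120, vt_tweak=True))   # ~2.0 ms ("slower CRT")
print(persistence_ms(14, 144, vt_tweak=False))  # ~0.97 ms (duty 014 at 144 Hz)
print(persistence_ms(9, 120, vt_tweak=False))   # ~0.75 ms (vs 1.5 ms with a VT tweak)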

That's all I can tell you. Everything else is in this thread. Literally everything. Everything I know about this monitor.

http://forums.blurbusters.com/viewtopic.php?f=13&t=2590
 
#10 ·
About G-Sync/FreeSync.

Kids will tell you about some fluidity and smoothness. Okay, that's true, but don't forget the extra input lag (tiny, but we don't need it) and that you'll be forced to cap your FPS at 144. For me, 144 is not enough; higher FPS = less input lag (1000/FPS = ?? ms). If you cap at 144fps, you have 6.94ms per frame. At 500FPS it would be just 2ms. Huge difference, isn't it?
 
#12 ·
Quote:
Originally Posted by nightmaster47 View Post

About G-Sync/FreeSync.

Kids will tell you about some fluidity and smoothness. Okay, that's true, but don't forget the extra input lag (tiny, but we don't need it) and that you'll be forced to cap your FPS at 144. For me, 144 is not enough; higher FPS = less input lag (1000/FPS = ?? ms). If you cap at 144fps, you have 6.94ms per frame. At 500FPS it would be just 2ms. Huge difference, isn't it?
You would never maintain 500FPS to get that 2ms improvement; it's illogical. You're looking at something like a 144-300 fps range with everything on low, even in older Source engine games. In newer games you can forget about even breaking 200fps; you'll be lucky to break 144. Using an uncapped FPS does make your mouse more responsive, but it also provides zero consistency, as your mouse response is constantly changing. You'll be walking through an idle situation with no action at 300 fps, then hit an action/fight segment and your FPS will drop by 100. A capped FPS not only gives you a smoother image, it also gives your input a more consistent feel.
 
#13 ·
Quote:
Originally Posted by gene-z View Post

You would never maintain 500FPS to get that 2ms improvement; it's illogical. You're looking at something like a 144-300 fps range with everything on low, even in older Source engine games. In newer games you can forget about even breaking 200fps; you'll be lucky to break 144. Using an uncapped FPS does make your mouse more responsive, but it also provides zero consistency, as your mouse response is constantly changing. You'll be walking through an idle situation with no action at 300 fps, then hit an action/fight segment and your FPS will drop by 100. A capped FPS not only gives you a smoother image, it also gives your input a more consistent feel.
Well said.
 
#15 ·
Quote:
Originally Posted by gene-z View Post

You would never maintain 500FPS to get that 2ms improvement; it's illogical. You're looking at something like a 144-300 fps range with everything on low, even in older Source engine games. In newer games you can forget about even breaking 200fps; you'll be lucky to break 144. Using an uncapped FPS does make your mouse more responsive, but it also provides zero consistency, as your mouse response is constantly changing. You'll be walking through an idle situation with no action at 300 fps, then hit an action/fight segment and your FPS will drop by 100. A capped FPS not only gives you a smoother image, it also gives your input a more consistent feel.
1k-2k fps is normal in Osu! ...

But yeah, for most games, no.

CS:GO is a clear winner at 300fps, though. Somehow the input feels very sluggish capped at 144fps for me. Not sure what's up with that.

Then there are MMOs, where it doesn't matter too much whether your fps is 30 or 100 (from a gameplay perspective; FreeSync/G-Sync sound good there!).
 
#17 ·
Quote:
Originally Posted by mtcn77 View Post

I think OP is a digitalversus subscriber, which is OK in my opinion. Their Iiyama review has the highest rating, as I recall.
I'm not a subscriber/sellout of any kind. I'm just a guy who wants a good 144Hz monitor.
I'll review the Iiyamas on 8-9 Feb; if they're crap, I'll say so.
Also, on hardware.info and a few other sites, Iiyama's results are great too.
But who cares? Reviews often lie. Especially about BenQ, which is pure Chinese crap. Awful colors and contrast. Awful quality. But so much advertising: OMG PROS USE IT!11 OFFICIAL MONITOR OF INTEL EXTREME MASTERS OMG! GO PRO BOI BYENKIU!111
Good products don't need that many ads.
Quote:
Originally Posted by gene-z View Post

You would never maintain 500FPS to have the 2ms improvement, it's illogical. You're looking at something like a 144-300 fps range with everything on low, even on older source engine games. Newer games, you can forget about even breaking 200fps, you will be lucky to break 144+. And using an uncapped FPS does make your mouse more responsive, but it's also provides zero consistency, as your mouse response is constantly changing. You will be walking in an idle situation with no action at 300 fps, then hit an action/fight segment and your FPS will drop by 100. A capped FPS not only gives you a smoother image, but it also gives you a more consistent feel to your input.
NO and no. You are wrong.
I and many other pros/semi-pros feel the difference between 144 and 400-500 FPS in CS:GO and other games (UT2k4, Quake, UT3, Reflex, and any game that can run at 500+ FPS without issues):
https://www.youtube.com/watch?v=hjWSRTYV8e0

It's as obvious as gravity: more fps = less input lag.

And you can easily get a stable 300-400+ even with one video card: graphics on very low (only silvers play on high) and a powerful GPU+CPU. For me, that's an i7-3770 @ 4500MHz and a 780 Ti @ 1100MHz. And you don't need a powerful PC if you're playing at low res/4:3.
 
#18 ·
Still, I wouldn't want to play all my games, including single-player ones, at Nintendo 64 graphics settings just to get some fluidity or something. IMO, if you're only going to talk about competitive use of these monitors, you should change the title of your thread to 'Whole truth about 144Hz monitors and blur reduction/syncing technologies for competitive gaming' or something, so it doesn't feel like clickbait when we read the first post.
 
#20 ·
Quote:
Originally Posted by MadjinnSayan View Post

Still, I wouldn't want to play all my games, including single-player ones, at Nintendo 64 graphics settings just to get some fluidity or something. IMO, if you're only going to talk about competitive use of these monitors, you should change the title of your thread to 'Whole truth about 144Hz monitors and blur reduction/syncing technologies for competitive gaming' or something, so it doesn't feel like clickbait when we read the first post.
Yeah, but those kinds of people aren't even aware that other types of gaming exist.
 
#21 ·
First of all, I don't even think the original poster bought the XL2720Z.
If he did, he would have read my tweaks and would be VERY happy with the results after applying them.
I can tell by his completely wrong comments about blur reduction--meaning if he DID buy it, he was too lazy to read this thread:

http://forums.blurbusters.com/viewtopic.php?f=13&t=2590

Second--I know why CS players don't use blur reduction.
It's because 1) they usually don't like vsync at all, since they don't like the 6.9ms (144Hz) / 8.3ms (120Hz) / 10ms (100Hz) of input lag it adds, and playing with sample-and-hold doesn't help that cause; and 2) they don't know HOW to tweak blur reduction so it adds NO input lag compared to blur reduction off.

The first problem goes back to Lightboost mode on the pre-BenQ-blur-reduction 3D Vision 2 monitors.
Lightboost works best if you keep FPS equal to the refresh rate, which is of course done with vsync enabled.
Since people think vsync is the Devil's son, you already have a problem to begin with.

Then, to maintain the lowest amount of strobe crosstalk, Lightboost does two things:

1) Accelerated scanout, through the monitor's LC panel instead of through the video card, which extends the blanking interval internally on the monitor side. This gives the panel more time to complete a pixel color transition during a strobe, which lowers the "strobe crosstalk" of the next frame being superimposed on the current frame (to be precise, the current frame being mixed with the next frame).

2) One added frame of input lag (8.3ms at a 120Hz refresh rate). This is done because, to minimize strobe crosstalk at 120Hz, the strobe happens during the **NEXT** frame, not the current one. Run a Lightboost test at 120Hz on a Lightboost-compatible monitor in Google Chrome (or any browser that can display 120Hz, 120 fps vsync-on HTML5, which Internet Exploder cannot), in full-screen mode:

http://www.testufo.com/#test=photo&photo=alien-invasion.png&pps=960&pursuit=0&height=-1

If you look at the very bottom of the screen, you will notice the beginnings of a double image, where two frames start blending together.
What you are actually seeing at the very bottom is the beginning of the CURRENT frame's data, while 90% of the rest of the screen (almost all of it) shows the NEXT frame's data.

I'm explaining this for a reason.

When BenQ made BenQ blur reduction, they actually reverse-engineered the Lightboost tech for their own purposes.
Proof of this is on this website; notice that the strobe wires serve the same function:

http://display-corner.epfl.ch/index.php/BenQ_XL2411Z

It's also worth noting that both BenQ blur reduction and Lightboost increase the backlight voltage by 1.8x to compensate for the loss of brightness from strobing. The reason ULMB is MUCH darker than BenQ blur reduction (or Lightboost) at the same pulse width is that ULMB DOES NOT DO THIS. This was first noted on the VG248QE ULMB prototype boards that went out, where people said it was much darker but had better colors.

The fact that BBR and Lightboost both increase the voltage by 1.8x shows that the same base tech is being used. However, BenQ blur reduction originally did NOT use accelerated scanout at all, while Lightboost did. And BenQ, to keep input lag as low as possible, used a default strobe phase of "100" on the original V1-firmware monitors, and this was not changeable. They did this for two reasons:

1) To keep the top of the screen as crosstalk-free as possible (like Lightboost). Since the top of the screen matters more than the bottom in Counter-Strike and FPS games, this was a decision based on logic.

2) To prevent any input lag penalty from using blur reduction.

The problem was that Lightboost used accelerated scanout in hardware (this required a custom video card refresh rate so the monitor kicked into 3D mode, which activated the new timings), but this came at an image quality cost--e.g., faint horizontal scanlines would often appear around the top right of the screen, since the panel was being run outside its normal specifications. But it reduced the amount of crosstalk massively, so 3D Vision looked much better.

Since BenQ blur reduction was supposed to be seamless and not require any special hardware, accelerated scanout did not exist for it. Therefore their default strobe phase of "100" had another drawback: crosstalk would cover a VERY LARGE part of the bottom of the screen! At 144Hz, the entire bottom half of the screen had crosstalk.

Due to complaints about the crosstalk, and requests to make the amount of blur reduction adjustable, BenQ made a new firmware which allowed adjusting the "strobe phase" as well as the pulse width (persistence). Strobe phase 000 has lower crosstalk than strobe phase 100; the drawback of strobe phase 000 is ONE FRAME HIGHER INPUT LAG than strobe phase 100.

Note that Lightboost also used a strobe phase (internally) of 000.

So what's with the accelerated scanout?

Well, the monitor's own scaler has to be able to handle it.
With Lightboost, this was done via an LC panel timing change.

However, as I said, BenQ blur reduction was a MODIFICATION of Lightboost's strobing, made by BenQ so strobing would work without Nvidia hardware and with full monitor OSD adjustments. But the same monitor scaler was used (obviously), so a bug existed which people turned to their own benefit:

Lightboost internally changed the vertical blanking timing on the monitor to reduce strobe crosstalk.
You can do the same thing yourself by increasing the vertical total via a custom resolution.
A custom vertical total of 1497-1502 does for BenQ blur reduction exactly what Lightboost did internally to the monitor scaler.
It is the EXACT same thing; the only difference is that Lightboost did it internally through the monitor, while BenQ blur reduction accepts it thanks to a bug, since the monitor scaler received the SAME TIMINGS when Lightboost was activated!
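The timing math shows why this works. A quick Python sketch, assuming the scaler paces the frame period evenly across all VT lines (a simplification, but it captures the idea):

Code:

# Why a bigger Vertical Total reduces strobe crosstalk: the 1080 active
# lines get scanned out faster, leaving a longer dark blanking interval
# for pixels to finish transitioning before the strobe fires.
def scanout_and_blanking_ms(refresh_hz, vtotal, active_lines=1080):
    frame_ms = 1000.0 / refresh_hz
    scanout = frame_ms * active_lines / vtotal
    return scanout, frame_ms - scanout

for vt in (1125, 1500):   # 1125 is the standard 1080p vertical total
    scan, blank = scanout_and_blanking_ms(120, vt)
    print(f"VT {vt}: scanout {scan:.2f} ms, blanking {blank:.2f} ms")
# VT 1125: scanout 8.00 ms, blanking 0.33 ms
# VT 1500: scanout 6.00 ms, blanking 2.33 ms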

By doing a comparison test, I was able to determine that BenQ blur reduction at a 120Hz refresh rate with a strobe duty of "009" (1.5ms persistence), WITH a vertical total tweak of VT 1500 (without the VT tweak, the persistence at strobe duty 009 would be 0.083 x 9 = 0.747ms instead of 0.167 x 9 = 1.5ms), and a strobe phase of "000", had the EXACT SAME crosstalk, to the PIXEL, as Lightboost (tested at 120Hz, unlocked on my XL2720Z).

Just to make sure I wasn't full of crap, I hooked up my 24" Asus VG248QE, unlocked Lightboost, and ran the TestUFO alien invasion test at 120Hz full screen... and sure enough: the EXACT SAME AMOUNT OF CROSSTALK, TO THE PIXEL, as BenQ blur reduction with strobe phase 000, strobe duty 009, 120Hz, and the Vertical Total 1500 tweak active.

Again, however, the penalty for using strobe phase 000 is one frame of extra input lag.

So.....how do you solve that?

You do what BenQ ORIGINALLY suggested in their old V1 monitors: you use a strobe phase of 100.
However, when a VT tweak is in use, the backlight will *SHUT OFF* at strobe phase 100, because you end up trying to strobe into the frame BEFORE the current frame (which doesn't exist). The math for all those shenanigans is explained completely in this thread:

http://forums.blurbusters.com/viewtopic.php?f=13&t=2590

Basically, the current refresh rate and strobe persistence limit the maximum strobe phase allowed. When a VT tweak is active, the normal persistence values get "forced" into the 60Hz persistence values: the 60Hz refresh period of 16.7 milliseconds divided by 100 = 0.167ms per point of strobe duty.

Normally with BenQ blur reduction, the persistence values depend on the refresh period divided by 100, at ALL refresh rates, so if you look at this chart:
http://display-corner.epfl.ch/index.php/BenQ_XL2411Z

you can see that the persistence values are 0.069ms per duty point at 144Hz and 0.083ms at 120Hz, which are the refresh periods divided by 100.

But when you use a VT tweak, these persistences get FORCED to the 60Hz values. This LIMITS the maximum strobe phase from 100 (normally) down to a lower value, all of which I listed in my thread.

So--to get ZERO ADDED INPUT LAG from BenQ blur reduction while MINIMIZING the amount of strobe crosstalk (remember, you will ALWAYS have more strobe crosstalk at a high strobe phase than at strobe phase 000, no matter what), you need to RAISE the strobe phase UP TO THE POINT WHERE THE BACKLIGHT SHUTS OFF, then DROP IT by 1. At that point you get 0.167ms of strobe persistence, which is too dim to be usable (the maximum strobe duty will be 001; higher values won't work). Then you KEEP dropping the strobe phase by 1 (while raising the strobe duty) until the brightness becomes acceptable to game with, balancing strobe crosstalk + no added input lag + how dim you can stand the screen.
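Written out as a rough Python sketch, that tuning loop looks like this. The two predicates stand in for what YOU observe at the monitor's OSD by hand (the lambdas below encode the 85Hz example that follows, not anything the monitor actually reports):

Code:

# Rough sketch of the zero-added-lag strobe tuning procedure above.
def tune_strobe(backlight_is_off, bright_enough, max_phase=100):
    # Step 1: raise strobe phase until the backlight would shut off,
    # then stop one step short. Only strobe duty 001 works here
    # (~0.167 ms persistence): zero added lag, but very dim.
    phase = 0
    while phase < max_phase and not backlight_is_off(phase + 1):
        phase += 1
    duty = 1
    # Step 2: trade phase for brightness -- drop phase and raise duty
    # step by step until the picture is bright enough to game on.
    while not bright_enough(phase, duty):
        phase -= 1
        duty += 1
    return phase, duty

# 85 Hz / VT 1501: backlight dies above phase 071; duty 006 is bright enough.
print(tune_strobe(lambda p: p > 71, lambda p, d: d >= 6))  # (66, 6)
# Close to the phase 064 / duty 006-007 settings listed below.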

For example, here are my Call of Duty: Black Ops 3 settings:

1920x1080, 85Hz refresh rate
Vertical Total VT 1501
Strobe phase 064
Strobe duty 006/007 (1.0ms persistence).

Strobe phase higher than 071 shuts off the backlight.

If I used a strobe phase of 000, I'd have 11.7ms higher input lag--the same thing those CS players complained about.

If those CS pros knew what I know about how BenQ blur reduction works, people wouldn't have complained about strobing so much.

NOTE: THE XL2730Z DOES NOT RESPOND TO VT TWEAKS. Strobe phase 100 still adds no input lag, but the crosstalk is TOO HIGH.

If anyone here read all of this and actually understood what I'm saying--MAD PROPS and good job. You're more intelligent and patient than 90% of the adult population. Put it to good use. I doubt most people followed this.

AND NO--THIS WAS NOT A COPY AND PASTE. I WROTE THIS ENTIRE DAMN MESSAGE OVER HALF AN HOUR OF FREAKING TYPING.
 
#23 ·
Quote:
Originally Posted by nightmaster47 View Post

NO and no. You are wrong.
I and many other pros/semi-pros feel the difference between 144 and 400-500 FPS in CS:GO and other games (UT2k4, Quake, UT3, Reflex, and any game that can run at 500+ FPS without issues):
https://www.youtube.com/watch?v=hjWSRTYV8e0
So, what exactly am I wrong about? I literally said in my reply that there is an improvement in input response with an uncapped frame rate, and you're just repeating that to prove I'm "wrong"?

Quote:
It's as obvious as gravity: more fps = less input lag.
Quite possibly the most idiotic comparison I've read in my entire life. I literally laughed out loud when I read that.

Quote:
And you can easily get a stable 300-400+ even with one video card: graphics on very low (only silvers play on high) and a powerful GPU+CPU. For me, that's an i7-3770 @ 4500MHz and a 780 Ti @ 1100MHz. And you don't need a powerful PC if you're playing at low res/4:3.
My 970 and i5 must be faulty then, because I can't hold a stable 300-400+ in any FPS game. My 970 drops to 160-220 on all-low in Source engine games during heavy combat. I can easily break 300+ sitting in an idle room, though.
 
#24 ·
You lost my interest completely when you said "G-Sync/FreeSync is useless for gaming, you should play at 300fps." Let me guess: the only game you play is CS:GO?

I don't like blur reduction anyway. It makes image quality worse (apart from the blur), it flickers, and it doesn't work with G-Sync; that's enough to make it useless on a G-Sync monitor.
 
#25 ·
Quote:
Originally Posted by Falkentyne View Post

First of all, I don't even think the original poster bought the XL2720Z.
Why are you lying?

I stated absolutely clearly what happened with these poor-quality BenQs:

1) The 1st one I bought was the 2411Z, and this was the result:
http://i.imgur.com/gq48iVM.jpg
http://i.imgur.com/xV9rIRG.jpg

2) Then, ON YOUR ADVICE, I bought the 2720Z, and it has:
http://i.imgur.com/LKCPThb.jpg
A) Non-working touch buttons (2 out of 5)
B) Backlight bleed.

I don't bother replying to a "person" who bases his conclusions on imagination (example: "I don't even think the original poster bought the XL2720Z"). YES, I BOUGHT IT AND GOT MY MONEY BACK THE SAME DAY.
So get lost with your filthy lying thoughts, BenQ fanboy. Are you already on salary in the BenQ PR department? :DDD
 
#26 ·
Whatever. You pissed me off. I'm putting you on my IGNORE list and NEVER talking to you again, creep. ENJOY your 2 broken buttons. BYE.
 