The first real test for measuring input lag! - An Overclocking Community

post #1 of 76 (permalink) Old 12-09-2014, 05:38 AM - Thread Starter
QLsya
Join Date: Jun 2014
Posts: 76
Rep: 15 (Unique: 14)
A user called ewh on ESR, aka qsxcv here, has just come up with the first proper test for input lag that I've seen. I figured there are enough mouse weirdos here who would be interested in checking out what he has to say. Just a copy and paste, not my work; ewh is the guy who's come up with this great method. r0ach will be partying like it's 1999: we can now test clown cursor™, swamp cursor™, and other kinds of mouse disease!

The link to the original post (which ewh has said he will be updating) is here; hope this is OK.


helloooo

tl;dr: I can measure full-chain (usb input to screen response) input lag with ~10 microsecond precision and accuracy. let me know what you want me to measure.


Initial discussion about this was on my thread in blurbusters (flood's input lag measurements), but seeing how much interest there is in noacc's thread, and how some people are requesting additional measurements/testings for various settings, I thought I'd post here as well.

Anyway, inspired by the various measurements around the internet of input lag performed using a cheap high-speed camera, I set out to replicate these with my own setup. So I got a casio ex-zr700, which is capable of 1000fps at tiny resolution, and started taking apart my g100s... and managed to accidentally short and fry its pcb. As I did not want to buy another one and take it apart, attempt to solder, etc., I decided to forgo the button-click-to-gun-fire measurements, and instead simply measure motion lag by slamming my dead g100s onto my new g100s and seeing how many video frames it took to see a response on my screen. The results were quite promising and I could measure with precision and accuracy around 1-2ms, limited by the fact that I recorded at 1000fps and that it is difficult to determine the exact frame where the mouse begins movement. But it was tedious slamming the mice together and counting video frames.

So I got an LED, a button, and an Arduino Leonardo board, which is capable of acting as a USB mouse, to automate the mouse slamming. With it, I can just press a button to make the "cursor" instantly twitch up to 127 pixels in any direction, and at the same time, an LED would light up. This made things a lot easier when scrolling through the video frames, but still it was quite tedious and I never made really more than 20 measurements from a single video clip.

A few days ago, Sparky, in the blurbusters thread, suggested I use a photodetector on the arduino to replace the video camera and thus automate the measurements. At first, I was reluctant and doubted that it would allow more precision than the high speed cam, but soon I realized that due to CRT's low persistence and the phosphor's fast rise time, actually it would work very well... so a few days of messing with electronics led to this thing:

current microsecond measuring setup:
last updated 2014 dec 9


arduino code:

(anyone good with electronics and/or arduino, please let me know if there's anything to be improved/fixed)

how it works:
When I press the button, the program begins taking measurements. It takes ~10 measurements a second.

Each measurement is done by staring at a dark part in-game, twitching the cursor so that the screen goes to a bright part, waiting and measuring how long it takes for the photodiode to respond, and twitching the cursor back.

There is a bit of randomness in the spacing between measurements so that they don't happen in sync with the framerate, which would likely give biased results that are consistently higher or lower than the average of several independent measurements.
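The arduino sketch itself isn't pasted above, so here is a minimal plain-C++ sketch of the loop just described, with the board's hardware stubbed out. Everything here (the Rig struct, the stub names, the timing constants) is my own illustration, not ewh's actual code; on a real Leonardo the stubs would be Mouse.move(), an analogRead() threshold, and micros().

```cpp
#include <cassert>
#include <cstdint>
#include <random>

// Simulated stand-ins for the Leonardo's hardware.
struct Rig {
    uint64_t now_us = 0;
    uint64_t move_sent_at = 0;
    uint64_t true_lag_us = 0;   // the full-chain lag we pretend exists
    void sendMouseTwitch() { move_sent_at = now_us; }   // Mouse.move() on the real board
    bool photodiodeLit() const { return now_us >= move_sent_at + true_lag_us; }
    void wait(uint64_t us) { now_us += us; }
};

// One measurement: twitch the cursor toward the bright region, spin until
// the photodiode responds, record the elapsed time, then twitch back.
uint64_t measureOnce(Rig& rig) {
    rig.sendMouseTwitch();
    const uint64_t start = rig.now_us;
    while (!rig.photodiodeLit())
        rig.wait(4);            // micros() on a 16 MHz AVR ticks in 4 us steps
    const uint64_t lag = rig.now_us - start;
    rig.sendMouseTwitch();      // back to the dark region for the next run
    return lag;
}

// Randomized spacing between measurements (~10 per second on average),
// so sampling never locks in sync with the game's framerate.
uint64_t nextDelayUs(std::mt19937& rng) {
    std::uniform_int_distribution<uint64_t> jitter(0, 30000);
    return 70000 + jitter(rng);
}
```

The jitter in nextDelayUs is the same trick the post describes: without it, a lag that drifts in and out of phase with the frame period would be sampled at a consistently biased point.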

Here is all data so far:

preliminary conclusions:
default fps_max 100 adds between 0-10ms of input lag (expected behavior) over the lag with uncapped framerate of ~2000fps
raw input doesn't affect cs1.6

If you want me to test anything, let me know here...

Things I have available for testing:

lots of knowledge of physics
basic knowledge of electronics

a 1000fps camera (casio ex-zr700)
an arduino leonardo + various simple electronics for interfacing with photodiodes

a z87 computer, i7 4770k @ 4.5ghz
an x58 computer, i7 920 @ 3.6ghz
a gtx 970 (nvidia reference model from best buy)
a gtx 460 768mb (galaxy brand)
intel hd4000 lol
windows 7, 8.1, xp (if you want), linux (need to update 239562 things tho)

two crt monitors (sony cpd-g520p, sony gdm-fw900)
2 lcd monitors (old ****ty viewsonic tn from ~2009, asus vg248qe with gsync mod)

a laptop (thinkpad x220, 60hz ips screen)

a logitech g100s
a logitech g3
a ninox aurora (which will come. one day.)

I'm okay with buying more hardware as long as it's reasonable. e.g. I'm not going >$30 on things which I won't end up using for something else.


1. How much does the arduino setup cost?
~$25 arduino leonardo
~$10 breadboard + jumpers
~$5 for three types of resistors and one capacitor (which probably isn't necessary)
~$5 for some photodiode
~$5 for some single supply op amp and a switch/button.
total ~$50 USD

note: this setup doesn't work as well on LCDs due to persistence. The photodiode, unless you set some ridiculous gain, will only respond once a significant portion of the screen is lit, whereas for a CRT, due to the fast rise time of the phosphors, the photodiode gets a response as strong as when the entire screen is lit as soon as 5 or so rows are lit. So if you want to replicate my stuff, you should get some cheap crt.

2. How can you measure lag less than the refresh period? i.e. how do you measure < 5ms at 60hz???
CRTs are rolling scan, which means except in the vblank period (~10% of the refresh cycle), the screen is constantly updated from top to bottom. Since the photodiode is placed in a way that it is sensitive to changes in any part of the screen, this means that the input lag of any event is unrelated to refresh rate, so long as the framebuffer update doesn't occur during the vblank interval.
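To put rough numbers on the rolling scan (the 60hz, 10% vblank, and 1200-line figures are my own illustrative assumptions, not measurements from this thread): the active scanout takes ~15ms, each row is written roughly every 12.5 microseconds, and the 5-row threshold mentioned earlier is crossed about 62 microseconds after the beam reaches the changed region.

```cpp
#include <cassert>

// Illustrative rolling-scan arithmetic; all input numbers are assumptions
// for the example, not figures from the thread.
double refresh_period_us(double hz) { return 1e6 / hz; }

// Time to write one row of the active area, given the vblank fraction.
double row_time_us(double hz, double vblank_frac, int rows) {
    const double active_us = refresh_period_us(hz) * (1.0 - vblank_frac);
    return active_us / rows;
}
```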

3. If the computer and arduino is perfect, with the only limitation being that the game runs at XYZ fps, how much input lag would be measured?

(frame rendering time) <= input lag <= (frame rendering time) + (time between frames)

picture explanation:

(frame rendering time) is the time taken to run the cpu+gpu code and is equal to the inverse of the uncapped framerate

(time between frames) is the actual time between when frames start to get rendered and is equal to the inverse of the actual framerate.

So for instance, if my game runs at 2000fps uncapped but I cap it to 100fps, I expect, at the very best, to see a uniform distribution of input lag between 0.5ms and 10.5ms

If my game runs at 50fps uncapped and I don't cap it, I expect, at best, to see a uniform distribution of input lag between 20ms and 40ms.
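Both examples follow directly from the inequality above; a tiny helper makes the arithmetic explicit (the struct and function names are mine, but the formula is the post's):

```cpp
#include <cassert>

// Best-case bounds from the inequality above:
// render_time <= lag <= render_time + frame_period.
struct LagBounds { double min_ms, max_ms; };

LagBounds bestCaseLag(double uncapped_fps, double capped_fps) {
    const double render_ms = 1000.0 / uncapped_fps;  // frame rendering time
    const double period_ms = 1000.0 / capped_fps;    // time between frames
    return { render_ms, render_ms + period_ms };
}
```

Plugging in the post's numbers, bestCaseLag(2000, 100) reproduces the 0.5ms to 10.5ms range, and bestCaseLag(50, 50) reproduces 20ms to 40ms.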

4. How does mouse polling rate affect input lag?

still need to figure this out. but from first principles there must be between 0 and 1/(polling rate) of added input lag, since input can occur anywhere in between polls. depending on the mouse firmware, it could be more
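Under that idealized model (an event lands at a uniformly random point within a poll interval, no firmware overhead; function names are mine), the numbers work out like this:

```cpp
#include <cassert>

// Idealized polling-rate lag: polling alone adds between 0 and 1/rate of
// lag, averaging half the interval. Firmware can add more on top of this.
double pollIntervalMs(double hz) { return 1000.0 / hz; }
double avgPollLagMs(double hz)   { return pollIntervalMs(hz) / 2.0; }
```

So 125hz adds up to 8ms (4ms on average), 500hz up to 2ms (1ms average), and 1000hz up to 1ms (0.5ms average).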

5. How much input lag can I feel? what does 10ms of input lag feel like?
try for yourself with this:

My personal threshold is around 10-16ms. One osu player managed to barely pass the test for 5ms!!!11!1!11

6. What amount of input lag is 100% guaranteed to be insignificant in that no human can feel it, and no one's performance will be affected at all?

This is not something easy to measure since amounts of lag that you can't feel in a blind test could still affect performance... but I believe it is between 1 and 5ms. I'd guess that anything less than 2ms is absolutely insignificant.

One thing that I keep in mind is that in quite a few top-level cs:go lan tournaments, the monitors used were eizo fg2421's which are documented to have ~10ms more input lag than other 120+ hz tn monitors such as the asus vg248qe, benq xl24**z/t, etc... And no one was suddenly unable to hit shots or anything. But they all played against each other on that monitor, so who knows tongue.gif

Another thing is that when tracking objects on a low persistence monitor, it is possible to detect (with your eyes) the difference between a setup with constant input lag and a setup with input lag that fluctuates by +/-1ms. see . But I don't know of any game/situation where this affects performance.

TODO, in order of priority:
csgo measurements
correlating data with 1000fps video
nvidia driver versions
swapping graphics cards and/or computers
other games (quake live, reflex)
bios setting, hpet, raw input, whatever mythbusting

use a faster function than micros(), which only has 4us precision and takes ~3us to run
wire cutters to trim resistors lol
make my own usb interface from the arduino pins...
bigger photodiode and faster op amp?
post #2 of 76 (permalink) Old 12-09-2014, 05:47 AM
thizito
Join Date: Dec 2012
Posts: 362

not working
give us full link asap
thx for the work

nvidia driver versions for your newer card should be #1 on the list: best test, most users affected
344.11 vs .16 .70 etc.. 320.44
will be funny to laugh at r0ach things
post #3 of 76 (permalink) Old 12-09-2014, 05:56 AM - Thread Starter
QLsya
Join Date: Jun 2014
Posts: 76
Rep: 15 (Unique: 14)
Originally Posted by thizito:
not working
give us full link asap
post #4 of 76 (permalink) Old 12-09-2014, 06:20 AM
qsxcv
Join Date: Feb 2014
Posts: 4,370
Rep: 373 (Unique: 154)
ewh/flood here
Originally Posted by thizito:

nvidia driver versions for your newer card should be #1 on the list: best test, most users affected
344.11 vs .16 .70 etc.. 320.44
will be funny to laugh at r0ach things
as much as i want to laugh at some of the ridiculous claims (ram latency lol), there is always the possibility that for his particular setup, there actually is a perceptible difference in input lag when switching a particular setting.

but still, if you just look through that optimization thread, the presence of confirmation bias and placebo-like effects is undeniable. and even with evidence there's this lovely fact about human nature:

post #5 of 76 (permalink) Old 12-09-2014, 06:27 AM - Thread Starter
QLsya
Join Date: Jun 2014
Posts: 76
Rep: 15 (Unique: 14)
I think r0ach is a good guy; I always enjoy reading his posts. Some of his views seem far-fetched, but I don't know why people go on a hate campaign against the guy. A world without r0ach posts would be a sad world.
post #6 of 76 (permalink) Old 12-09-2014, 07:53 AM
r0ach
Join Date: Feb 2012
Posts: 2,438
Rep: 178 (Unique: 117)
The first thing you would want to test is obviously prerender setting of 3 vs 2 vs 1 (reboot after changing this setting was required on Win7, not sure about 8) on a 60hz monitor to see if your device even works properly at all because we know what results should come from that. The real question is, since Windows isn't a real time OS, can this test even be done accurately on a fresh Win 8.1 install that hasn't been stripped down?

What if search indexer or some other crappy Windows routine is doing something in the background for one test but not the other? Many Windows services like "portable device enumerator" start on system boot then turn off later. Others like "application experience" seem to turn on and off at random. These services constantly going on and off is going to throw a wrench into the results.

As for other settings, I think we can all agree settings like HPET on/off have a large effect on mouse movement, but can the difference actually be easily quantified with a simple stat like input lag? What if it only shows less than 1ms difference in mouse lag between HPET on and off? Are you going to claim everyone who notices a huge difference with HPET on/off is crazy? Maybe the setting is not quantifiable by input lag, but I'd like to see HPET tested.

What if the BIOS changes IRQs, or alters USB or other system properties anytime you hit save in the BIOS even if you don't change any settings? How do you deal with things like that? You will probably need to run several control tests to see how/if things change if all you do is raise CPU voltage by 0.01v and hit save.

Also, I wouldn't be surprised if you can't get accurate results at all for any test unless you turn HPET off, then use a program like this to set timer resolution to 1.0 or 0.5:

Kind of hard to measure the difference in HPET on/off if you need it off to get accurate results, right?

As for RAM timing settings, if you change a setting like ram speed from 1600 to 1866, you're changing memory divider from 100:100 to 100:133 on Ivy Bridge. Some things like that might not be easily quantifiable by input lag measurements. The divider running at a setting that's not 1:1 might cause something like more chop to happen, which might make the cursor control differently with vsync off without being hugely noticeable in input lag measurements.

Here's a list of things I would like to see tested, but not until you can actually prove your methodology works by testing the prerender/flip queue as mentioned in the first post. The one clown on ESReality who said he had undeniably accurate input lag measurements and claimed turning Nvidia scaling off increased input lag didn't even bother doing control experiments with the flip queue to determine if his device even worked.

I would also do all testing with HPET off and 1.0 or 0.5 timer resolution as well as mentioned above. I think tests should be divided into two categories: 1) tests done where the BIOS is not entered or touched and only Windows settings are changed, and 2) Tests done where only BIOS settings are changed.

Windows-only things to test (first set HPET off and 1.0 or 0.5 timer resolution; preferably use Win 8.1 and don't enter the BIOS again till all tests are done):

1) Human interface device access service on/off

2) Windows Defender enabled/disabled (disable it by typing "windows defender" in search and, on the last tab of the program, unchecking "use this application")

3) Input lag with 0 apps open vs having Internet Explorer open vs Firefox vs Chromium (chromium portable, not chrome). You should also try with hardware acceleration disabled on each browser. I can easily tell IE has the least lag, Chromium 64 has the most, and Firefox is in the middle if you want to have a browser open while gaming but I wouldn't personally have any of them open myself.

4) Adobe flash installed vs uninstalled. Reboot after installing and uninstalling because it will cause other Windows Services to come on and off in the process of adding and removing it.

5) Print spooler enabled/disabled

6) Windows error reporting set on/off for all users, not just a single user account, click the all users tab

7) Superfetch service on/off

8) Nvidia virtual audio enabled/disabled in device manager

9) Nvidia HDMI audio enabled/disabled in device manager

10) Nvidia "Display - No Scaling" vs "GPU - No Scaling". You will need to reboot after changing the setting. You may even need to reinstall the driver after changing this setting because it can bug out mouse movement.

11) MSI mode vs line based interrupt mode (it's line based by default) for the Nvidia GPU

Only do the following two after the above 11:

12) See if input lag changes by unplugging and then replugging the mouse while the PC is on. Restart the PC, and see if it changes again.

13) Power down the PC, plug in a mouse like the G400, power up and test lag, then hit the DPI up or down switch and test again to see how on-the-fly DPI changes affect mouse movement and whether the effect is quantifiable by lag.

BIOS only changes - only test these after all the above has been tested

I do not believe you will get easily reproducible results from any of this, because I believe the BIOS changes a lot even if you change 0 settings and hit save and exit:

1) HPET on/off

2) Memory strap 100:100 vs 133:100

3) PWM phase control set to auto vs all phases turned on (extreme)

4) Hyperthreading on vs off

5) USB 3 (xHCI) enabled vs disabled (don't install the USB 3 driver, just test with it on/off in BIOS)

6) PLL Overvoltage on vs off

Mouse Input Lag BIOS & Windows Optimization Guide
too many personal messages to reply to
post #7 of 76 (permalink) Old 12-09-2014, 08:22 AM
ranseed
Join Date: Jul 2011
Posts: 383

The r0ach has spoken. In for results.
post #8 of 76 (permalink) Old 12-09-2014, 08:48 AM
GoldenTiger
Join Date: Jul 2010
Location: Visual Studio, USA
Posts: 4,301
Rep: 200 (Unique: 157)
Subscribed, this is an amazing setup. Very clever.

post #9 of 76 (permalink) Old 12-09-2014, 09:05 AM
shatterboxd3
Join Date: Aug 2014
Posts: 191
Rep: 9 (Unique: 9)
I'm interested in seeing quantifiable results.
post #10 of 76 (permalink) Old 12-09-2014, 10:33 AM
r0ach
Join Date: Feb 2012
Posts: 2,438
Rep: 178 (Unique: 117)
Originally Posted by shatterboxd3:

I'm interested in seeing quantifiable results.

That's the problem: every few months we see posts like this where some guy claims to have invented a new infallible method to test lag, and then they either disappear, never to be heard from again, or they post on ESReality saying something like turning HPET off increases lag by 5 seconds, which everyone knows isn't true. For some reason, the people who claim to have a perfect method of testing input lag never do what I suggested about testing prerender 3 vs 2 vs 1 to even figure out whether their method works.

For measuring anything that requires intricate CPU side measurements, you're going to need HPET off and timer resolution at 0.5 or 1.0 as well.

