Overclock.net › Forums › Components › Mice › The first real test for measuring input lag!

The first real test for measuring input lag!

post #1 of 76
Thread Starter 
A user called ewh on ESR, aka qsxcv on oc.net, has just come up with the first proper test for input lag that I've seen. I figured there are enough mouse weirdos here who would be interested in checking out what he has to say biggrin.gif. Just a copy and paste, not my work; ewh is the guy who's come up with this great method. Roach will be partying like it's 1999, we can now test clown cursor™, swamp cursor™, and other kinds of mouse disease! tongue.gif

The link to the original post (which ewh has said he will be updating) is here, hope this is ok: http://www.esreality.com/post/2691945/microsecond-input-lag-measurements/


EWH's MICROSECOND INPUT LAG MEASUREMENTS


helloooo biggrin.gif

tl;dr: I can measure full-chain (USB input to screen response) input lag with ~10 microsecond precision and accuracy. Let me know what you want me to measure.

history:

Initial discussion about this was in my thread on Blur Busters (flood's input lag measurements), but seeing how much interest there is in noacc's thread, and how some people are requesting additional measurements/testing for various settings, I thought I'd post here as well.

Anyway, inspired by the various measurements of input lag around the internet performed using a cheap high-speed camera, I set out to replicate these with my own setup. So I got a Casio EX-ZR700, which is capable of 1000fps at tiny resolution, and started taking apart my g100s... and managed to accidentally short and fry its PCB frown.gif. As I did not want to buy another one and take it apart, attempt to solder, etc., I decided to forgo the button-click-to-gun-fire measurements and instead simply measure motion lag by slamming my dead g100s onto my new g100s and counting how many video frames it took to see a response on my screen. The results were quite promising: I could measure with precision and accuracy of around 1-2ms, limited by the fact that I recorded at 1000fps and that it is difficult to determine the exact frame where the mouse begins moving. But slamming the mice together and counting video frames was just tedious.

So I got an LED, a button, and an Arduino Leonardo board, which is capable of acting as a USB mouse, to automate the mouse slamming. With it, I can just press a button to make the "cursor" instantly twitch up to 127 pixels in any direction while an LED lights up at the same time. This made scrolling through the video frames a lot easier, but it was still quite tedious, and I never really made more than 20 measurements from a single video clip.

A few days ago, Sparky, in the Blur Busters thread, suggested I use a photodetector on the Arduino to replace the video camera and thus automate the measurements. At first I was reluctant and doubted it would allow more precision than the high-speed cam, but I soon realized that due to a CRT's low persistence and the phosphor's fast rise time, it would actually work very well... so a few days of messing with electronics led to this thing:

current microsecond measuring setup:
last updated 2014 dec 9

hardware:



arduino code:
http://pastebin.com/DvgkM9Ab

(anyone good with electronics and/or arduino, please let me know if there's anything to be improved/fixed)

how it works:
When I press the button, the program begins taking measurements. It takes ~10 measurements a second.

Each measurement is done by staring at a dark part of the game, twitching the cursor so that the screen shows a bright part, waiting and measuring how long it takes for the photodiode to respond, and then twitching the cursor back.

There is a bit of randomness in the spacing between measurements so that they don't happen in sync with the framerate, which would likely give biased results that are consistently higher or lower than the average of several independent measurements.
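To illustrate why that randomization matters, here is a small Python simulation (my own sketch, not part of ewh's setup, with illustrative numbers): when the screen updates at a fixed 100fps, sampling at a fixed interval that is a multiple of the frame period always lands at the same phase of the frame and gives a biased lag estimate, while randomized spacing converges to the true average of half a frame.

```python
import random

FRAME = 1 / 100   # frame period at 100 fps: 10 ms
N = 10_000

def lag_until_next_frame(t):
    """Time from an input at t until the next frame boundary."""
    return FRAME - (t % FRAME)

# Fixed spacing, an exact multiple of the frame period: every
# measurement hits the same phase, so the average is biased.
fixed = [lag_until_next_frame(i * 3 * FRAME + 0.002) for i in range(N)]

# Randomized spacing: phases spread uniformly over the frame,
# so the average converges to FRAME / 2 = 5 ms.
random.seed(1)
t, randomized = 0.0, []
for _ in range(N):
    t += 3 * FRAME + random.uniform(0, FRAME)
    randomized.append(lag_until_next_frame(t))

print(sum(fixed) / N)       # ~0.008 s, stuck at one phase of the frame
print(sum(randomized) / N)  # ~0.005 s, the true average
```

The same reasoning applies regardless of the actual framerate; only the randomized sampler estimates the average lag correctly.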

Here is all data so far:

https://docs.google.com/spreadsheets/d/1cktehkalPAbJ5014jL-Vtg3YmGXmNTnAZy5CAQ-Zpkk/edit?pli=1#gid=0

preliminary conclusions:
default fps_max 100 adds between 0 and 10ms of input lag (expected behavior) over the lag with an uncapped framerate of ~2000fps
raw input doesn't affect cs 1.6

If you want me to test anything, let me know here...

Things I have available for testing:


lots of knowledge of physics
basic knowledge of electronics

a 1000fps camera (casio ex-zr700)
an arduino leonardo + various simple electronics for interfacing with photodiodes

a z87 computer, i7 4770k @ 4.5ghz
an x58 computer, i7 920 @ 3.6ghz
a gtx 970 (nvidia reference model from best buy)
a gtx 460 768mb (galaxy brand)
intel hd4000 lol
windows 7, 8.1, xp (if you want), linux (need to update 239562 things tho)

two crt monitors (sony cpd-g520p, sony gdm-fw900)
2 lcd monitors (old ****ty viewsonic tn from ~2009, asus vg248qe with gsync mod)

a laptop (thinkpad x220, 60hz ips screen)

a logitech g100s
a logitech g3
a ninox aurora (which will come. one day.)

I'm okay with buying more hardware as long as it's reasonable. e.g. I'm not going >$30 on things which I won't end up using for something else.

FAQ:

1. How much does the arduino setup cost?
~$25 arduino leonardo
~$10 breadboard + jumpers
~$5 for three types of resistors and one capacitor (which probably isn't necessary)
~$5 for some photodiode
~$5 for some single supply op amp and a switch/button.
total ~$50 USD

note: this setup doesn't work as well on LCDs due to their persistence. The photodiode, unless you set some ridiculous gain, will only respond once a significant portion of the screen is lit. Whereas on a CRT, due to the fast rise time of the phosphors, the photodiode gets a response as strong as when the entire screen is lit as soon as 5 or so rows are lit. So if you want to replicate my setup, you should get some cheap CRT.


2. How can you measure lag less than the refresh period? i.e. how do you measure < 5ms at 60hz???
CRTs are rolling-scan, which means that except during the vblank period (~10% of the refresh cycle), the screen is constantly being updated from top to bottom. Since the photodiode is placed so that it is sensitive to changes in any part of the screen, the input lag of any event is unrelated to the refresh rate, as long as the framebuffer update doesn't occur during the vblank interval.
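A rough back-of-the-envelope calculation shows why rolling scan gives such fine resolution (the row count and vblank fraction below are illustrative assumptions, not measurements of ewh's monitors): each scanline takes only microseconds to draw, so a photodiode that triggers after ~5 rows responds almost immediately once scanout reaches it.

```python
REFRESH_HZ = 60
VISIBLE_ROWS = 1200     # e.g. a 1600x1200 CRT mode (assumed)
VBLANK_FRAC = 0.10      # ~10% of the cycle is vertical blanking

period = 1 / REFRESH_HZ
active = period * (1 - VBLANK_FRAC)   # time spent scanning rows
row_time = active / VISIBLE_ROWS      # time to draw one row

def scanout_delay(rows):
    """Time from the start of scanout until `rows` rows are drawn."""
    return rows * row_time

print(scanout_delay(5) * 1e6, "us")   # 62.5 us for 5 rows
```

In other words, the photodiode's trigger point adds only tens of microseconds, which is why the refresh period itself is not the limiting factor.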


3. If the computer and arduino are perfect, with the only limitation being that the game runs at XYZ fps, how much input lag would be measured?

(frame rendering time) <= input lag <= (frame rendering time) + (time between frames)

picture explanation: http://i.imgur.com/9cSP1bM.png

(frame rendering time) is the time taken to run the cpu+gpu code and is equal to the inverse of the uncapped framerate.

(time between frames) is the actual time between the starts of consecutive frames and is equal to the inverse of the actual framerate.

So for instance, if my game runs at 2000fps uncapped but I cap it to 100fps, I expect, at the very best, to see a uniform distribution of input lag between 0.5ms and 10.5ms

If my game runs at 50fps uncapped and I don't cap it, I expect, at best, to see a uniform distribution of input lag between 20ms and 40ms.
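The two examples above can be checked with a tiny helper (my own sketch of the inequality in this question, not ewh's code):

```python
def lag_bounds_ms(uncapped_fps, capped_fps=None):
    """Best-case input lag range in ms, per the inequality above:
    render time <= lag <= render time + time between frames."""
    render = 1000 / uncapped_fps                   # frame rendering time
    between = 1000 / (capped_fps or uncapped_fps)  # time between frames
    return render, render + between

print(lag_bounds_ms(2000, 100))  # (0.5, 10.5) -> 0.5 ms to 10.5 ms
print(lag_bounds_ms(50))         # (20.0, 40.0) -> 20 ms to 40 ms
```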


4. How does mouse polling rate affect input lag?

still need to figure this out, but from first principles there must be at least between 0 and 1/(polling rate) of added input lag, since input can occur anywhere between polls. Depending on the mouse firmware, it could be more.
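Under that first-principles bound (ignoring any extra firmware delay), the lag added by polling alone works out like this:

```python
def polling_lag_ms(rate_hz):
    """Lag added purely by USB polling: an input lands uniformly
    between two polls, so it waits between 0 and 1/rate,
    averaging 1/(2*rate)."""
    worst = 1000.0 / rate_hz
    return worst / 2, worst

for rate in (125, 500, 1000):
    avg, worst = polling_lag_ms(rate)
    print(f"{rate} Hz: avg {avg} ms, worst {worst} ms")
# 125 Hz adds 4 ms on average; 1000 Hz adds 0.5 ms on average.
```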


5. How much input lag can I feel? what does 10ms of input lag feel like?
try for yourself with this: http://forums.blurbusters.com/viewtopic.php?f=10&t=1134

My personal threshold is around 10-16ms. One osu player managed to barely pass the test for 5ms!!!11!1!11


6. What amount of input lag is 100% guaranteed to be insignificant in that no human can feel it, and no one's performance will be affected at all?

This is not easy to measure, since amounts of lag that you can't feel in a blind test could still affect performance... but I believe it is between 1 and 5ms. I'd guess that anything less than 2ms is absolutely insignificant.

One thing I keep in mind is that in quite a few top-level cs:go LAN tournaments, the monitors used were Eizo FG2421s, which are documented to have ~10ms more input lag than other 120+Hz TN monitors such as the asus vg248qe, benq xl24**z/t, etc. And no one was suddenly unable to hit shots or anything. But they all played against each other on that monitor, so who knows tongue.gif

Another thing is that when tracking objects on a low persistence monitor, it is possible to detect (with your eyes) the difference between a setup with constant input lag and a setup with input lag that fluctuates by +/-1ms. see http://www.blurbusters.com/mouse-125hz-vs-500hz-vs-1000hz/ . But I don't know of any game/situation where this affects performance.

TODO, in order of priority:
measurements:
csgo measurements
correlating data with 1000fps video
nvidia driver versions
swapping graphics cards and/or computers
other games (quake live, reflex)
bios setting, hpet, raw input, whatever mythbusting

setup:
use a faster function than micros(), which only has 4µs resolution and takes ~3µs to run
wire cutters to trim resistors lol
make my own usb interface from the arduino pins...
bigger photodiode and faster op amp?
Edited by QLsya - 12/9/14 at 6:42am
post #2 of 76
https://docs.google.com/spreadsheets/d/1ckteh...1486951641 not working
give us full link asap
thx for the work

nvidia driver versions for your newer card should be #1 on the list: best test, most users affected
344.11 vs .16, .70, etc., 320.44
also
will be funny laugh to r0ach things
Edited by thizito - 12/9/14 at 5:52am
post #3 of 76
Thread Starter 
Quote:
Originally Posted by thizito View Post

https://docs.google.com/spreadsheets/d/1ckteh...1486951641 not working
give us full link asap
done.
post #4 of 76
ewh/flood here
Quote:
Originally Posted by thizito View Post


nvidia driver versions for your newer card, should be #1 on list, best test, most users affected
344.11 vs .16 .70 etc.. 320.44
also
will be funny laugh to r0ach things
as much as I want to laugh at some of the ridiculous claims (RAM latency lol), there is always the possibility that for his particular setup, there actually is a perceptible difference in input lag when switching a particular setting.

but still, if you just look through that optimization thread, the presence of confirmation bias and placebo-like effects is undeniable tongue.gif and even with evidence there's this lovely fact about human nature: http://arstechnica.com/science/2014/12/why-do-we-cling-to-beliefs-when-theyre-threatened-by-facts/
post #5 of 76
Thread Starter 
I think r0ach is a good guy, I always enjoy reading his posts. Some of his views seem far-fetched, but I don't know why people go on a hate campaign against the guy. A world without r0ach posts would be a sad world smile.gif
post #6 of 76
The first thing you would want to test is obviously the prerender setting of 3 vs 2 vs 1 (a reboot after changing this setting was required on Win7, not sure about 8) on a 60hz monitor, to see if your device even works properly at all, because we know what results should come from that. The real question is: since Windows isn't a real-time OS, can this test even be done accurately on a fresh Win 8.1 install that hasn't been stripped down?

What if the search indexer or some other crappy Windows routine is doing something in the background for one test but not the other? Many Windows services, like "Portable Device Enumerator", start on system boot and then turn off later. Others, like "Application Experience", seem to turn on and off at random. These services constantly going on and off will throw a wrench into the results.

As for other settings, I think we can all agree that settings like HPET on/off have a large effect on mouse movement, but can the difference actually be quantified with a simple stat like input lag? What if it only shows less than 1ms difference in mouse lag between HPET on and off? Are you going to claim that everyone who notices a huge difference with HPET on/off is crazy? Maybe the setting is not quantifiable by input lag, but I'd like to see HPET tested.

What if the BIOS changes IRQs, or alters USB or other system properties anytime you hit save in the BIOS even if you don't change any settings? How do you deal with things like that? You will probably need to run several control tests to see how/if things change if all you do is raise CPU voltage by 0.01v and hit save.

Also, I wouldn't be surprised if you can't get accurate results at all for any test unless you turn HPET off, then use a program like this to set timer resolution to 1.0 or 0.5:

http://www.lucashale.com/timer-resolution/

Kind of hard to measure the difference between HPET on and off if you need it off to get accurate results, right?

As for RAM timing settings: if you change a setting like RAM speed from 1600 to 1866, you're changing the memory divider from 100:100 to 100:133 on Ivy Bridge. Some things like that might not be easily quantifiable by input lag measurements. The divider running at a setting that's not 1:1 might cause something like more chop, which might make the cursor control differently with vsync off without being hugely noticeable in input lag measurements.

Here's a list of things I would like to see tested, but not until you can actually prove your methodology works by testing the prerender/flip queue setting as mentioned above. The one clown on ESReality who said he had undeniably accurate input lag measurements and claimed turning Nvidia scaling off increased input lag didn't even bother doing control experiments with the flip queue to determine whether his device even worked.

I would also do all testing with HPET off and 1.0 or 0.5 timer resolution as well as mentioned above. I think tests should be divided into two categories: 1) tests done where the BIOS is not entered or touched and only Windows settings are changed, and 2) Tests done where only BIOS settings are changed.

Windows-only things to test (first set HPET off and 1.0 or 0.5 timer resolution; preferably use Win 8.1 and don't enter the BIOS again till all tests are done):

1) Human interface device access service on/off

2) Windows Defender enabled/disabled (disable it by typing "windows defender" in search and, on the last tab of the program, unchecking "use this application")

3) Input lag with 0 apps open vs having Internet Explorer open vs Firefox vs Chromium (Chromium portable, not Chrome). You should also try with hardware acceleration disabled in each browser. I can easily tell IE has the least lag, Chromium 64 has the most, and Firefox is in the middle, if you want to have a browser open while gaming, but I wouldn't personally have any of them open myself.

4) Adobe flash installed vs uninstalled. Reboot after installing and uninstalling because it will cause other Windows Services to come on and off in the process of adding and removing it.

5) Print spooler enabled/disabled

6) Windows error reporting set on/off for all users, not just a single user account, click the all users tab

7) Superfetch service on/off

8) Nvidia virtual audio enabled/disabled in device manager

9) Nvidia HDMI audio enabled/disabled in device manager

10) Nvidia "Display - No Scaling" vs "GPU - No Scaling". You will need to reboot after changing the setting. You may even need to reinstall the driver after changing this setting because it can bug out mouse movement.

11) MSI mode vs line based interrupt mode (it's line based by default) for the Nvidia GPU

Only do the following two after the above 11:

12) See if input lag changes by unplugging and then replugging the mouse while the PC is on. Restart the PC, and see if it changes again.

13) Power down the PC, plug in a mouse like the G400, power up and test lag, then hit the DPI up or down switch and test again, to see how on-the-fly DPI changes affect mouse movement and whether it's quantifiable by lag.


BIOS-only changes - only test these after all of the above has been tested

I do not believe you will get easily reproducible results from any of this, because I believe the BIOS changes a lot even if you change 0 settings and hit save and exit:

1) HPET on/off

2) Memory strap 100:100 vs 133:100

3) PWM phase control set to auto vs all phases turned on (extreme)

4) Hyperthreading on vs off

5) USB 3 (xHCI) enabled vs disabled (don't install the USB 3 driver, just test with it on/off in BIOS)

6) PLL Overvoltage on vs off
Edited by r0ach - 12/9/14 at 8:32am
post #7 of 76
The r0ach has spoken. In for results
post #8 of 76
Subscribed, this is an amazing setup. Very clever.
post #9 of 76
I'm interested in seeing quantifiable results.
post #10 of 76
Quote:
Originally Posted by shatterboxd3 View Post

I'm interested in seeing quantifiable results.

That's the problem: every few months we see posts like this where some guy claims to have invented a new infallible method to test lag, and then they either disappear, never to be heard from again, or they post on ESReality saying something like "turning HPET off increases lag by 5 seconds, so lol@r0ach", which everyone knows isn't true. For some reason, the people who claim to have a perfect method of testing input lag never do what I suggested about testing prerender 3 vs 2 vs 1 to figure out whether their method even works.

For measuring anything that requires intricate CPU-side measurements, you're going to need HPET off and timer resolution at 0.5 or 1.0 as well.
Edited by r0ach - 12/9/14 at 10:36am