Those are some pretty amazing insights, Furuyá!
1. It's Razer's 3G sensor, though the firmware is an unknown factor.
2. They show how easily (at ~900 mm/s) even 16-bit interfaces* reach their limits at 500 Hz, exhibiting clipping as a consequence.
3. They show that 1000 Hz (at least on this sensor/firmware) tends to be more prone to jitter/regular skipping/inaccuracy in the millisecond range (500 Hz must be time-averaging that out).
4. Nice graphs
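The clipping in point 2 would show up in a log as deltas pinned at the interface maximum. A toy sketch of that saturation effect (assuming signed 8-bit deltas and made-up per-report values, not real data):

```python
# Toy illustration: clipping a true per-report delta to a signed N-bit range,
# the way a saturating mouse interface would (hypothetical values).
def clip_delta(delta, bits=8):
    lo, hi = -2 ** (bits - 1), 2 ** (bits - 1) - 1
    return max(lo, min(hi, delta))

true_deltas = [50, 90, 113, 140, 160, 120, 80]   # counts per report
reported = [clip_delta(d) for d in true_deltas]
print(reported)  # deltas above 127 saturate at 127
```

In a speed-vs-time graph this saturation appears as a flat ceiling, which is exactly the clipping signature the graphs show.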
*AFAIK mice nowadays have a 16-bit interface, at least mouse-to-PC (USB), which fits the calculation here:
log(897/25.4*1600)/log(2) = 15.786 (a bit odd, since having two directions would need another sign bit ^^)
However, Bullveyr has mentioned before that most sensors employ an 8-bit interface, I guess mouse-to-firmware. I don't know the details, but common sense suggests that the sensor then needs to be polled by the firmware/MCU at at least twice the USB polling rate, right? Or does it just poll at the specified frame rate? Any insights on that are welcome.
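The numbers above can be sanity-checked quickly. A minimal sketch, assuming signed two's-complement deltas, 1600 CPI, and the speeds/rates discussed here (all parameter values are illustrative, not measurements):

```python
# Per-report delta arithmetic for a mouse interface (assumed signed deltas).
MM_PER_INCH = 25.4

def counts_per_report(speed_mm_s, cpi, poll_hz):
    """How many counts the interface must carry per poll at a given speed."""
    return speed_mm_s / MM_PER_INCH * cpi / poll_hz

def max_speed_mm_s(bits, cpi, poll_hz):
    """Highest speed a signed N-bit per-report delta can carry unclipped."""
    max_delta = 2 ** (bits - 1) - 1   # 127 for 8 bit, 32767 for 16 bit
    return max_delta * poll_hz / cpi * MM_PER_INCH

print(counts_per_report(900, 1600, 500))   # ~113 counts per report
print(max_speed_mm_s(8, 1600, 500))        # ~1008 mm/s
print(max_speed_mm_s(8, 1600, 1000))       # ~2016 mm/s
```

Note how an 8-bit delta at 500 Hz runs out of headroom right around 1 m/s, while doubling the polling rate doubles the unclipped speed; that is one reason polling the sensor faster than the USB rate would help the MCU.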
A script is great as well! That kinda motivates me to publish my Octave script.
I am glad that Furuyá didn't let himself get scared away. After all, misunderstandings are bound to happen when our common base, the English language, is not everybody's first (not mine either).
The logger method is easy and available, but it has to be kept in mind (as outerspace himself mentioned) that it only reports what the mouse/sensor "sees", which is only indirectly correlated to reality, i.e. physical movement.
So despite their can't-be-easily-tried-at-home nature, the ESR results and the like should be expected to be more accurate as to a mouse's ability to digitize physical movement, but at the same time they give less detailed insight into the implementation (e.g. polling-rate fluctuations or similar).
I just hope, and think, that the "physical" and the "digital" approaches deliver close enough results that distinguishing them via comments on the cells of Skylit's sheet is sufficient.