Originally Posted by metal571
I got a lot of flak in another thread for saying to use whatever CPI you want. What you should do is use the lowest CPI you can get away with, because the sensor is "most accurate" that way. This is coming from some of the biggest experts left on this forum, so I trust them. I play at 400 CPI now at 70 cm/360. If 400 isn't enough and the crosshair moves more than one pixel per count when you move the mouse very slowly, try 800. Some people say 1600 is "native", but really all steps on the mouse are native up to 5000.
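Roughly what that one-pixel-per-count check boils down to, as a quick sketch (the FOV, resolution and projection figures are my own example assumptions, not metal571's):

```python
import math

# Numbers below are example assumptions of mine, except the 400 CPI / 70 cm/360
# that metal571 mentions.
cpi = 400             # counts per inch
cm_per_360 = 70.0     # physical distance for a full in-game turn
hfov_deg = 103.0      # assumed horizontal FOV of the game
screen_width = 1920   # assumed horizontal resolution in pixels

counts_per_360 = (cm_per_360 / 2.54) * cpi
deg_per_count = 360.0 / counts_per_360

# Angular width of one pixel at screen centre for a standard perspective projection
deg_per_center_pixel = math.degrees(2 * math.tan(math.radians(hfov_deg) / 2) / screen_width)

print(f"{deg_per_count:.4f} deg/count vs {deg_per_center_pixel:.4f} deg/pixel at centre")
if deg_per_count > deg_per_center_pixel:
    print("Each count skips past a whole pixel even at crawl speed -> try the next CPI step up")
else:
    print("No pixel skipping at slow speeds with this CPI")
```

With these particular assumptions, 400 CPI at 70 cm/360 comes out well under one pixel per count, which is consistent with him staying at 400.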
Originally Posted by Necroblob
Out of interest, why did people say low DPI was more accurate? There are a couple of specific situations in which a lower DPI is better: (a) the game doesn't have raw input, so a high DPI can lead to negative acceleration; (b) the mouse in question performs better on its lower DPI steps. However, I didn't think it was a general rule?
Up to and including the A3090, all optical sensors had only one or two native CPI steps; the rest were obtained by interpolating or discarding counts, which would usually produce artifacts like jitter, pixel walk or angle snapping, or result in a lower malfunction speed.
Thus, the settings that yielded the best accuracy and consistency used to be the lowest native CPI steps, which also provided extra flexibility, since they left room for a number of different in-game sensitivities.
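To make the interpolation point concrete, here is a toy sketch (purely my own simplification of the idea, not how any particular sensor firmware actually does it): deriving a non-native step by rescaling native counts means identical slow movements don't always report the same number of counts, which is what jitter and pixel walk feel like.

```python
# Toy illustration (my own simplification, not any real firmware): derive a
# non-native 450 CPI step from a 400 CPI native resolution by rescaling counts.
native_cpi = 400
interp_cpi = 450
scale = interp_cpi / native_cpi   # 1.125 counts out per count in

carry = 0.0
out = []
for _ in range(16):               # 16 identical one-count movements at the sensor
    carry += scale
    emitted = int(carry)          # only whole counts can be reported to the host
    carry -= emitted
    out.append(emitted)

print(out)  # [1, 1, 1, 1, 1, 1, 1, 2, 1, 1, ...] -- identical input, uneven output
```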
With the newest batch of sensors, the rule of thumb would instead be to set the CPI higher (but still low enough that it doesn't cause any kind of tracking error), because the correspondingly lower in-game sensitivity gives you more angular granularity.
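Using the same counts-per-360 arithmetic as above, a quick sketch of what "more angular granularity" means when cm/360 is held fixed (the 50 cm figure is an arbitrary example of mine):

```python
# cm/360 held fixed while stepping through CPI values; in-game sensitivity is
# assumed to be lowered proportionally at each step so cm/360 stays the same.
cm_per_360 = 50.0

for cpi in (400, 800, 1600, 3200):
    counts_per_360 = (cm_per_360 / 2.54) * cpi
    deg_per_count = 360.0 / counts_per_360
    print(f"{cpi:>5} CPI -> {deg_per_count:.4f} degrees per count")
# Raising CPI while lowering in-game sensitivity to keep the same cm/360
# makes each count a smaller angular step, i.e. finer aim granularity.
```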
However, whether that is good practice or not depends (a lot) on the implementation of the sensor.
A sensor that trades image detail for more frame captures at higher CPI might feel more responsive but less detailed there, while at low CPI it might feel more nervous, itchy and noisy yet follow your movements more accurately (if less responsively), which is, I guess, what Falkentyne described in the G502 thread.