@ Skylit: I'm not saying that the higher the CPI the better, nor do I necessarily want to know what I should be using; I was pointing out exactly what you did: it doesn't really matter much as long as your settings aren't "objectively" out of the norm. So, since high CPI actually isn't as bad as people make it out to be, and actually offers more options to make your sensitivity fit your preference, whatever that might be, I was wondering why sponsors don't encourage a change of view in this field. How the polling rate affects the actual translation of the sensor's CPI to the desktop is interesting, but it doesn't really explain why, 90% of the time, people prefer low over high CPI...
If the native CPI of modern mice were still 400, that would of course be a sound reason. But is that actually true? What is the "average" native CPI of mice nowadays? I thought it was 800-1000 in most cases.
@ test user: I said that in most cases this wouldn't be a problem. I just tied that statement to the fact that with lower CPI, most people are going to use higher in-game sensitivities, and higher in-game sensitivities can lead to "pixel skipping". And a reason for not "just using 400cpi", besides the "native CPI" argument you also mentioned, is that, like I mentioned, higher CPI can feel different (smoother maybe; however different it may feel, people could potentially prefer it) while not changing the real sensitivity.
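The "pixel skipping" point can be sketched numerically. This is a minimal illustration, assuming the in-game sensitivity acts as a simple multiplier on raw sensor counts rounded to whole pixels (real game engines may accumulate fractional remainders instead, so treat this as a toy model):

```python
def reachable_pixels(counts, sens):
    """Set of horizontal pixel offsets reachable by 0..counts mouse counts,
    assuming each count is scaled by the sensitivity and rounded to a pixel."""
    return {round(c * sens) for c in range(counts + 1)}

# Same effective sensitivity (same physical distance covered), two setups:
low_cpi_high_sens = reachable_pixels(100, 2.5)   # e.g. 400cpi with sens 2.5
high_cpi_low_sens = reachable_pixels(250, 1.0)   # e.g. 1000cpi with sens 1.0

# The low-CPI/high-sens setup jumps in ~2.5px steps, so some pixels
# (like offset 1) can never be landed on; the high-CPI setup hits every pixel.
print(sorted(low_cpi_high_sens)[:5])
print(sorted(high_cpi_low_sens)[:5])
```

Under this toy model, the first setup skips pixel offsets such as 1 entirely, while the second can reach every pixel, which is exactly why sensitivities above 1 are usually what people mean by "pixel skipping".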
Like I said, old-school players might be used to 400cpi mice, but that doesn't explain why generations of newer players keep using the same settings.
not to mention that many mice tend to have lower malfunction speeds and/or worse cursor quality with high cpi
That's something. Does this apply to a difference of 400 vs. 800 - 1600cpi though?
and it's not like people buy specific hardware today because players are using it, or maybe i just haven't seen someone that stupid yet.
It's actually very common (in the CS community at least) for players to base their settings on what the professionals use. Not necessarily the exact gear (though this isn't as uncommon as you might think either), but if one of their "idols" weren't using the standard 400cpi, a fan wouldn't just go ahead and get a non-customizable 10-year-old mouse; they would have to consider buying more recent models which support higher CPI, and maybe also use the manufacturer's software/drivers.
Again, I'm not talking about what is best to use or what should be used. It's just that the vast majority of professionals (not only in gaming, but in GFX work too, for instance) using low CPI made me think there was something objectively superior about it. Now that I know there are actually no real disadvantages to the opposite, and that some people might even prefer the "feel" of higher CPI while maintaining the real sensitivity they are used to, I am rather baffled that there has never been a change in how this is viewed, especially with manufacturers pushing the resolution limits of their sensors and sponsors presumably taking an interest in making the adjustability of their mice's resolution an actual viable aspect of marketing their products (whereas the general opinion now seems to be: higher CPI is completely useless, and sponsors are just luring people who like big numbers into buying their products). As I said, if what this article suggests is true, that would mean: more possibilities for setting up your CPI = more ways of adjusting your sensitivity to fit your preference.
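To make the "more CPI options = more ways to keep the same real sensitivity" point concrete: if we take eDPI (CPI × in-game sensitivity, a common community convention, assumed here as the measure of "real" sensitivity), every CPI step the sensor supports yields another equivalent pair:

```python
# Assumption: "real" sensitivity is eDPI = CPI * in-game sensitivity.
TARGET_EDPI = 1000  # e.g. the classic 400cpi at in-game sens 2.5

# Each CPI the sensor supports gives another pair with identical eDPI.
pairs = {cpi: TARGET_EDPI / cpi for cpi in (400, 800, 1600, 3200)}
for cpi, sens in pairs.items():
    print(f"{cpi}cpi @ sens {sens} -> eDPI {cpi * sens:.0f}")
```

So a player attached to their "real" sensitivity can still experiment with how different CPI steps feel, which is the flexibility argument the article makes.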