Originally Posted by CPate
There is a correct way, people just keep ignoring me.
I'm not ignoring your posts at all. As a matter of fact, your two comments in this thread exactly concur with my understanding:
"Ultimately the goal is to improve accuracy, and you do that best by using the purest data possible. Without throwing away information in software."
"Cranking the DPI to max and turning down your sensitivity is much less accurate than leaving your sensitivity at default and adjusting DPI to suit your play style."
After all, this is what I said in the other thread:
"...if in-game sensitivity is changed from neutral, I think it means the raw data from the mouse is altered (amplified/de-amplified) by software. This sounds like it defeats the purpose of the raw input feature. Then, perhaps, it is best to leave in-game sensitivity alone at default and adjust mouse DPI/CPI values accordingly."
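To make it concrete why scaling in software can throw away information, here is a toy sketch. This is my own illustration, not any game's actual code, and it assumes per-update rounding with no sub-pixel accumulation (a real engine may well accumulate fractional remainders):

```python
# Toy model: a software sensitivity multiplier scales each raw mouse
# count, and the result must be rounded to whole units before use.

def scale_counts(raw_counts, sensitivity):
    """Apply a software sensitivity multiplier to raw mouse counts,
    rounding each scaled count to the nearest whole unit."""
    return [round(c * sensitivity) for c in raw_counts]

raw = [1, 1, 1, 1]  # four single-count movements from the sensor

# At neutral sensitivity (1.0) the data passes through unchanged:
print(scale_counts(raw, 1.0))   # [1, 1, 1, 1]

# At 0.4 every single count rounds to 0 -- the movement is discarded
# entirely unless the software accumulates sub-unit remainders:
print(scale_counts(raw, 0.4))   # [0, 0, 0, 0]
```

The point is simply that a non-integer multiplier forces rounding somewhere, whereas adjusting CPI at the sensor keeps the counts themselves pure.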
Now, here is the problem with this approach. By "this approach" I mean:
- Windows sensitivity at default 6/11
- Enhanced pointer precision off (unchecked)
- in-game mouse raw input enabled
- in-game sensitivity at default (untouched)
- mouse at native CPI setting (no interpolation)
In the case of my Sensei (Avago 9500 sensor), the sensor supports a wide range of "native" CPI values, adjustable in steps of 90.
For example, Counter-Strike: Global Offensive has a default sensitivity of 6.0.
If I leave the in-game sensitivity untouched and adjust CPI accordingly, the CPI setting on my Sensei would end up around 200 CPI (180 or 270 CPI to be exact, given the steps of 90) for CS:GO. Doesn't playing at 180 or 270 CPI sound weird to you?
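Here is the back-of-the-envelope arithmetic behind those numbers. I'm assuming the usual "effective sensitivity" (eDPI) definition of CPI times in-game sensitivity, and a hypothetical preferred eDPI of 1200, purely for illustration:

```python
# Back-of-the-envelope: what CPI do I need if sensitivity stays at default?
# Assumption: eDPI = CPI * in-game sensitivity (the common convention),
# and a hypothetical preferred feel of 1200 eDPI.

DEFAULT_SENS = 6.0   # CS:GO default in-game sensitivity
NATIVE_STEP = 90     # Sensei (Avago 9500) native CPI step size

target_edpi = 1200                       # hypothetical preferred feel
ideal_cpi = target_edpi / DEFAULT_SENS   # 200.0 -- not a native step

# Snap to the nearest native CPI steps below and above:
lower = (ideal_cpi // NATIVE_STEP) * NATIVE_STEP   # 180.0
upper = lower + NATIVE_STEP                        # 270.0
print(ideal_cpi, lower, upper)   # 200.0 180.0 270.0
```

So the only native choices bracketing the ideal 200 CPI are 180 and 270, which is exactly why the approach feels awkward at CS:GO's default sensitivity of 6.0.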