Originally Posted by Timecard
Isn't DPC the software layer abstraction for queued interrupt requests?
And what about it? All I know is this: the CPU gets interrupts from hardware and then schedules a DPC call. The higher the DPC latency, the longer it takes the processor to handle interrupts. You get a frame every 16.67 ms, or 6.9 ms, etc. But what if an interrupt is handled right after a frame is finished? Then that frame doesn't render the last mouse position, and you have to wait for the next frame to see the updated position.

Say you have 500 Hz polling, so the mouse captures its position every 2 ms, e.g. 2 ms before the GPU finishes rendering a frame (I doubt the mouse position can still be updated once the frame is sent to the monitor from the GPU). On top of that, with high DPC latency it can take another ~100 µs even on good motherboards to handle the USB interrupt. Strictly speaking USB doesn't send interrupts; the host polls the device 500/1000 times per second, but the CPU still schedules DPC calls for the USB driver. So if the CPU only handles that interrupt ~100 µs after a frame has finished rendering and been sent to the monitor, the GPU rendered a mouse position from up to 4 ms ago, because the new position wasn't delivered in time. The CPU prepares each frame for the GPU before the GPU starts to render it; I don't know if the CPU can still update the mouse position once the GPU has the frame, but after the GPU sends it to the monitor, it's doubtful. So over one second of time, the rendered mouse position can lag like this up to 144 times on a 144 Hz monitor, and small input lag adds up!
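To put rough numbers on that chain, here's a back-of-the-envelope model (my own simplification with illustrative numbers, not a measured pipeline): in the worst case the rendered position is stale by up to one polling interval, plus the DPC/interrupt handling time, plus a whole extra frame while you wait for the next update.

```c
/* Rough worst-case model of how stale a rendered mouse sample can get.
 * All numbers are illustrative assumptions, not measurements. */
#include <stdio.h>

int main(void) {
    double poll_interval_ms = 1000.0 / 500.0;  /* 500 Hz polling -> 2 ms between samples */
    double dpc_latency_ms   = 0.1;             /* ~100 us DPC/interrupt handling */
    double frame_time_ms    = 1000.0 / 144.0;  /* 144 Hz -> ~6.94 ms per frame */

    /* Worst case: the fresh sample just missed the frame the CPU was
     * preparing, so it sits out a full polling interval + DPC delay
     * + one whole frame before it shows up on screen. */
    double worst_case_ms = poll_interval_ms + dpc_latency_ms + frame_time_ms;

    printf("poll interval : %.2f ms\n", poll_interval_ms);
    printf("frame time    : %.2f ms\n", frame_time_ms);
    printf("worst-case age of rendered mouse position: %.2f ms\n", worst_case_ms);
    return 0;
}
```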
It doesn't affect you that much, because your reaction time is around 200 ms on average, and if ~2.6 ms is added on top of the overall input lag, it won't be a major difference. But you can feel it in overall mouse movement, especially when you make small adjustments: the movement is cleaner. Movement also becomes more stable than with high DPC latency. Otherwise you get what's called micro stuttering!
That's why there is a gigantic difference between setting the timer resolution to 0.5 ms versus 1 ms. It lets the OS wake threads and hand work to the CPU on a finer interval. Yes, it's only a 500 µs difference, but it means work can reach the CPU every 500 µs instead of every 1 ms (or some longer interval), and that scales over a second of time when you're running at 144 fps. You can try it yourself, there is a gigantic difference! The same goes for 500 Hz versus 1000 Hz polling: if the rendered mouse position comes from a sample 4 ms old, the mouse will feel laggier than if it used the latest position from 2 ms ago. So even with 100 µs DPC latency at 144 fps, there are 144 chances per second for the cursor to be rendered from a stale position, depending on how the polling lines up and how long the DPC takes before the interrupt is handled.
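For reference, 0.5 ms can't be requested through the documented timeBeginPeriod call, which only takes whole milliseconds; the 0.5 ms timer tools use the undocumented NtSetTimerResolution from ntdll, which works in 100-ns units. A minimal sketch of what they do (undocumented API, so behavior can change between Windows versions, use at your own risk):

```c
/* Minimal sketch: requesting a 0.5 ms system timer resolution on Windows
 * via the undocumented NtSetTimerResolution (timeBeginPeriod only accepts
 * whole milliseconds). Undocumented API -- may change between versions. */
#include <windows.h>
#include <stdio.h>

typedef LONG NTSTATUS;
typedef NTSTATUS (NTAPI *NtSetTimerResolution_t)(ULONG DesiredResolution,
                                                 BOOLEAN SetResolution,
                                                 PULONG CurrentResolution);

int main(void) {
    HMODULE ntdll = GetModuleHandleW(L"ntdll.dll");
    if (!ntdll) return 1;

    NtSetTimerResolution_t NtSetTimerResolution =
        (NtSetTimerResolution_t)GetProcAddress(ntdll, "NtSetTimerResolution");
    if (!NtSetTimerResolution) return 1;

    ULONG current = 0;
    /* Resolution is in 100-ns units: 5000 * 100 ns = 0.5 ms. */
    NtSetTimerResolution(5000, TRUE, &current);
    printf("timer resolution now: %.4f ms\n", current / 10000.0);

    /* The raised resolution holds until released or the process exits. */
    Sleep(10000);
    NtSetTimerResolution(5000, FALSE, &current);
    return 0;
}
```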
I am not a hardware expert, but if there were no difference, I couldn't feel anything. Try timer resolution 500 µs vs 1 ms; the difference is so gigantic that probably anyone can tell. A lot of people use it. You can also try 15.6 ms vs 0.5 ms, and you have to disable the dynamic tick (bcdedit /set disabledynamictick yes) so the timer resolution stays constant. It also increases CPU load slightly. It was tested in Crysis and added fps.
Also about stability: 0.5 ms feels more unstable than 1 ms. It is hard to describe, but 0.5 ms is more inconsistent than 1 ms, especially with 500 Hz polling; with 1000 Hz polling it is better. That's the reason I was always using the 1 ms timer resolution, it feels best! And 500 Hz also feels better to me than 1000 Hz, because 1000 Hz captures even the smallest movements. So 1000 Hz is more accurate, but less consistent!
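One way to look at the 500 Hz vs 1000 Hz part with numbers (a toy simulation under my own assumptions; the "feel" is subjective on top of this): for each 144 Hz frame start, compute how old the newest mouse sample is. 1000 Hz keeps the age in a 0-1 ms band, 500 Hz in a 0-2 ms band, so the sample age both averages higher and swings wider at 500 Hz.

```c
/* Toy simulation (my own model, not a measurement): for each frame of a
 * 144 Hz stream, compute how old the newest mouse sample is at frame
 * start, for 500 Hz vs 1000 Hz polling. */
#include <stdio.h>
#include <math.h>

static void simulate(double poll_hz, double frame_hz, int frames) {
    double poll_ms  = 1000.0 / poll_hz;
    double frame_ms = 1000.0 / frame_hz;
    double min_age = 1e9, max_age = 0.0, sum = 0.0;

    for (int i = 0; i < frames; i++) {
        double t = i * frame_ms;                         /* frame start time */
        double last_sample = floor(t / poll_ms) * poll_ms; /* newest poll at or before t */
        double age = t - last_sample;                    /* age of that sample */
        if (age < min_age) min_age = age;
        if (age > max_age) max_age = age;
        sum += age;
    }
    printf("%6.0f Hz polling: sample age %.2f-%.2f ms, avg %.2f ms\n",
           poll_hz, min_age, max_age, sum / frames);
}

int main(void) {
    simulate(500.0, 144.0, 1440);   /* ~10 seconds of 144 Hz frames */
    simulate(1000.0, 144.0, 1440);
    return 0;
}
```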