Overclock.net › Forums › Intel › Intel Motherboards › Gaming and mouse response BIOS optimization guide for modern PC hardware

Gaming and mouse response BIOS optimization guide for modern PC hardware - Page 2

post #11 of 3207
Wow, I didn't know Sandy Bridge doesn't support hyper-threading. Why would Intel do that? I do have an HT CPU though, and after disabling core parking I noticed that interrupt-to-process latency spikes no longer occur while I'm playing Dota 2 or while it's minimized to the taskbar. When core parking was enabled I had 1.5 ms spikes; now it's 0.25 ms.
post #12 of 3207
Thread Starter 
updated original post:

If you own an LCD with no built-in scaler (like the Korean IPS ones) and use an older Nvidia driver such as 267.59 to avoid the huge input lag from the newer drivers forcing you to use "GPU - No Scaling" instead of "Display - No Scaling" (yes, the lag is there at native resolution without any scaling), do not install Internet Explorer 10 or the optional Windows update KB2670838 ( http://support.microsoft.com/kb/2670838 ). IE 10 installs that update automatically, and if you use older Nvidia or ATI drivers (mostly pre-Windows 8 era ones) with that update, you get blue-screen page faults like the picture below:

post #14 of 3207
Thread Starter 
Quote:
Originally Posted by Tazzzz View Post

What do you think about upcoming g sync, roach http://www.geforce.com/whats-new/articles/introducing-nvidia-g-sync-revolutionary-ultra-smooth-stutter-free-gaming

It infuriates me. They completely ignore their increasing driver input lag, then release a proprietary add-on solution, similar to PhysX, to fragment the market; nobody thinks it's a good idea except the people who also thought PhysX was a good idea.

There's no real explanation for how it magically removes the giant amount of input lag their drivers have accumulated. They probably won't acknowledge they have any at all, and will just claim they're removing scaler lag or something.

Fixing the vsync chop/stutter sounds doable with the hardware solution they're talking about, but I'd like to see a lot more info on just how they're attempting to deal with input lag. Not fixing input lag for customers without a "g-sync monitor" is also a huge slap in the face.
post #15 of 3207
It's probably going to increase input lag, since it needs time for synchronization (though not like vsync) and lowers the monitor's refresh rate when fps drops, instead of throwing frames in as fast as the GPU and monitor can display them.
lol at "fixing stuttering"; here's Serious Sam 3 dev AlenL on stuttering:
Quote:
The issue (or absence thereof) that you are noticing, Vicarious, is most likely what is deemed "microstuttering". This is something that is caused by overzealous optimizations on the GPU driver side, and the whole affair started somewhere around ten years ago, IIRC.

This problem has been apparent for nearly a decade now, but it was only earlier this year that we realized the actual cause: in many games, it is not that the framerate is bad, but that the game engine doesn't get proper feedback from the driver about what that framerate actually is. So, e.g., you can have a perfect 60 fps (with vsync on!), but the game gets feedback that frames are rendered at 100 fps and 45 fps alternatingly, or at 65 fps for ten frames and then one at 30, or any weird combination like that (I'm mostly making up these numbers; we've seen all kinds - anything is possible).

In all those cases, the average FPS is 60, but individual frames are timed wrong. Since the game has to prepare animations/physics etc. for each successive frame based on how long the last frame lasted, having this reported wrong is what causes the animation to look jerky, even though it is rendering smoothly. It's kind of a catch-22: the animation is prepared for a jerky framerate, and if the framerate really _was_ jerky, the animation would actually look fairly good. But since the framerate is perfect, the animation looks jerky.

We've been nagging IHVs about this, but the solution is not so simple.

In any case, earlier this year we patched SS3 to include a workaround (actually more of a hack) that tries to detect and alleviate such driver behavior. It is not perfect, but it helps a lot of people. Like you, I suppose.
from
http://steamcommunity.com/app/41070/discussions/0/846965882771158437/
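AlenL's description can be sketched in a few lines of Python. This is a toy simulation with invented numbers (like the ones in his quote), not anything measured from a real driver: real frame times are perfectly steady, but the game steps its animation by the driver's mis-reported times, so the motion jitters against real time.

```python
# Toy simulation of the microstutter AlenL describes: every frame
# actually takes exactly 1/60 s, but the driver reports alternating
# wrong frame times to the game. The game advances its animation by
# the *reported* dt, so motion jitters even though rendering is smooth.
REAL_DT = 1.0 / 60.0

# Reported times alternate 10 ms / 23.3 ms ("100 fps" / "43 fps"),
# chosen to average exactly 60 fps so only jitter remains, no drift.
reported = [0.6 * REAL_DT, 1.4 * REAL_DT] * 30

real_time = 0.0
anim_pos = 0.0          # position of an object moving at 1 unit/s
worst_error = 0.0
for dt in reported:
    real_time += REAL_DT
    anim_pos += dt      # game steps animation by the reported dt
    worst_error = max(worst_error, abs(anim_pos - real_time))

print(f"avg reported fps: {len(reported) / sum(reported):.1f}")        # 60.0
print(f"worst per-frame animation error: {worst_error * 1000:.2f} ms")  # 6.67 ms
```

The average is a perfect 60 fps, yet each "fast" frame puts the animation 6.67 ms of motion ahead of where it is actually shown, which is exactly the jerkiness he describes.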
post #16 of 3207
Quote:
Originally Posted by r0ach View Post

It infuriates me. They completely ignore their increasing driver input lag, then release a proprietary add-on solution, similar to PhysX, to fragment the market; nobody thinks it's a good idea except the people who also thought PhysX was a good idea.

There's no real explanation for how it magically removes the giant amount of input lag their drivers have accumulated. They probably won't acknowledge they have any at all, and will just claim they're removing scaler lag or something.

Fixing the vsync chop/stutter sounds doable with the hardware solution they're talking about, but I'd like to see a lot more info on just how they're attempting to deal with input lag. Not fixing input lag for customers without a "g-sync monitor" is also a huge slap in the face.

Yeah, your optimized PC will have less input lag than their g-sync one for sure, because they're only dealing with the part of input lag that comes from fps.

People will probably start asking why they have to play at 30 fps after paying thousands of dollars for their "4K" rig.

I assume that, theoretically, they can make 144 fps on a 144 Hz panel perform better lag-wise than 300 or 500 fps on the same 144 Hz panel without vsync. They just need their g-sync chip to add a very small amount of its own latency, and I wonder whether DisplayPort has the same latency as dual-link DVI or not. At that point it could be very tempting, since you wouldn't need more than 144 fps at all; that would mean less power consumption, heat and noise in games that used to pull more than 144 fps.

I don't know what to buy now, as I wanted to get a 290X and a new monitor for competitive CS:GO, but it seems to run hot as hell and I really don't like that. I can't do your driver manipulation either, because CS:GO needs a Kepler driver to run well, and that's the main game I play.
Hell, these features really shouldn't be proprietary.
Edited by Tazzzz - 10/19/13 at 5:58am
post #17 of 3207
Quote:
Originally Posted by r0ach View Post

it's taken me a loooooooong time to track down the effects each one has on the actual user experience and not just benchmark scores
So you admit that you gauge effects of these tweaks "by feel"?
post #18 of 3207
Why force vertical sync off and not let the application decide?
Also, why turn threaded optimization off?
post #19 of 3207
Quote:
Originally Posted by Glymbol View Post

Quote:
Originally Posted by r0ach View Post

it's taken me a loooooooong time to track down the effects each one has on the actual user experience and not just benchmark scores
So you admit that you gauge effects of these tweaks "by feel"?

There's really no other method. You have to rely on people whose brains work better than yours. I can feel a faint difference between driver versions, but I'm never quite sure whether I'm imagining things or not. Sometimes, though, there's a massive difference, and when that happens I feel there's really no arguing.

For example, I'm currently sitting at a Linux desktop and the mouse feels a lot better. I'm pretty convinced there's a difference. Is that real? How would I prove this?

The best method I can think of: first I would need a video camera that can do a lot more than the normal 24 or 60 fps. I would have to set up a shot that shows both my hand moving and the screen. I can't do that: I don't have the hardware, I don't know if that kind of camera even exists at a normal price, and it would also be a good amount of work.
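The camera approach described above comes down to simple frame counting: film the hand and the screen together, count camera frames between the physical input and the on-screen reaction, and divide by the camera's frame rate. A minimal sketch of that arithmetic (the frame counts and camera speeds below are made-up example numbers):

```python
def lag_ms(frames_between, camera_fps):
    """Latency implied by counting high-speed-camera frames between a
    physical input (hand/button moves) and the on-screen reaction."""
    return 1000.0 * frames_between / camera_fps

# e.g. 12 camera frames at 240 fps between click and screen change:
print(f"measured lag: {lag_ms(12, 240):.1f} ms")        # 50.0 ms
# measurement resolution is one camera frame:
print(f"resolution: +/- {1000.0 / 240:.2f} ms")         # 4.17 ms
```

This also shows why an ordinary 60 fps camera is marginal for the job: its resolution is a whole 16.7 ms per frame, which is larger than many of the differences being argued about.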
post #20 of 3207
Yes, it's nothing we can prove without special cameras and technical knowledge.
I first tried HPET a long time ago, having no idea what it would improve, and noticed a huge response difference right when I booted to the desktop. Same thing with C1E and SpeedStep when I bought a new, supposedly faster but very delayed PC. That was before I had ever seen these "placebo, no technical explanation / high-speed-cam proven" fixes by r0ach.
Some things do have no or an imperceptible effect and bring no input lag reduction. I can't force myself into a placebo effect, or into feeling more responsive input with 125 Hz + raw input like someone said it gives.

The majority of gamers play with very high and constantly changing sensitivity (most can't even turn 180 degrees in one swift motion), play poor console ports without 1:1 movement and games with major input lag problems, find nothing strange about it, and think "that's how it's supposed to be". Some may even accept and believe in insulting "weighted weapon movement".

Same thing when a Source engine server has constant drops to 5-10 fps (or stupid 30000-rate servers with constant choke for everyone): 95% of people just say "no lag, must be ur pc/connection lol", even though there's a readout that shows the server's fps. Or they don't notice the extremely desynced melee animations in some Source engine games.
Or the LCD vs CRT war: people denying and defending with "no lag for me" and "human reaction is 200ms" BS to the bitter end.
They're also clueless about a lot of things. It was the same for me when I first started gaming years ago: I just didn't know what was happening or how things were supposed to run and respond, but I didn't have a good enough internet connection to write "placebo" on forums.