1 - 14 of 14 Posts

killerfromsky · Registered · 1,350 Posts
Discussion starter · #1 ·
Hey,
I've got a weird request/question.

I want to show my girlfriend how the input/display lag of a TV/monitor can ruin the gaming experience, and why we shouldn't cheap out on our next television, which will also be used as a primary monitor.

Does anyone have any idea how I could lag it out, or simulate a 60 ms lag?
Or how I could use AHK (AutoHotkey) or some similar program to delay every single input/action? The same goes for mouse movements.

Kind regards,
Thomas
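(Later edit: to illustrate the idea for anyone landing here, the "delay every input" approach boils down to buffering each event and only releasing it after a fixed delay. This is a hypothetical Python sketch of just that buffering logic, not a working system-wide input hook; the `LagBuffer` name and the event strings are made up for illustration.)

```python
import time
from collections import deque

class LagBuffer:
    """Buffers events and releases them only after a fixed delay,
    mimicking the feel of display/input lag."""
    def __init__(self, delay_s=0.060):  # 60 ms, the lag we want to fake
        self.delay_s = delay_s
        self.queue = deque()  # (release_time, event) pairs, in arrival order

    def push(self, event):
        # Stamp each incoming event with the earliest time it may be released.
        self.queue.append((time.monotonic() + self.delay_s, event))

    def pop_ready(self):
        # Release every event whose delay has elapsed, preserving order.
        ready = []
        now = time.monotonic()
        while self.queue and self.queue[0][0] <= now:
            ready.append(self.queue.popleft()[1])
        return ready

buf = LagBuffer(delay_s=0.060)
buf.push("mouse_move(10, 0)")
print(buf.pop_ready())   # [] - the event is still "in flight"
time.sleep(0.08)
print(buf.pop_ready())   # ['mouse_move(10, 0)'] - released after ~60 ms
```

A real demo would feed actual mouse/keyboard events into something like this from a low-level hook and re-inject them on release, which is where AHK or a driver-level tool would come in.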
 
Discussion starter · #3 ·
Quote:
Originally Posted by jellybeans69 View Post

Let her try playing CS:GO and turn vsync/triple buffering + raw input on
:P
I play CS:GO with vsync/triple buffering AFAIK; I don't know what raw input means, though.
I'm currently on my laptop, and I'd like to show it not just in games but also for regular mouse movements in Windows, etc.
 
Quote:
Originally Posted by killerfromsky View Post

I play CS:GO with vsync/triple buffering AFAIK; I don't know what raw input means, though.
I'm currently on my laptop, and I'd like to show it not just in games but also for regular mouse movements in Windows, etc.
There is a raw input option in the CS:GO options; try turning it on and it should induce quite a bit of mouse lag
:P

For me it's basically unplayable if I turn it on; let her play with it on, then afterwards with it off to see the difference.

[screenshot of the CS:GO mouse options showing the raw input setting]

(Taken from a Google search, as I'm at work now)
 
Discussion starter · #5 ·
Quote:
Originally Posted by jellybeans69 View Post

There is a raw input option in the CS:GO options; try turning it on and it should induce quite a bit of mouse lag
:P
That seems counterproductive, as that option should bypass any Windows mouse settings/acceleration, so shouldn't it lower input lag when on?
Thanks anyway, will try it later!
 
Quote:
Originally Posted by killerfromsky View Post

That seems counterproductive, as that option should bypass any Windows mouse settings/acceleration, so shouldn't it lower input lag when on?
Thanks anyway, will try it later!
The whole option is kind of bugged in CS:GO, as far as I know. Also, almost none of the pro players use it set to "on" either. It induces 1-2 frames of lag compared to it being off (that's especially true if you install 3rd-party mouse drivers). Not sure if it will work for you, but it should demonstrate input lag for your gf quite nicely if you compare both.
 
Quote:
Originally Posted by jellybeans69 View Post

The whole option is kind of bugged in CS:GO, as far as I know. Also, almost none of the pro players use it set to "on" either. It induces 1-2 frames of lag compared to it being off (that's especially true if you install 3rd-party mouse drivers). Not sure if it will work for you, but it should demonstrate input lag for your gf quite nicely if you compare both.
It actually works OK for me. For low-sens gamers it's a must, as otherwise you can experience negative acceleration.

As for the topic, I don't really know. I just wanted to say that raw input should theoretically only reduce input lag.
 
Quote:
Originally Posted by gonX View Post

It actually works ok for me. For low sens gamers it's a must as otherwise you can be experiencing negative acceleration.

As for the topic, I don't really know. I just wanted to say that raw input should theoretically only reduce input lag.
I haven't tested it lately; last time I turned it on I got that 1-2 frame "delay"/"smoothing", which made it impossible to aim properly. Turning it off fixed it for me, and the difference was quite obvious. As I said, it should be more apparent if you install 3rd-party driver software like the Logitech mouse drivers/Synapse. Changing your mouse polling rate to 125 Hz, limiting your FPS below 100, and making sure you're using MSAA can also help exaggerate the input lag.
I'll test whether raw input on/off still feels very different for me (on a G400s with the Logitech gaming software installed) once I get home tonight.

If you do a Google search on mouse/input lag in CS:GO you'll see tons of results on that
:P
 
Discussion starter · #9 ·
Quote:
Originally Posted by gonX View Post

It actually works ok for me. For low sens gamers it's a must as otherwise you can be experiencing negative acceleration.

As for the topic, I don't really know. I just wanted to say that raw input should theoretically only reduce input lag.
Yes, I expected as much. Let's assume for now that the option works as intended (as I have no way of testing it); that still doesn't answer my question, as it does not generate any input lag.
 
Quote:
Originally Posted by gonX View Post

It actually works ok for me. For low sens gamers it's a must as otherwise you can experience negative acceleration.

As for the topic, I don't really know. I just wanted to say that raw input should theoretically only reduce input lag.
No, low sensitivity does NOT benefit from raw input.
It can only induce negative acceleration if your mouse's CPI is too high.
I have a 55 cm/360 and use an IntelliMouse with its 1.5 m/s PCS with no problem in CS.

Raw input does theoretically reduce input lag, but as it buffers mouse data independently of the frame rate, it gives a disconnected feel I find particularly annoying in CS due to its strafe-and-stop style of shooting. Buffering the mouse independently of movement is, I find, a bad idea, and the CS:GO implementation in particular probably has filtering applied too.

In short, it feels like crap and is useless unless you overkill CPI. It's only good for bad, laggy setups with low and inconsistent FPS.

@killerfromsky

If you want to show her input lag, just make her play with vsync and then without, or better, the other way around.
If she can't notice that... well, nice try.

Set maximum pre-rendered frames in the NVIDIA driver to the highest value for extra input lag. The ATI counterpart is called flip queue size.

Oh, and try a wireless mouse vs. a wired one!

Use a non-native screen resolution, ideally one with a different aspect ratio.

For Windows you could try it with Aero and desktop composition enabled.
Turn on HPET in the BIOS.

Follow this guide, but do the opposite.

If you have a screen with dynamic contrast and/or color correction around, try hooking that up.

Or, what the hell, grab the first random CRT you find and compare it to a random TFT.

Always remember that there is no such thing as zero input lag, and every individual has a different threshold for noticing latency.
 
Quote:
Originally Posted by the1freeMan View Post

No, low sensitivity does NOT benefit from raw input.
It can only induce negative acceleration if your mouse's CPI is too high.
I have a 55 cm/360 and use an IntelliMouse with its 1.5 m/s PCS with no problem in CS.
If anything, low sensitivity is the only place where raw input benefits. It depends on your resolution.
Quote:
Originally Posted by the1freeMan View Post

Raw input does theoretically reduce input lag, but as it buffers mouse data independently of the frame rate, it gives a disconnected feel I find particularly annoying in CS due to its strafe-and-stop style of shooting. Buffering the mouse independently of movement is, I find, a bad idea, and the CS:GO implementation in particular probably has filtering applied too.

In short, it feels like crap and is useless unless you overkill CPI. It's only good for bad, laggy setups with low and inconsistent FPS.
That sounds pretty subjective. It does reduce input lag regardless.
Quote:
Originally Posted by the1freeMan View Post

Turn on HPET from bios.
HPET increases overall program responsiveness, as it's easier for the kernel to schedule correctly.
You get fewer FPS spikes and, with good drivers, less input lag.
Read THIS.

You also only really have the quick tick rate (2 kHz) available in the kernel with HPET enabled.
Quote:
Originally Posted by the1freeMan View Post

Always remember that there is no such thing as no input lag and every individual has different thresholds for noticing latency.
And this is something everyone should remember.
 
Quote:
Originally Posted by gonX View Post

If anything, low sensitivity is the only place where raw input benefits. It depends on your resolution.
That sounds pretty subjective. It does reduce input lag regardless.
Whether to use raw input is preferential; what I mean is that there is also a technical reason for people not to like it.

Quote:
Originally Posted by gonX View Post

HPET increases overall program responsiveness as it's easier for the kernel to schedule correctly.
You get less FPS spikes, and with good drivers, less input lag.
Read THIS.

You also only really have the quick tick rate (2 KHz) available in the kernel with HPET enabled.
And this is something everyone should remember.
In my experience HPET raises DPC latency and mouse lag.
I believe it's due to how it works, with comparators rescheduling interrupts.

HPET is known to have greater overhead.

http://www.postgresql.org/message-id/4EDED63E.2040002@2ndQuadrant.com

From Wikipedia: "Red Hat MRG version 2 documentation notes that TSC is the preferred clocksource due to its much lower overhead, but HPET is used as a fallback. A benchmark in that environment for 10-million event counts found that TSC took about 0.6 seconds, while HPET took slightly over 12 seconds and ACPI PM took some 24 seconds."
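(Side note: you can reproduce that kind of clocksource-overhead benchmark in miniature yourself. Python's `time.perf_counter` is backed by QueryPerformanceCounter on Windows, so its per-call cost reflects whichever clocksource, TSC, HPET, or ACPI PM, the OS selected as the QPC basis. This is just a rough sketch, not a rigorous benchmark; the `clock_read_overhead` name is made up.)

```python
import time

def clock_read_overhead(n=1_000_000):
    """Time n back-to-back reads of the high-resolution clock and
    return the average cost per read in nanoseconds."""
    start = time.perf_counter()
    for _ in range(n):
        time.perf_counter()  # on Windows this goes through QPC, whose
                             # backing source is TSC, HPET, or ACPI PM
    elapsed = time.perf_counter() - start
    return elapsed / n * 1e9

print(f"~{clock_read_overhead():.0f} ns per clock read")
```

On a TSC-backed system this lands in the tens of nanoseconds; a platform counter (HPET/ACPI PM) would be noticeably slower, which is the overhead gap the quote above is describing.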

The Windows kernel might be designed to work with it, but it has shown higher DPC latency on all systems I tested.
All Win7 machines, to make it clear.

Different machines and operating systems seem to work differently with it.
The way I see it, if the specific application doesn't have functions that require HPET and the TSC is stable, I don't find it necessary.

To me and many other users the difference in mouse lag is pretty noticeable.
Maybe you get more stable FPS with it, but FPS at the cost of latency is a pretty common trade-off that I tend to avoid (pre-rendered frames, SLI, etc.).

I am re-testing it myself in CS:GO aim training and I can't get close to my non-HPET scores with it turned on; actually, I just beat my personal record with it off.

EDIT: Reading the MSDN documentation you posted more calmly actually confirms what I said about HPET's higher latency and overhead compared to the invariant TSC.

"Windows 7 and Windows Server 2008 R2

The majority of Windows 7 and Windows Server 2008 R2 computers have processors with constant-rate TSCs and use these counters as the basis for QPC. TSCs are high-resolution per-processor hardware counters that can be accessed with very low latency and overhead (in the order of 10s or 100s of machine cycles, depending on the processor type).
Windows 7 and Windows Server 2008 R2 use TSCs as the basis of QPC on single-clock domain systems where the operating system (or the hypervisor) is able to tightly synchronize the individual TSCs across all processors during system initialization.
On such systems, the cost of reading the performance counter is significantly lower compared to systems that use a platform counter.
Furthermore, there is no added overhead for concurrent calls, and user-mode queries often bypass system calls, which further reduces overhead. On systems where the TSC is not suitable for timekeeping, Windows automatically selects a platform counter (either the HPET timer or the ACPI PM timer) as the basis for QPC."
 