I wouldn't be surprised if you were right about ghub having an impact (although it doesn't seem to be a big problem, or more people would have reported it). People have found bugs and issues with different software, and especially Windows, plenty of times. As you quoted me, I have tested software in the past and seen the differences in DPC latency and polling behaviour. So I am all for findings that might help Logitech improve their software when they receive feedback.

But you should consider providing some data with your findings, so people wouldn't have to be smart enough to figure out what you said. They could just look at a chart and see empirical data that proves your findings (which would be hard for anyone to deny). But when you try to sell claims like "g-hub permanently destroys mouse response" without any data to back it up, based on feelings alone, I can understand some of the skepticism.

It was the same when the ultra low latency mode got introduced. Some people instantly claimed it was so much better and others claimed it was so much worse. But both sides were arguing from what they personally perceived, without any data to back their claims up. Then Battlenonsense made this video and cleared things up.


This is basically how it should be done.
Yea... And now he's made a video saying to raise your DPI to 3200 to lower your input lag. This guy's videos seem all over the place: some good ones, some REALLY bad ones.

 
The old rule still applies: a gaming mouse should be wired.

I am now chasing the Logitech support team along with their software developers; they claim that they do not have to deliver an offline standalone version of G-HUB for Win7.
Their toy, as it is, is a 40 MB installer, and the rest (160 MB) comes from the CLOUD.
 
Oh yeah absolutely, but that's simply because 100 DPI isn't something people use. His findings are correct however, albeit with a smaller gain when looking at 400 DPI and up.
People have known this for years; I think cpate mentioned it on this forum a couple of years ago.
Of course he did, and people here explained why it doesn't matter. This is the post you're talking about: Does this mean 800 dpi is the best on PMW3366? Also cpate says higher dpi 2-3ms more responsive | Page 2 | Overclock.net


But think about it. You need a single count to move 0.01 degrees at 400 DPI, and 4 counts to move the same 0.01 degrees at 1600 DPI, assuming you're using the same eDPI. Assuming you make this movement at the same speed, the entire movement will take exactly the same amount of time.

Again. Let's imagine that a target is exactly 10 degrees of rotation away at the same settings as above. At 400 DPI it would take exactly 1000 counts and, let's say, 50 ms of time. Assuming you raise your DPI to 1600, you would need exactly 4000 counts for the same 10 degrees of rotation at the same eDPI.

And now? Will you hit this target faster? No! It would cost you exactly 50 ms of time. The exact same "total time" in both cases; in other words, the difference is ZERO.
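To put some numbers on this, here is a minimal Python sketch of the argument; the 30 cm/360 eDPI and the 50 cm/s hand speed are just assumed example values, not measurements. The count total scales with DPI, but the physical distance, and therefore the time, does not.

Code:
CM_PER_360 = 30.0          # assumed physical turn distance (same eDPI for every DPI)
HAND_SPEED_CM_S = 50.0     # assumed constant hand speed
ANGLE_DEG = 10.0           # rotation needed to reach the target

for dpi in (400, 1600):
    counts_per_360 = CM_PER_360 / 2.54 * dpi          # counts for a full turn
    counts_needed = counts_per_360 * ANGLE_DEG / 360  # more counts at higher DPI...
    distance_cm = CM_PER_360 * ANGLE_DEG / 360        # ...but identical distance
    time_ms = distance_cm / HAND_SPEED_CM_S * 1000    # ...and identical time
    print(f"{dpi:>5} DPI: {counts_needed:6.0f} counts, "
          f"{distance_cm:.2f} cm, {time_ms:.1f} ms")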

This is why measuring the first on-screen reaction is stupid in this case. Of course YOUR rotation will have a slower "first on-screen reaction" at lower DPI values, because it's coarser. But you actually have zero advantage as soon as you need to move enough to hit anything in the game. It's not like your display is working slower or anything.

And I'm assuming that the guy tested at the same eDPI to begin with. Because he doesn't clarify anything.


Assuming this would translate to a net improvement of 50% less input lag is like assuming that doubling the sensitivity would halve the input lag because you just need to travel half the distance for the same angle. It's just a stupid conclusion about normal behaviour.

Your mouse clicks will not trigger any faster either, your display will not display any faster, your polling rate will not be any faster.

We'll ALWAYS skip angles to some degree; we can't just divide angles into absurdly small values. Calling a 2 micron movement threshold an input lag advantage for using 12000 DPI is just stupid.

He didn't explain any of this.

Are you really missing because you're not flicking 0.0001 degrees of motion fast enough? I mean, let's just raise our settings to 20000 DPI and use 1cm/360. We'll be the fastest players in the world by FAR.
 
the entire movement will take exactly the same amount of time
Did you read any of what CPate said?
Every mouse movement your game does is scaled off of the input given by the mouse, but your mouse can only report full counts.
So it happens earlier because at higher DPI the threshold for a full count is reached sooner; this gain happens between the sensor and the MCU.
How big that is depends on your DPI and the speed you're moving at. Your hand doesn't have instant 100% acceleration. When you change direction, that registers earlier at higher DPI. When you do some slow micro adjustments, guess what.
Is this a huge 50% input lag decrease? No, simply because the lower bound he tested doesn't make any sense.
Did I instantly change my DPI when I became aware of it? No because the gain is fairly small and I'm already at ~800DPI which I'm comfortable at.
IMO this is mostly a case against using 400 DPI or lower (if anybody uses that for whatever reason), or for the people that want the lowest possible total input lag.
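For what it's worth, here is a rough Python sketch of that threshold argument; the 5 m/s² hand acceleration from a standstill is an assumed illustrative figure, not a measured one. It shows why the first count registers a few milliseconds earlier at higher DPI, and why the gain shrinks quickly above 400 DPI.

Code:
import math

HAND_ACCEL_M_S2 = 5.0  # assumed hand acceleration from a standstill

for dpi in (400, 800, 1600, 3200):
    count_distance_m = 0.0254 / dpi   # physical travel needed for one full count
    t_first_count = math.sqrt(2 * count_distance_m / HAND_ACCEL_M_S2)
    print(f"{dpi:>5} DPI: first count after {count_distance_m * 1e6:5.1f} um, "
          f"~{t_first_count * 1000:.2f} ms")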

And I'm assuming that the guy tested at the same eDPI to begin with. Because he doesn't clarify anything.
From what I can tell he didn't test ingame, but rather with some kind of software that displays a brighter image as soon as mouse movement is detected.
Even if you could somehow test that ingame, eDPI wouldn't matter.

20000 DPI and use 1cm/360
Very funny, except he already talked about diminishing returns.

If you don't care about these findings, that's perfectly reasonable.
The video is 100% clickbait, I agree, but making fun of it because you don't understand is a different topic though.
 
Did you read any of what CPate said?
Every mouse movement your game does is scaled off of the input given by the mouse, but your mouse can only report full counts.
So it happens earlier because at higher DPI the threshold for a full count is reached sooner; this gain happens between the sensor and the MCU.
How big that is depends on your DPI and the speed you're moving at. Your hand doesn't have instant 100% acceleration. When you change direction, that registers earlier at higher DPI. When you do some slow micro adjustments, guess what.
Is this a huge 50% input lag decrease? No, simply because the lower bound he tested doesn't make any sense.
Did I instantly change my DPI when I became aware of it? No because the gain is fairly small and I'm already at ~800DPI which I'm comfortable at.
IMO this is mostly a case against using 400 DPI or lower (if anybody uses that for whatever reason), or for the people that want the lowest possible total input lag.
I did read it, and so did a lot of people replying to that very same topic. Saying "input lag advantage" is the WORST conclusion to draw from this discussion. Does a higher resolution display have "less input lag" because it displays more pixels for the same amount of movement? No, because that's a stupid conclusion drawn from having more sample data for the same amount of time.

There's nothing wrong with using 400 DPI at low sensitivity. People use it because 400 DPI is already 63.5 microns of movement per count, which is enough for high precision at the level of human dexterity. And I'm pretty sure it's still high enough granularity at about 30cm/360 in probably any scenario.

There is exactly ZERO input lag advantage unless you can take advantage of the "extra counts" provided by a higher DPI setting at the same eDPI. In other words, only if you desperately need a 0.0001 degree flick "advantage".

Yes. The screen starts moving "earlier", but at a "subpixel" level of motion. An angle so small that it translates to exactly nothing. As soon as you need to move multiple counts to get from point A to point B (like any situation in an FPS game), this "advantage" means absolutely nothing.

Again. High DPI is useful ONLY for high sens players, because they can actually take advantage of the "extra granularity" at those high eDPI values. Except no human being has the level of dexterity to justify stupidly high levels of DPI. Even the 3200 DPI he "suggests".
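As a rough illustration of how small a single count is on screen, here is a Python sketch; the 30 cm/360, the 103° horizontal FOV and the 1920 px width are assumed example values, and the pixel figure uses the projection at the screen centre.

Code:
import math

CM_PER_360 = 30.0     # assumed eDPI, expressed as cm per full turn
HFOV_DEG = 103.0      # assumed horizontal field of view
SCREEN_W_PX = 1920    # assumed horizontal resolution

# pixels per degree of rotation at the centre of a rectilinear projection
px_per_deg = (SCREEN_W_PX / 2) / math.tan(math.radians(HFOV_DEG / 2)) * math.pi / 180

for dpi in (400, 1600, 12000):
    counts_per_360 = CM_PER_360 / 2.54 * dpi
    deg_per_count = 360 / counts_per_360
    print(f"{dpi:>6} DPI: {deg_per_count:.4f} deg per count "
          f"(~{deg_per_count * px_per_deg:.2f} px at screen centre)")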


From what I can tell he didn't test ingame, but rather with some kind of software that displays a brighter image as soon as mouse movement is detected.
Even if you could somehow test that ingame, eDPI wouldn't matter.
It would if he were using a better methodology. First on-screen reaction means nothing for this topic. You don't move your crosshair to ANY target with the "first on-screen reaction". Your clicks will not trigger any faster either, so straight up saying "input lag advantage" is just stupid.


Very funny, except he already talked about diminishing returns.

If you don't care about these findings, that's perfectly reasonable. Making fun of it because you don't understand is a different topic though.
He didn't say anything. He just said he doesn't "know" if the mouse's accuracy has drawbacks at those high levels. A lot of people asked if this would translate to less "accuracy".

In other words, people are willing to raise their DPI to these stupidly high levels if accuracy is not affected.

50% "less input lag" for nanoflicks of movement isn't "input lag advantage".

50% "less input lag" at "subpixel movement" except the mouse clicks doesn't trigger any faster, is NOT "input lag advantage".

These are not "findings"; it's just bad methodology and a clickbait video. Are you a fanboy of his or something?

-
I repeat. Want "less input lag"? Play at 5cm/360 and 12k DPI. Technically, with a 2 micron movement threshold, this high sensitivity will actually give you a REAL and BIG "input lag advantage", because your hands don't have "instant acceleration in the real world". And no modern sensor has "drawbacks" playing at those stupidly high DPI levels.

I mean, compared to a puny 30cm/360, at 5cm/360 you ACTUALLY do have a 600% "input lag advantage" moving from point A to point B. Not only in the "first on-screen reaction" at high DPI, but you actually DO have an advantage over an entire arc of movement, like rotating 180 degrees.
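A quick Python sketch of that comparison; the 50 cm/s hand speed is an assumed example value. The time to physically sweep a 180° turn scales directly with cm/360, which is where the 6x figure comes from.

Code:
HAND_SPEED_CM_S = 50.0   # assumed constant hand speed
ANGLE_DEG = 180.0        # a full 180 degree turn

for cm_per_360 in (30.0, 5.0):
    distance_cm = cm_per_360 * ANGLE_DEG / 360    # physical travel for the turn
    time_ms = distance_cm / HAND_SPEED_CM_S * 1000
    print(f"{cm_per_360:>4.0f} cm/360: {distance_cm:4.1f} cm of travel -> {time_ms:5.0f} ms")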

Are people handicapping themselves for no reason? What's wrong with those pro players playing at 400 DPI and low sensitivity?

Of course you can argue that some games don't have low enough sensitivity sliders, but most of them do.

What are we waiting for?

Can't wait for the top tier gaming players mastering nano movements at the count threshold when 50000 DPI sensors flood the market.
 
This is getting a bit ridiculous, you are just vomiting words at me that are mostly unrelated to the topic.

except the mouse clicks don't trigger any faster
Well yes, it doesn't affect how fast your clicks register, but do you never move your mouse in games? The idea of input lag applies just as well to how fast you go from actually moving your mouse to seeing that movement on screen.
This isn't the holy grail of input lag reduction; in fact it is rather minor. But arguing against these results because you don't understand how they apply is rather idiotic.

People tried to prove CPate wrong with some wacky math; this video literally proved CPate right.
Are you arguing these results were wrong? Where does this reduction in input lag come from, then?

Yes. The screen starts moving "earlier", but at a "subpixel" level of motion. An angle so small that it translates to exactly nothing
Only if that's how you scale your ingame Sensitivity, which would be incredibly slow, so not applicable to most people.
But earlier regardless, so there's a benefit.

He didn't say anything
2:27 since the timecode doesn't seem to work. Just because you didn't listen doesn't mean it hasn't been said. You could literally arrive at that conclusion by simply looking at his measurements.
 
This isn't the holy grail of input lag reduction; in fact it is rather minor. But arguing against these results because you don't understand how they apply is rather idiotic.
I understand all of this perfectly, but you (and some people) think "on-screen reaction" applies to this subject as a measure of "input lag" when in fact it doesn't. You should measure a big enough arc to have enough data to trigger a "useful" movement, like moving your crosshair to a target, not a single count.

Why don't you play at 4k if you think bigger resolution means "less input lag"?

I'll quote someone from that same topic who explained perfectly what's going on here:

I definitely think qcxvcsvxc's conclusion is the best way to conceptualize it: coarser versus smoother tracking of your mouse movements. The irony here is that the higher DPI's "smoother" tracking is actually the more "responsive."

To come up with a single name for this phenomenon, so as to differentiate it from lag or smoothing or whatever else, what about "micro-response"? Something like quantum movement latency would be more descriptive, but very dumb sounding.
Input lag is NOT the correct word here. It's confusing and incorrect when talking about this subject.

People tried to prove CPate wrong with some wacky math; this video literally proved CPate right.
Are you arguing these results were wrong? Where does this reduction in input lag come from, then?
Those people didn't "try" anything. They just corrected him, and the math wasn't "wrong". The "input lag reduction" is just the result of a higher mouse resolution, or "more counts" for the same amount of movement. Except this should NOT be called "input lag reduction". It's a different effect that you can "interpret" as less input lag if you're dumb enough; it's just the nature of more sample data for the same amount of time, as I've said already.

Again. Go play at 4k if you think bigger resolution means less "input lag". It's "more data" for the same amount of time. I mean, it's 4 times the pixels so it's four times faster right?

You'll NEVER take advantage of single counts reporting "earlier" unless those counts can trigger enough movement to hit anything on the screen. NO ONE FLICKS 0.000001 degrees of motion. High DPI matters ONLY for high sensitivity players, because those people can actually skip big angles at low DPI.

Except no human being has enough dexterity to justify stupidly high DPI values anyway. So playing at a STUPIDLY high level of sensitivity at any "competitive" level will never happen.

2:27 since the timecode doesn't seem to work. Just because you didn't listen doesn't mean it hasn't been said. You could literally arrive at that conclusion by simply looking at his measurements.
Yep. A bad conclusion about the nature of more sample data for the same amount of time. This isn't "input lag". The problem here is the term he used. As CPate did too.

This is why a bad methodology can lead to a bad conclusion.

Only if that's how you scale your ingame Sensitivity, which would be incredibly slow, so not applicable to most people.
But earlier regardless, so there's a benefit.
In the real world it isn't. It isn't "applicable" to anyone except people with CRAZY HIGH sensitivity. No one can play at such high sensitivity AND be precise at the same time.

A "earlier" movement at subpixel level means absolutely nothing about moving enough degrees to hit any target. Your display will not work any faster though, neither will any other inputs, INCLUDING INPUTS FROM YOUR MOUSE. This is why DPI is MOUSE RESOLUTION, and not "mouse speed".

As I said, the only way to trigger a real benefit is to raise your overall sensitivity at the same time. Technically raising sensitivity is not "input lag reduction" either, but because you'll need less movement for bigger arcs, you can call it something like "an input lag reduction at the cost of precision".


-

You can believe in whatever placebo you want though. Are you a fanboy of r0ach too?

I honestly don't have the patience to explain any better, so stick with the configurations you want and believe in whatever you want. Critical thinking is not for everyone, I know.
I mean, look at how many fanboys r0ach and FR33THY have.
 
Why don't you play at 4k if you think bigger resolution means "less input lag"?
...
Go play at 4k if you think bigger resolution means less "input lag".
Monitor resolution has absolutely nothing to do with this.
You may want to learn how displays work, unless you want to bring this up a couple more times to prove how little you know.

like moving your crosshair to a target, not a single count
Ok, let me try one last time. Every single movement, not just the first, can be faster or earlier (talking about on-screen translation) with higher DPI. "Can be", because it's kinda similar to how you brute-force incredibly high FPS in some games to have a chance of seeing a more recent frame. You don't lose this advantage after the first count; it can apply to literally every single movement you want to translate from physical to digital space.
Ultimately you're still limited by how fast your monitor can display that movement, which is also why the average benefit appears smaller on 360Hz than 144Hz.
Ideally you would have to measure this at the sensor level to get a true grasp of the difference. Seeing what kind of real-world impact it can have isn't too bad though, IMO.

How you describe it or what you call it is up to you. Given that the mouse is an input device, and by using this you have a chance to have your real-world action appear on screen faster, I'm content with calling it input lag.

If you simply didn't care, that would be whatever; I really don't care about the difference either. The thing is that you simply don't seem to understand.
 
My intention with this report is to bring tears to your eyes.
Mouse hardware, the monitor screen, and the entire PC are 1000 times faster than any running software.

Who told you that Logitech and/or Corsair have hired the finest software developers out there?
The software developers hired by Logitech do not even have their own dedicated email address.

One week ago I installed G-Hub on my Win 7 Pro 64-bit system, and the memory footprint of this badly designed software is tremendous.
My Logitech mouse and keyboard both came with defects, and I returned them for a refund.

My next option for a mouse with the latest gaming sensor was the Microsoft Pro Intellimouse.
If Microsoft is good at something, it is at hiring qualified software developers; the result is well-made software that wastes fewer memory resources and cooperates properly with the operating system.

Three screenshots for you, so you can admire the superiority of the software developers working at Microsoft.

 
I know I'm resurrecting this thread, but I just discovered it and I have to join in.
Usually I keep to myself and I don't have many others to discuss these things with; I just notice them and have to figure them out on my own. Years ago I bought a G502 and I LOVED it. I couldn't describe the tracking, but it just felt amazing, and it was enough to finally make me replace my old tired G9.

Fast forward years later, and I kept hitting these points in my life where I felt like my mouse was weird, and I went through DPI-changing chaos trying to get "something" back. It definitely started AFTER I finally decided to start using Ghub (I was NOT using it for the longest time).

Finally, about a year ago (maybe when this thread was made) I broke down and bought the Glorious Model O Wireless. I have yet to install any software, and "that feeling..." I was looking for was back. I've gone so far as to buy a SECOND Model O Wireless, and it STILL feels perfect. Yet as much as I'd RATHER use my G502 (the sideways scroll wheel is perfect for additional weapon slots in games; I never move my other hand off the movement controls when I use a G502), I just feel like I have to WORK for the G502, whereas the Model O works for ME.

And now I discover this thread.

It's just frustrating to me because I swore I kept feeling hiccups in my mouse, and it forced me to finally end up buying another company's product. After reading this thread, I'm interested in buying a new G502 and seeing what it does.

PS: I know what you guys mean about having to use Ghub to control RGB. Might I suggest you guys come check out OpenRGB.org? It's a project I've been involved with that is working to provide open-source RGB control across all hardware types.

Currently OpenRGB DOES SUPPORT G502, G915/G915 TKL (all the Logitech stuff I own) and much more. Just wanna help some fellow computer gaming buddies out.
 