Originally Posted by popups
True, I am assuming the G5 wouldn't be much better than the G400 (when using 800 CPI). However, my assumption isn't based on the fact that it uses a different light source, or on the performance of later laser sensors. I assume the G5 isn't better than the G400 (in terms of a human's perception) because they both, from what I gather, use a "native" 800 setting. They also each have a higher "native" value: 2000 for the G5, 3600 for the G400. I assume the G5 halves counts for its 400 setting the way the G400 does with its other two settings. Obviously, I am ignoring the coding, as I cannot compare such a thing.
The DPI value could be inherent to the DSP scaler, unlike the G400's, which, from what has been gathered, uses the controller to throw away counts. At least, that's if we assume the 6006 shares the same capabilities as the public A6010.
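To illustrate the difference (purely a toy sketch of my own, not anything pulled from Logitech firmware; the divide-with-carry alternative is an assumption): halving CPI in the controller means mangling counts after the sensor has already measured at native CPI, whereas a DSP scaler would report at the lower CPI natively.

```python
# A toy sketch, NOT actual firmware: two ways a controller could halve
# effective CPI after the sensor has already counted at native CPI.

def halve_truncating(counts_per_frame):
    """Naive 'throw away counts': integer-halve each frame's counts.
    Sub-count motion is lost for good, which reads as added jitter."""
    return [c // 2 for c in counts_per_frame]

class CarryHalver:
    """Halve counts but carry the remainder into the next frame,
    so no motion is lost over time (a hypothetical alternative)."""
    def __init__(self):
        self.carry = 0
    def step(self, counts):
        total = self.carry + counts
        out, self.carry = divmod(total, 2)
        return out

frames = [3, 1, 3, 1]                 # hypothetical native-CPI counts (total 8)
print(halve_truncating(frames))       # [1, 0, 1, 0] -> total 2, motion lost
h = CarryHalver()
print([h.step(c) for c in frames])    # [1, 1, 1, 1] -> total 4 = 8 / 2
```

Either way, it's a post-hoc division; a scaler built into the DSP wouldn't have to do any of this.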
The user's perception of the G5 being superior to the G400 is likely due to the fact that the G5 runs at over 600 more frames per second than the G400. Different frame rates can change the "feel" of a sensor noticeably, but that different feeling isn't necessarily an indicator of superiority. Higher frame rates can really mess with your mind... that is, if you can discern such things.
Of course, I mentioned this at the start of the thread and in multiple posts after, though FR is more of a side-effect variable. Having used a modified FR variation (7200/12000) of UGS, I can attest to a noticeable increase in delay at 7200 FR, and likewise with the release 9800 SROM @ 12000 FR; something that never should have been made public, and it makes the current 4k 3090 seem like the best thing in the world. (I kid you not)
Let's assume data flow was equal on both the 6010 and 3080 (two sensors of the same generation). The 6010 would indeed have the benefit of the extra frame rate; less "laggy".
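A rough way to see the frame-rate side of it (just frame-period arithmetic, nothing more; it ignores firmware smoothing, which, as the 7200/12000 SROM example above shows, can easily dominate):

```python
# Back-of-the-envelope only: the sensor can observe motion at most once
# per frame, so the worst-case sampling delay before a movement is even
# seen is one frame period. 7200 and 12000 FPS are the figures from the
# SROM mods above; 6400 is an arbitrary reference point.

def frame_delay_ms(fps: float) -> float:
    """Worst-case sensor-side sampling delay in milliseconds."""
    return 1000.0 / fps

for fps in (6400, 7200, 12000):
    print(f"{fps:>6} FPS -> up to {frame_delay_ms(fps):.3f} ms per frame")
```

Doubling the frame rate halves that per-frame delay, which is why, all else being equal, a higher-FR sensor can read as less "laggy" at identical CPI.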
There are certainly comparable, we'll say, "smoothing properties" (for the sake of subject consistency) inherent to the A9500 vs the 4k 3090, though the FR bump puts it more in line with older releases in terms of general user perception.
Ironically, having a higher FR also decreases cursor precision and consistency at a given CPI level. Setting FR too low decreases cursor smoothness. I don't know about you, but I feel a 30*30 array @ 6400 FPS is really the best compromise for general performance at moderate CPI.
This is assuming the FR is also high enough to offer decent IPS (tracking speed) as well.
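To make that tradeoff concrete (a simplified model of my own; the half-array correlation limit and the ~800 pixels-per-inch surface imaging figure are assumptions, not datasheet numbers):

```python
# Simplified model, my assumptions only: the correlation engine can
# track a displacement of roughly half the pixel array per frame, so
# max tracking speed scales with array size and frame rate, while a
# higher FR leaves fewer counts per frame to resolve at a fixed CPI.

def max_speed_ips(array_px: int, fps: float, surface_ppi: float) -> float:
    """Rough ceiling on tracking speed in inches per second."""
    max_shift_px = array_px / 2          # assumed per-frame correlation limit
    return max_shift_px * fps / surface_ppi

def counts_per_frame(speed_ips: float, cpi: float, fps: float) -> float:
    """Average counts reported per frame at a given speed and CPI."""
    return speed_ips * cpi / fps

# 30x30 array @ 6400 FPS, assuming ~800 pixels/inch imaged on the surface:
print(max_speed_ips(30, 6400, 800))      # -> 120.0 IPS ceiling
# At ~79 IPS (2 m/s) and 800 CPI, doubling FR halves counts per frame:
print(counts_per_frame(79, 800, 6400))   # -> ~9.9 counts/frame
print(counts_per_frame(79, 800, 12800))  # -> ~4.9 counts/frame
```

Raising FR lifts the speed ceiling but thins out the counts available per frame at a fixed CPI, which is the precision cost I'm pointing at.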
I don't think it is "all of a sudden". It has been mentioned many times before; maybe they didn't articulate it properly. Haven't people compared the Avago sensors to the MLT for years now, stating the MLT still has a "better" feeling/tracking/response than the Avago sensors? Didn't they complain about non-"native" settings feeling bad? After experiencing those "native" settings in the same sensor... I think people are starting to realize that "native" values are still not as good as simply using the hardware limitations. It took years to get to this point of understanding, as people had to use different variations of sensors. It seems people are now (at least vaguely) understanding what it is they want. If that happens to be a small number of people, that isn't a reason to ignore them; they are still customers/money.
Not really fair to compare an alternative sensor architecture, as they'll all feel different and vary from each other ^^
I kinda meant "all of a sudden" as in it taking one post for everyone to stop looking at the 3090 as the best thing ever. Granted, I do believe the 4k 3090 is fine for a majority of users and not as noticeable as a couple of other releases. Whether this kind of thing is an issue is up to the player or consumer.
Things will change if there's enough support from the community and project managers via whatever company they belong to.