WannaBeOCer 06-12-2019 04:45 PM

[PCGamesN] Nvidia says it has offered anti-lag settings like AMD’s for “more than a decade”
 
Source: https://www.pcgamesn.com/nvidia/nvid...on-alternative

Quote:

“So AMD introduced a couple things we read about and heard about,” Tony Tamasi, VP of technical marketing, says. “Radeon Image Sharpening and FidelityFX…. So we, of course, have similar techniques, for quite some time. Freestyle about a year and a half ago. It has a large number of filters, which includes things like sharpening and colour correction. If people find sharpening by itself particularly cool and valuable, we can kind of invest in that. But we’ve had that kind of functionality in Freestyle, with a bunch of other post processing effects, developed for quite some time.

“The other thing they talked about was Radeon Anti-lag. I haven’t got a particular good explanation about what’s going on with its CPU/GPU load balancing to reduce latency. That can mean a lot of things to be honest…. We’ve had some stuff for reducing latency, lag, whatever you want to call it, for quite some time. If you look at our control panel, this has been around for more than a decade.”
FreeStyle: https://www.geforce.com/en_GB/gfecnt...rience-article

Reduce lag: https://www.geforce.com/whats-new/gu...of-lag-guide#1

zealord 06-12-2019 04:48 PM

Yeah, no idea if it's the same, but I'm wondering what it's about. I mean, it sounds great, but without knowing how AMD's anti-lag actually works it's hard to say whether it actually improves my gaming experience or is just some sort of marketing gimmick.

Darren9 06-12-2019 04:50 PM

We already know Nvidia has everything first; what's important is that it isn't cool until AMD does it :)

epic1337 06-12-2019 05:06 PM

would it have something to do with this?


NightAntilli 06-12-2019 05:21 PM

Does it really matter if they had it if Navi has lower input lag?

Additionally, AMD claims that they decoupled the input lag from the framerate. At this point I'm unsure if that's true, but... Did nVidia really do that?

It actually sounds like nVidia is in damage control mode. Couldn't be, could it?

TFL Replica 06-12-2019 05:26 PM

ReShade, SweetFX+InjectSMAA, and many other tools have been providing better/more shader effects for far longer than Freestyle (which is still hidden away in the GFE beta).

EastCoast 06-12-2019 05:37 PM

Quote:

Originally Posted by NightAntilli (Post 28001944)
Does it really matter if they had it if Navi has lower input lag?

Additionally, AMD claims that they decoupled the input lag from the framerate. At this point I'm unsure if that's true, but... Did nVidia really do that?

It actually sounds like nVidia is in damage control mode. Couldn't be, could it?

They are in damage control mode. Why would they care about Anti-Lag and sharpening when they have ray tracing :rolleyes:
Goes to show the lack of confidence NV has in ray tracing in games when they have to shore up AMD's talking points about features other than RT.

I recall RTG talking about reducing latency in drivers a while back too.

skupples 06-12-2019 05:45 PM

Does the increased image quality stuff stack with the anti-lag stuff?

Seems like they wouldn't.

Woohooo, AMD finally has a 1440p high-FPS card too! They're really gonna need to pimp these features to get many people to dive in at this point in the cycle. I think they may be aware of that, and that's why we aren't getting any sort of powerhouse attempt yet. Their CPU game seems aggressive, but the GPU side seems a bit conservative and on cruise control at the moment.

ZealotKi11er 06-12-2019 06:01 PM

Quote:

Originally Posted by EastCoast (Post 28001970)
They are in damage control mode. Why would they care about Anti-Lag and sharpening when they have ray tracing :rolleyes:
Goes to show you the lack of confidence Nv has in ray tracing in games when they have to shore up AMD's talking points of features other then RT.

I recall RTG talk about reducing latency in drivers a while back too.

Project ReSX I think

https://community.amd.com/community/...g-project-resx

skupples 06-12-2019 06:01 PM

gonna double post, don't care...

how is it i've never heard of FreeStyle before? I even have NVupdate 2.0 installed on my pc! and how is it any different than the filters in my fancy ass asus proart monitor?

ZealotKi11er 06-12-2019 06:04 PM

Quote:

Originally Posted by skupples (Post 28002008)
gonna double post, don't care...

how is it i've never heard of FreeStyle before? I even have NVupdate 2.0 installed on my pc! and how is it any different than the filters in my fancy ass asus proart monitor?

I think FreeStyle is nothing like AMD's implementation.

skupples 06-12-2019 06:05 PM

The demo they give reminds me of exactly what I can do while tweaking my monitor.

ALSO, if you actually read that article, especially the conclusion, NV basically says "yeah, lag happens":
-------------------
Try as you might, some form of lag is always going to occur in PC gaming. Unless you have a megabuck machine and a perfect connection to the best game servers available, one or more of the forms of lag above will hit you from time to time. The trick is to understand what's causing your particular flavor of lag, and put the right solution into effect.

So the next time the cry "Lag!!!" goes up while you're playing a game, just remember that such a simple word has a whole bunch of complex causes. And remember, lag is also the perfect excuse if your aim is off ;)
-------------------
NV chalks up lag complaints to "git gud" and references it as a counter to AMD's stuff, because they know no one reads more than a paragraph.

Things lagging? Turn down your settings and turn off V-sync - NV's options.

Don't take my word for it though; I haven't bought AMD since the 6xxx series, and even then it was only a single-slot hold-over.

also -

the info NV provided as a rebuttal is out of date

https://www.tweakguides.com/TGTC.html (a MONETIZED Windows optimization guide from a third party!!!!)

It's not free, and there isn't a Win10 or 8.1 version... ;)

BradleyW 06-12-2019 06:12 PM

Nvidia's anti lag method is just flip queue. This is dated. It can cause stuttering if set too low, or higher input lag if too high. I think AMD are using something modern and completely different. As they said, it is using a method whereby it decouples frame rate from lag. I sense Nvidia are on the back foot and fearing the unknown with AMD's 2019 and 2020 Navi stuff.
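
As a rough illustration of that trade-off, here's a toy Python sketch (just a hypothetical model, not how either driver actually works): CPU frame-prep time varies, the GPU takes a fixed ~16.7 ms, and the queue depth stands in for the flip queue / pre-rendered frames setting. A deeper queue smooths out frame pacing but adds input-to-display latency; depth 0 lowers latency but pacing suffers when the CPU spikes.

Code:

import random, statistics

def simulate(queue_depth, n=2000, gpu_ms=16.7):
    # Toy flip-queue model: the CPU may prepare at most `queue_depth` frames
    # ahead of the GPU (0 = the CPU waits for the GPU each frame). Times in ms.
    random.seed(42)
    gpu_fin = []                    # completion time of each rendered frame
    prev_cpu_fin = 0.0
    latencies, intervals = [], []
    for i in range(n):
        cpu_ms = random.uniform(8.0, 20.0)   # variable CPU frame-prep cost
        # the CPU may not run more than queue_depth frames ahead of the GPU
        ahead_limit = gpu_fin[i - 1 - queue_depth] if i - 1 - queue_depth >= 0 else 0.0
        cpu_start = max(prev_cpu_fin, ahead_limit)   # "input" is sampled here
        prev_cpu_fin = cpu_start + cpu_ms
        gpu_start = max(prev_cpu_fin, gpu_fin[-1] if gpu_fin else 0.0)
        gpu_fin.append(gpu_start + gpu_ms)           # frame reaches the display here
        latencies.append(gpu_fin[-1] - cpu_start)
        if i:
            intervals.append(gpu_fin[-1] - gpu_fin[-2])
    print(f"queue depth {queue_depth}: avg input-to-display "
          f"{statistics.mean(latencies):5.1f} ms, worst frame interval {max(intervals):5.1f} ms")

for depth in (0, 1, 3):
    simulate(depth)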

skupples 06-12-2019 06:14 PM

Quote:

Originally Posted by BradleyW (Post 28002034)
Nvidia's anti lag method is just flip queue. This is dated. It can cause stuttering if set too low, or higher input lag if too high. I think AMD are using something modern and completely different. As they said, it is using a method whereby it decouples frame rate from lag. I sense Nvidia are on the back foot and fearing the unknown with AMD's 2019 and 2020 Navi stuff.

100% agree with you, from the peanut gallery where I don't have a clue what's happening on the back-end.

That article is a pathetic rebuttal to what AMD announced. No one's given a damn to revisit it since Win8.0 either.

Great ideas - I'm looking forward to them being vetted by the community, then implemented on the 4K 120 Hz powerhouse AMD will be releasing a year after the consoles.

tpi2007 06-12-2019 06:45 PM

Considering that

1. Radeon Anti-Lag will also work on GCN-based GPUs;
2. It only works on DX 9 and 11, but not on DX 12 and Vulkan,


I'd say they tapped into Vulkan at some stage in the graphics pipeline to bring the benefits of these closer-to-the-metal APIs to DX 9 and DX 11. I wouldn't be surprised if they had learned something from the Vulkan-based translation layers for Linux, DXVK (for DX 10 and 11 games) and VK9 (for DX 9 games).

And yeah, if that's what it is, it's a neat trick, and Nvidia will probably have to follow suit and copy the technique. But for now it seems they resorted to bullcrap, pretending they've had it for a long time and hoping people don't notice the lack of substance until they're actually ready with a comparable solution.

There is one thing I have to ask, though: the solution may not always bring the desired effect, otherwise why would you not have it always on? Is it just an option so you can see the before and after, but ultimately meant to be always on? Or does it cause problems in some games, forcing you to turn it off?

skupples 06-12-2019 06:59 PM

It's probably like LOD tweaking to some degree :P

m4fox90 06-12-2019 09:07 PM

Much like how ray-tracing was around for more than a decade?

GHADthc 06-13-2019 01:56 AM

That Nvidia article on lag is a complete joke! What exactly do they have to mitigate latency, other than whole-system optimizations that people have been doing since forever? Nothing! What a bunch of smoke being blown up people's a$$es lol!

Aristotelian 06-13-2019 02:19 AM

How long until we get a chance to buy anti-lag modules for the low low price of 600 bucks?

Redwoodz 06-13-2019 06:32 AM

Quote:

Originally Posted by Aristotelian (Post 28002360)
How long until we get a chance to buy anti-lag modules for the low low price of 600 bucks?


Usually right before AMD releases it for free. ;)

VeritronX 06-13-2019 06:35 AM

This is probably something akin to the "Maximum pre-rendered frames" setting in the Nvidia control panel, which used to have 0 frames as an option but hasn't offered anything lower than 1 pre-rendered frame since, I think, the Kepler launch. I remember I used to have it set to 0 with my GTX 480 when that was an option. This also makes sense since Nvidia GPUs haven't had a hardware scheduler since Fermi, so the driver has to have some buffer while it works out what it's doing on the CPU.

If this is related to AMD GPUs still using a hardware scheduler, then Nvidia unfortunately can't do much about it with any of their currently supported GPUs.

skupples 06-13-2019 08:12 AM

Quote:

Originally Posted by m4fox90 (Post 28002186)
Much like how ray-tracing was around for more than a decade?

since DX9!

ilmazzo 06-14-2019 06:21 AM

I expected something like "screw AMD, they just glue driver releases together, nothing new"

JackCY 06-14-2019 06:59 AM

Quote:

Originally Posted by NightAntilli (Post 28001944)
It actually sounds like nVidia is in damage control mode. Couldn't be, could it?

Yes, they are completely OCD about having a response to everything their competition, and even their partners, do. Every single word, not just products.

I have never found a single anti-lag setting in NVCP or in the linked guides, one of which is a paid download from some 3rd party for Windows optimization! They've got squat, especially if AMD has lower latency with Navi.

Only some games support Ansel, and even some of those they advertise didn't support it in the initial version I played, or it was outright disabled because, well, multiplayer.
The effects are childish at best, and the only worthy thing in there is screenshot capture at stitched resolutions and automatic conversion to wide-angle views.

Freestyle... unless you have the GFE bloatware (probably still in beta) that injects it into your games, you're not getting it. Its effects are similar to Ansel's, probably of low value unless one needs to fix some crappy in-game tonemapping. Still, you've got to have GFE installed and running - no thanks. All people seem to do with Freestyle is oversaturate colors so that they look artificially fake and pop more on people's poor TN panels. The same fad that's been going on with NVCP digital vibrance etc.

---

Ray tracing has been around for as long as CPUs have existed.

skupples 06-14-2019 07:08 AM

^^ this,

best part is they have a monetized link for windows optimization...

sticks435 06-16-2019 06:58 PM

It used to be free, and so were the game tweaking guides. Then, around 2011, Nvidia started paying him to do TWIMTBP games exclusively, starting with Deus Ex: HR and Mass Effect 3.

skupples 06-16-2019 07:26 PM

Rep for the history.

However, since the article was made in 2018, it's always been behind a paywall... at least for those who found it via that link - the link NV provided in their terrible rebuttal to AMD's limelight.

Is there even a comparable product in the Win10 era?

UltraMega 06-16-2019 07:48 PM

I don't see anything in the article to back up the claim in the headline. Is it weird that I knew who the OP was gonna be before I even opened this thread?

There are lots of ways to reduce image quality and in turn reduce input lag / increase FPS. AMD's claim is that at roughly even FPS they will have lower input latency, and they backed this up with demos. I'm sure Nvidia will scramble to counter this somehow, but this isn't it.


Stop spreading propaganda for Nvidia.

VeritronX 06-16-2019 08:07 PM

The setting they're referring to pretty much has to be the max pre-rendered frames one, as it's the only one I can think of that's been in there that long and directly affects input lag. Pretty sure the default for most things is 3 frames (or at least I remember it being that for GTA5), and you can turn it down to 1 frame these days. If you are running at 60 fps locked, then setting it to 1 frame would reduce your input lag by ~33.4 ms.

The key difference here is AMD can reduce it to 0 frames because they still use a hardware scheduler in their gpu designs, while nvidia is stuck with a minimum of 1 frame delay because they do it in the driver on your cpu.
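
For reference, the arithmetic behind that ~33.4 ms figure works out like this (assuming a locked frame rate and that each queued pre-rendered frame adds one full frame time of delay):

Code:

fps = 60
frame_time_ms = 1000 / fps            # ~16.7 ms per frame at 60 fps
for depth in (3, 1, 0):
    print(f"{depth} pre-rendered frame(s): ~{depth * frame_time_ms:.1f} ms of queue latency")
# going from 3 to 1 saves about 2 * 16.7 ~= 33.4 ms; 1 to 0 would save another ~16.7 ms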

JackCY 06-16-2019 08:50 PM

The default in most games is 1 pre-rendered frame; if not, do yourself a favor and force it to 1 in NVCP - but who still uses 3 frames in their game engine?

I suppose Navi still has a HW scheduler, and that's good I think. Nvidia moved this kind of load to the CPU with Kepler, hence their performance is more CPU-tied and CPU-limited, and it also loves to cause higher DPC latency, as the driver sometimes takes a long time / every new launch lol.

Where is that magic setting in NVCP to reduce latency? Please Nvidia, do tell us, because pre-rendered frames isn't it; that's something that's supposed to be (and is) set by the application/game and should only be force-changed in NVCP when the change works and doesn't cause issues.
Even Polaris 10 has almost a 10 ms advantage vs GP106 in 3rd-party measurements with a high-speed camera: https://youtu.be/L42nx6ubpfg?t=850
RX580 vs GTX1060
16.87ms vs 25.33ms, no sync
Best cases: 16.87ms vs 18.87ms

There is latency to be cut in both corporations' drivers. They bloat them endlessly year over year. A 500 MB NV driver is freakin' unreal. Not that AMD's 352 MB is any better. That's a damn lot of code to fit into that many MBs.

ToTheSun! 06-17-2019 04:55 AM

Quote:

Originally Posted by JackCY (Post 28007390)
Even Polaris 10 has almost 10ms advantage vs GP106, 3rd party measurements with a high speed camera. https://youtu.be/L42nx6ubpfg?t=850

RX580 vs GTX1060
16.87ms vs 25.33ms, no sync
Best cases: 16.87ms vs 18.87ms

You cannot use that example to make a generalization about nVidia cards and latency. When G-sync was on, the latency was similar to Freesync with an AMD card. Latency with nVidia cards is as expected in other monitors.

epic1337 06-17-2019 06:09 AM

Quote:

Originally Posted by ToTheSun! (Post 28007644)
You cannot use that example to make a generalization about nVidia cards and latency. When G-sync was on, the latency was similar to Freesync with an AMD card. Latency with nVidia cards is as expected in other monitors.

I might've missed it, but does it talk about G-Sync (with module) or G-Sync Compatible (without module)?

Though on a side note, it's actually a reasonable comparison, especially since the charts include results both with and without FreeSync/G-Sync.
That makes the comparison flexible enough that you can see which gives the biggest latency advantage, and arguably AMD does do a bit better across the board.
So with that said, it shows that those who don't have a G-Sync monitor aren't at a disadvantage at all.

skupples 06-17-2019 06:51 AM

that latency bit is interesting. I'd love to see some of the heavy duty folks break it down across their giant hardware libraries so we really know what's going on.

looniam 06-17-2019 07:11 AM

will any of it make a difference in AC:U?

skupples 06-17-2019 07:15 AM

so it wasn't just me? good. AC is garbage more often than not, that one just kinda takes the garbage cake.

JackCY 06-17-2019 11:22 AM

I don't have AC:U, but I play a different AC and have tested AC = Assetto Corsa and ACC = Assetto Corsa Competizione, and those have some nasty input filtering in the game engine as well, even dependent on frame rate. Normally you may not notice, but once you start doing custom key binds you find out you need to add delays to key/button presses for the game to have time to recognize them.

The cake for worst latencies among recent games goes to Apex Legends; that thing is brutal in multiplayer, which ultimately is the only game mode it has, and even on the online training map it's fairly snail-like input-wise. But there the latency is caused by server-client operation: confirmation of actions by the server, revocation of actions by the server (= rubber banding, teleporting, no hits, ...), plus the client's own latency in detecting user actions, ...

Most modern games, ever since shaders were introduced, are a far cry from how snappy older games without them could run. Complexity and latency have increased quite a bit with modern games/engines - in input handling, in "logic"/physics/actions, and in graphics processing (you can have 1000 fps, but what good is it when everything is delayed 1 s from your inputs, for example). FPS is only one part of the "story"; latency is another.

I don't have a high-speed camera to do comparisons myself; they can occasionally be found for various things, be it monitors, GPUs, games, peripherals, ...

Hopefully at least something useful comes out of VR... a reduction of latencies overall.

Feel free to link any more comparisons of AMD vs NV latency, monitor fixed vs Async refresh, ...

skupples 06-17-2019 11:31 AM

I could never get into PUBG because of some server side latency I could never get around. It drove me nuts. It's still the only game I've ever refunded via steam's policy.

looniam 06-17-2019 12:43 PM

Quote:

Originally Posted by skupples (Post 28007748)
so it wasn't just me? good. AC is garbage more often than not, that one just kinda takes the garbage cake.

OT:
I thought The Witcher 3 was bad, having mostly played FPS games like FC2/3/5 and Crysis 2/3 (never much for COD or BF), but I did get used to it and actually got the timing down.

But AC:U, which got that Notre Dame PR thingy... man, I don't know how many times I've raged, screaming "do something, damnit!!", besides getting superglued to ledges. However, ya know, it was free, and having never played any of the series I do sorta like a few aspects of it, though finding a few things is a pita - but that's what online walkthroughs are for, eh?


btw, no idea why i thought of you when i saw this:
Spoiler!

skupples 06-17-2019 12:49 PM

AC:U is definitely one of the worst entries in the series.

As to the two new ones, they're so similar (though one's Greek, the other Egyptian) that picking one and playing it will suffice for both.

I feel like Ubi could do so much more with the series if they didn't rebuild it from the ground up. Every. Single. Time. Like, the newest one - use that as a platform to build an even more in-depth one. It worked out all the weird scaling issues of Egypt, and the animation issues were just down to it being a rushed product (extreme uncanny-valley issues).

JackCY 06-18-2019 04:36 PM

My best guess so far, from the AMD slides and talks, is that it may really just be pre-rendered frames = 0 enforced via a driver toggle - something NV may not be able to do, as their minimum remains 1 frame.
It would definitely be nice to see some streamlining and optimization of drivers to reduce latency as much as possible.

1Kaz 06-19-2019 11:39 AM

I think people confuse lag with input lag. They are very different. With lag, stuff happens but you see it late. With input lag, stuff happens but your response is slow. There is nothing more frustrating than input lag in an FPS. I went from a Zowie EC2 Evo to a Logitech G403. The Zowie had a measured response time 15.2 ms higher than the G403's. What a difference it made for FPS games! They were actually playable instead of infuriating. That feeling of "I know I clicked well before I died, yet my gun never went off" is enough to make anyone quit gaming.

AMD's changes to input lag are a game changer. nVidia may have focused on this long ago, but it's no longer relevant to their offerings vs the competition. AMD has them beat on something gamers actually care about. I know programmers get excited about ray tracing due to its potential, but the FPS/resolution just isn't there yet. Gamers turn off ray tracing for better FPS. nVidia backed the wrong horse and used it to justify high prices. The real reason their prices were high: their products hit the shelves first.

JackCY 06-19-2019 12:19 PM

It's called motion-to-photon latency, and the latencies of every device in the chain contribute to it: mouse hardware, mouse driver, other drivers and software in the OS, system load, the game/software itself, the GPU driver and GPU hardware (that's the part AMD and NV can do something about), and the monitor's driver, firmware and hardware plus panel response and refresh.

"Input lag" is generally used as a simplification, and for the input latency of individual devices - i.e. how long it takes them to show a change of input on their output.
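
To make the "whole chain" point concrete, here's a sketch that just sums an illustrative motion-to-photon budget. The numbers are made up but plausible; real values vary wildly per mouse, game, driver and display.

Code:

# Illustrative motion-to-photon budget; every value here is an assumption.
budget_ms = {
    "mouse sensor + USB polling (1000 Hz)": 1.5,
    "OS / input stack": 1.0,
    "game simulation (1 frame @ 60 fps)": 16.7,
    "driver + GPU render queue": 16.7,
    "display scan-out + panel response": 10.0,
}
for stage, ms in budget_ms.items():
    print(f"{stage:40s} {ms:5.1f} ms")
print(f"{'total motion-to-photon':40s} {sum(budget_ms.values()):5.1f} ms")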

Ray tracing is here to stay and has long been needed. It can be written to perform fast, but that's not something large studios are doing right now; they simply bolted expensive effects onto existing game engines instead of building a proper, full-on path tracer from scratch.

I didn't like any of the Zowie mice I tried; they seemed sluggish, badly shaped and poorly made, with dated sensors at a high price.

ilmazzo 06-28-2019 01:25 AM

Quote:

Originally Posted by 1Kaz (Post 28010770)
I went from a Zowie EC2 Evo to a Logitech G 403. The Zowie had a measured response time that was 15.2 ms higher than the G 403. What a difference it made for FPS!

You got my attention.

Did you measure it yourself, or does your data rely on someone else's measurements (reviews)?

skupples 06-28-2019 02:42 AM

I'm kinda curious what a post-Jensen NV is gonna look like. Do you think they'll continue to make these weird ego statements after he's forced to step down in another 10 years or so?

I wonder if they'll erect a massive statue of him. Just imagine... in his favorite jacket, sporting sunglasses... holding a chunk of woodscrew Fermi in his hands.

ilmazzo - usually Blur Busters provides that kind of info to the most accurate degree.

ilmazzo 06-28-2019 02:51 AM

Quote:

Originally Posted by skupples (Post 28019888)

illmazzo - usually blurbusters provides that kinda info to the most accurate degree.

yo man, thanks for the info

1Kaz 06-29-2019 06:37 PM

Quote:

Originally Posted by ilmazzo (Post 28019840)
you got my attention

did you measured by yourself or your data just rely on someone else measure (reviews?) ?

https://www.overclock.net/forum/375-...omparison.html

I didn't measure it myself, but it was a notable difference. It made me want to buy a 2080ti and new monitor, because my monitor has a 5 ms delay (IPS panel). I haven't upgraded my graphics card or monitor because it's hard to justify the expense when I can still play games. I am not heavily investing my time into FPS games because of that. It makes a difference.

ToTheSun! 06-30-2019 05:33 AM

Quote:

Originally Posted by 1Kaz (Post 28021534)
https://www.overclock.net/forum/375-...omparison.html

I didn't measure it myself, but it was a notable difference. It made me want to buy a 2080ti and new monitor, because my monitor has a 5 ms delay (IPS panel). I haven't upgraded my graphics card or monitor because it's hard to justify the expense when I can still play games. I am not heavily investing my time into FPS games because of that. It makes a difference.

It's a little weird that you're skipping the most important aspect of skill (practice) only because you're missing factors that you perceive to be overly important (the confusion between response time and latency of a monitor and button delay of older Zowie mice).

Professional players have done tremendously well in the past (and some still do to this day) with mice that have less-than-ideal button delay and monitors with higher latencies than some of the 165 Hz IPS displays, but they NEVER do well without practice and time investment.

1Kaz 06-30-2019 10:54 AM

Quote:

Originally Posted by ToTheSun! (Post 28021818)
It's a little weird that you're skipping the most important aspect of skill (practice) only because you're missing factors that you perceive to be overly important (the confusion between response time and latency of a monitor and button delay of older Zowie mice).

Professional players have done tremendously well in the past (and some still do to this day) with mice that have less-than-ideal button delay and monitors with higher latencies than some of the 165 Hz IPS displays, but they NEVER do well without practice and time investment.

Skill matters, I never said it didn't.

JackCY 07-01-2019 11:35 AM

Only until AI plays games better than we can and people get so brainwashed that all they do is watch streams of AI playing games, whether they realize it's AI or not.


So where are those magical settings, Nvidia? Come on, come on, don't keep us waiting.

senileoldman 07-01-2019 01:19 PM

Quote:

Originally Posted by 1Kaz (Post 28021534)
https://www.overclock.net/forum/375-...omparison.html

I didn't measure it myself, but it was a notable difference. It made me want to buy a 2080ti and new monitor, because my monitor has a 5 ms delay (IPS panel). I haven't upgraded my graphics card or monitor because it's hard to justify the expense when I can still play games. I am not heavily investing my time into FPS games because of that. It makes a difference.

That chart is a measure of click delay, not actual motion delay, which should be pretty much equal.

Omega X 07-01-2019 02:06 PM

For a company whose leadership claims it's on top, they sure are very insecure.

skupples 07-01-2019 02:19 PM

Quote:

Originally Posted by JackCY (Post 28023024)
Only until AI will be playing games better than we could and people get so brainwashed all they will be doing is watching streams of AI playing games whether they will realize it's AI or not.


So where are those magical settings Nvidia? Come on come on don't keep us waiting.

I saw a friend of mine watching a Super Mario Maker 2 playthrough of the same map through all the different time-period skins... it had like 6,000,000 views. No talking. Just some dude running his own custom map through all the time periods. Also, hasn't AI already won in top-downs? An AI shooter would own; it'd just aim-bot everyone. We've had that for years :P

The return on 6,000,000 views isn't bad.

ToTheSun! 07-01-2019 03:21 PM

Quote:

Originally Posted by skupples (Post 28023214)
an AI shooter would own, he'd just aim-bot everyone. We've had that for years

Not really. In order to shoot you, AI has to be able to recognize players against the environment, which is its biggest bottleneck. Also, it's rather primitive (or not at all versed) in player psychology, which is a really big part of skill.

skupples 07-01-2019 06:31 PM

true, aim bots aren't really AI.

m4fox90 07-01-2019 08:29 PM

Quote:

Originally Posted by ToTheSun! (Post 28023284)
Not really. In order to shoot you, AI has to be able to recognize players against the environment, which is its biggest bottleneck. Also, it's rather primitive (or not at all versed) in player psychology, which is a really big part of skill.

Feels like, maybe, we shouldn't be teaching computer programs how to recognize and kill human beings

ToTheSun! 07-02-2019 01:30 AM

Quote:

Originally Posted by m4fox90 (Post 28023514)
Feels like, maybe, we shouldn't be teaching computer programs how to recognize and kill human beings

Why? Worst case scenario, we'll play Battle Royale against some bots in real life.

huzzug 07-02-2019 04:01 AM

Quote:

Originally Posted by ToTheSun! (Post 28023644)
Why? Worst case scenario, we'll play Battle Royale against some bots in real life.

Imagine Paintball. I'd no longer need to bring friends to play with.

JackCY 07-02-2019 11:31 AM

Quote:

Originally Posted by skupples (Post 28023444)
true, aim bots aren't really AI.

It depends on what access the AI gets to the game; often they have to limit the AI's response times and access to information, as there is a gigantic difference between the AI having to process the 2D image a player sees and having 3D coordinates for everything on the map at any time.
Aimbots have the 3D coordinates snooped out of memory and then recalculate aim so that your camera/gun points at the target instantly. To prevent such easy anti-cheat detection, they nowadays add a bit of variability, along with not snapping to the target but moving toward it at limited speed and then gluing to it. The biggest giveaway of cheaters with aimbots and wallhacks is reacting to enemies precisely, without sound cues, behind obstacles, trying to shoot enemies through obstacles, etc. Of course most aimbots also only aim at the head, which is a dead giveaway when all you see is 6 headshots in a row from an auto-fire weapon.

Quote:

Originally Posted by m4fox90 (Post 28023514)
Feels like, maybe, we shouldn't be teaching computer programs how to recognize and kill human beings

They already did. You can read the utopia on tech sites or watch some of Level1Techs' comments on these things. They can even recognize specific individuals from their heartbeat via laser - lul, what, yeah, read that today. Or snoop on your phone and recognize you by the way you walk. All this stuff is getting an order of magnitude easier the more AI processing power there is. They can't make an algorithm to do this easily and fast, so they instead train a neural network to make the algorithm for them.

Quote:

Originally Posted by huzzug (Post 28023704)
Imagine Paintball. I'd no longer need to bring friends to play with.

Great, at least someone would actually be playing instead of sitting around chatting or staying home. The biggest obstacle to enjoying airsoft and paintball is the lack of legal locations and people to play with.

m4fox90 07-02-2019 05:31 PM

Quote:

Originally Posted by ToTheSun! (Post 28023644)
Why? Worst case scenario, we'll play Battle Royale against some bots in real life.

Because they're going to kill us.

ToTheSun! 07-02-2019 05:47 PM

Quote:

Originally Posted by m4fox90 (Post 28024516)
Because they're going to kill us.

Speak for yourself. I'll be topping the leaderboard!

skupples 07-03-2019 06:30 AM

Interesting...

The biggest obstacle to paintball is the cost; second is location, though most folks who are interested typically live near woods, etc., to play in.

A case of good paint was $100 the last time I played, a decade ago. And most of these noobs run around in flamethrower mode. <3 my Tippmann A5, 'cuz it never chopped paint and never wasted it either. Those Angel and Autococker kids though... you can literally track the stream of paint back to the person throwing it, whereas with the A5 it's just a quick volley and done. You also never have to field strip the damn thing due to stuff getting in your laser eye, etc. etc.

Best simple marker of all time - the A5.

christoph 07-03-2019 12:56 PM

OK OK, now I'm confused: is Anti-Lag implemented in Radeon video cards or in Ryzen CPUs? Which video card series and which CPU series?

KyadCK 07-03-2019 01:04 PM

Quote:

Originally Posted by christoph (Post 28025450)
ok ok, now I'm confuse, is the Antilag gaming implemented in Radeon video cards or Ryzen CPU? what video cards series and what cpu series?

GPU drivers.

Most GCN-based cards, and Navi.

christoph 07-03-2019 03:04 PM

Quote:

Originally Posted by KyadCK (Post 28025458)
GPU drivers.

Most cards made from GCN, and Navi.


ok thanks

JackCY 07-03-2019 10:34 PM

Quote:

Originally Posted by ToTheSun! (Post 28024542)
Speak for yourself. I'll be topping the leaderboard!

All the way to the sun, that's pretty high.


@skupples : yeah, paintball ain't cheap compared to airsoft, that's for sure; it does have that added cost. Not that most airsofters spend little... but you can spend little if you know the market, which is near impossible for someone new to the "sport", or if you don't care and just buy something cheap.
Plus PB is quite hated by the public in general due to the endless mess people playing it make. Sure, AS leaves plastic around, but it's hard to see, and you can get biodegradable BBs or clean up after yourself once in a while. With PB... good luck cleaning up all that paint :D

WannaBeOCer 07-07-2019 02:55 PM

Quote:

Originally Posted by Omega X (Post 28023190)
For a company that its leadership claims that its on top, they sure are very insecure.

They aren't lying; they've had an option called "Maximum pre-rendered frames" under Manage 3D Settings for years that reduces input lag. It might not work the same way as Anti-Lag, but both end up reducing input lag.

JackCY 07-07-2019 06:27 PM

Quote:

Originally Posted by WannaBeOCer (Post 28030096)
They aren't lying, they've had an option called "Maximum pre-rendered frames" under Manage 3D Settings for years that reduces the input lag. Might not be the same way as Anti-Lag but both end up reducing input lag.

It's set by most modern games to 1, and you can't go below 1 in NVCP either.

As speculated, it's like the pre-rendered frames setting.

https://www.techpowerup.com/review/a...5700-xt/2.html

There used to be a setting of 0, but that's no longer available and maybe not even possible on NV cards anymore, whereas AMD still has a hardware scheduler and may be able to do 0.

I couldn't find it tested anywhere, and with the once-again broken AMD drivers I don't blame anyone for not testing it; there are the 2060S, 2070S, 5700, 5700 XT and 3000-series CPUs to review, all in 1-2 weeks, because AMD and NV are nuts and launch at the same time at the same prices with near-identical value.

---

Again it's being explained as pre-rendered frames:


DX11 only. It lowers the driver delay, aka how much the CPU "pre-renders" frames for the GPU. I bet they set it to 0 and "synced CPU and GPU" instead of having a 1-3 frame buffer/pre-render; 1 is common, but for, say, CF/SLI one would use even more.

RamenRider 07-08-2019 10:04 PM

Quote:

Originally Posted by UltraMega (Post 28007352)
I don't see anything in the article to back up the claim in the headline. Is it weird that I knew who the OP was gonna be before I even opened this thread?

There are lots of ways to reduce image quality and in turn reduce input lag/increase FPS. AMD's claim is that if FPS is ~even, they will have lower input times and they backed this up with demos. I'm sure Nvidia will scramble to counter this somehow, but this isn't it.


Stop spreading propaganda for Nvidia.

Quote:

Originally Posted by VeritronX (Post 28007364)
The setting they're reffering to pretty much has to be the max pre-rendered frames one as it's the only one I can think of that's been in there that long and directly effects input lag. Pretty sure the default for most things is 3 frames (or at least I remember it being that for gta5) and you can turn it down to 1 frame these days. If you are running at 60fps locked then setting it to 1 frame would reduce your input lag by ~33.4ms.

The key difference here is AMD can reduce it to 0 frames because they still use a hardware scheduler in their gpu designs, while nvidia is stuck with a minimum of 1 frame delay because they do it in the driver on your cpu.

Quote:

Originally Posted by JackCY (Post 28007390)
The default on most games is 1 pre-rendered frame, if not do yourself a favor and force it to 1 in NVCP but who still uses 3 frames in their game engines?

I suppose Navi still has a HW scheduler and that's good I think. Nvidia moved this kind of load to CPU with Kepler hence their performance is more CPU tied and CPU limited and also loves to cause higher DPC latency as the driver takes long time sometimes/every new launch lol.

Where is that magic setting in NVCP to reduce latency? Please Nvidia do tell us, because pre-rendered frames isn't it, that's something that's supposed to be and is set by the application/game and should only be force changed in NVCP when the change works and doesn't cause issues.
Even Polaris 10 has almost 10ms advantage vs GP106, 3rd party measurements with a high speed camera. https://youtu.be/L42nx6ubpfg?t=850

RX580 vs GTX1060
16.87ms vs 25.33ms, no sync
Best cases: 16.87ms vs 18.87ms

There is latency to be cut down in drivers of both corporations. They bloat them endlessly year over year. 500MB NV driver is freakin' unreal. Not that AMD's 352MB is any better. That's some damn lot of code that one can fit into so many MBs.

The timestamp is in the comments, but I found out that the way Nvidia reduces input lag is by reducing image quality at a hardware level. I have a slight suspicion this is where the Nvidia driver DPC lag comes from. https://www.youtube.com/watch?v=OY8qvK5XRgA

ToTheSun! 07-09-2019 01:40 AM

Quote:

Originally Posted by RamenRider (Post 28032624)
The timestamp is in the comments but I found out the way NVidia reduces input lag is by reducing image quality on a hardware level. I have a slight suspicion this is where the nvidia driver DPC lag comes from. https://www.youtube.com/watch?v=OY8qvK5XRgA

Watch out! That DPC latency will definitely break everything in all ways possible! AFAIK, nVidia employ a software scheduler, as opposed to AMD's hardware scheduler, the former being the main culprit. In any case, sub-millisecond DPC latency spikes in no way influence the kind of latency one WOULD FEEL.

Now, if you could explain exactly what you mean by decreasing latency by reducing image quality "on a hardware level", I'm all ears.

JackCY 07-10-2019 09:33 AM

I will watch the PCW.

"This is our start of a new naming system. To avoid confusion." LOL that's each launch haha.

7970, old system
280x, new system same product
290, 390, 280, old system
Fury, new system
480, 580, 590 old system
Vega, new system
Radeon VII, new system
Navi, new system

They are only changing the naming to be able to charge higher prices, confuse non-technical customers, etc.
Now they start at 5700 so that by the time they get to 8000 they will change it again.

"Windows 1903 update scheduler improvement"
So far it seems like a joke for the most part when tested by 3rd parties. The best bet is still to have something like Process Lasso that will force-lock programs to a CCX and CCD, only expanding when necessary.

NV does reduce image quality, and so does AMD very likely; it was the old war from when they were competitive, when they tried to find any way to boost performance, especially at the time by reducing texture filtering quality. Even today the default on NV cards is Quality = with optimizations, but you can select High Quality, which supposedly disables the optimizations, while there are also Performance and High Performance for boosting those benchmark scores. There is also LOD clamping and other settings.

None of this has anything to do with Nvidia's DPC latency issues, which are more likely caused by a bit of carelessness when launching a new series with a new driver.

I do often like AMD's architectures more, but it can't be denied that their focus on performance/clocks was minimal until RDNA; they pushed for parallelism instead. A HW scheduler is good to have. Even with RDNA it doesn't seem they figured out the difference between NV's architecture and theirs, and as such some game engines again drop performance a lot on RDNA when they are optimized for Maxwell/Paxwell/Tuxwell. There are more features in Tuxwell too at the moment, and that has not always been the case - and I don't mean ray tracing etc., I mean the pure graphics capability support. The H264 encoder on AMD is again an afterthought = broken and unusable quality; meanwhile I think they changed their HEVC encoder entirely - not sure, but didn't they move it to shaders or something? There was a video or article explaining this a while ago - so that HEVC is now fairly beastly, and they don't have a session limit as NV GeForce does. This is likely because RDNA is going into the more modern streaming platforms, be it Xbox, PS5 or Stadia, etc.; these finally want to move to HEVC, although I wonder why not AV1 at this point.


RIS is a driver toggle, but CAS (FidelityFX), the good sharpening, is developer-only, aka devs need to use it for users to see it.
Anti-Lag supposedly isn't the number of back buffers, and we were supposed to see on July 7th what it is (well, we still haven't - show us the source code then, I want to see). Pre-rendered frames are something other than back buffers. It's not recommended for AAA games (varying results), only for eSports games. I bet it's pre-rendered frames = 0 thanks to the hardware scheduler, aka CPU and GPU in sync, but for graphics-heavy games you really need pre-rendered frames = 1 for smoother frametimes and performance.

If you want, you can still inject ReShade and code your own CAS or grab other sharpening you like. I use FineSharp in games that need sharpening after AA, and it can be tuned very well to minimize the sharpening artifacts. I can't stand the halo artifacts of sharpening, but FineSharp in ReShade can be tuned fairly well.
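
For anyone curious what that kind of tunable sharpening looks like, here is a minimal plain unsharp-mask sketch in Python/numpy - explicitly not AMD's CAS and not ReShade's FineSharp, just the generic idea: add back the difference between the image and a blurred copy, where pushing `strength` too high is exactly what produces those halo artifacts.

Code:

import numpy as np

def unsharp_mask(img, strength=0.5, radius=1):
    # Generic unsharp masking (illustrative only): sharpen a 2D float image in
    # [0, 1] by adding back the high-frequency detail (image minus box blur).
    pad = np.pad(img, radius, mode="edge")
    blur = np.zeros_like(img)
    k = 2 * radius + 1
    for dy in range(k):                      # naive box blur as the low-pass
        for dx in range(k):
            blur += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    blur /= k * k
    return np.clip(img + strength * (img - blur), 0.0, 1.0)

# e.g. unsharp_mask(frame, 0.3) for mild sharpening; 1.5+ starts to show halos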

WannaBeOCer 07-11-2019 03:45 PM

Quote:

Originally Posted by JackCY (Post 28030422)
It's set by most modern games to 1. And you can't go below 1 in NVCP either.

As speculated it's like pre rendered frames setting.

https://www.techpowerup.com/review/a...5700-xt/2.html

There used to be setting of 0 but that's not available and maybe not even possible for NV cards anymore. Where as AMD has hardware scheduler still and may do a 0.

I couldn't find it tested anywhere and with the again broken AMD drivers I don't blame anyone for not testing it, there are 2060S, 2070S, 5700, 5700XT, 3000 series CPUs to review all in 1-2 weeks because AMD and NV are nuts and have price fix launch at same time at same prices with near identical value.

---

Again being explained as pre rendered frames:


DX11 only. Lowering the driver delay, aka how much CPU "pre renders" frames for GPU. I bet they set it to 0 and "synced CPU and GPU" instead of having 1-3 frames buffer/pre render, 1 is common but for say CF/SLI one would use even more.

They
Quote:

removed the 0 value because it never had an impact, and is, in fact, impossible. The GPU cannot render any frames without the CPU having prepared it first.

JackCY 07-12-2019 09:14 PM

0 means it's in sync, but that also means more variability, and the sync isn't perfect. Whereas with 1+ there is always/often a frame ready for the GPU to process, because it has been pre-rendered/pre-prepared by the CPU for the GPU.

0 is possible, but you may lose some smoothness of frame times and have a higher average frame time.

WannaBeOCer 07-12-2019 09:16 PM

Quote:

Originally Posted by JackCY (Post 28040426)
0 means it's in sync but that also means more variability and the sync isn't perfect. Where as with 1+ there is always/often a frame for GPU to process because it has been pre-rendered/pre-prepared by CPU for the GPU.

0 is possible but you may lose some smoothness of frame times and have higher average frame time.

That quote was from nVidia's forum, where they indicate the reason they removed "0".

I would post the link again, but the last time I did, it was removed.

JackCY 07-12-2019 11:20 PM

Ah, OK; well, NV is a different design from what AMD has, so it's expected that NV "can't do it".
It will be nice to see if anyone with the cards and tools digs in to figure out precisely what they improved, beyond guessing at what it may be.

RamenRider 07-15-2019 02:28 PM

Quote:

Originally Posted by ToTheSun! (Post 28032780)
Watch out! That DPC latency will definitely break everything in all ways possible! AFAIK, nVidia employ a software scheduler, as opposed to AMD's hardware scheduler, the former of which being the main culprit. In any case, sub-milisecond DPC latency spikes in no way influence the kind of latency one WOULD FEEL.

Now, if you could explain exactly what you mean by decreasing latency by reducing image quality "on a hardware level", I'm all ears.

Ah, a software scheduler. That explains it. Watch the video I linked; I don't know the exact time, but it should be in the Navi section. They tested the same scenes frame by frame with different GPUs.

