
kevindd992002

Discussion starter · #1 ·
I didn't install GeForce Experience, so I simply enabled the 1440p and 4K resolutions in NVCP and kept smoothness at its 33% default. I have a 1080p monitor and was expecting that using DSR would downsample without making the icons/text smaller and without making the screen blurry.

I tried it in CS:GO and it makes everything small and blurry. Am I not supposed to just notice that everything is sharper because it renders at 1440p/4K and still outputs 1080p? Am I doing this wrong?

Please help. Thanks.
 
DSR will shrink text and HUD.

To correct the blurring, lower the smoothness level to around 20% or lower, depending on what you are aiming for.
 
Discussion starter · #3 ·
I see. Is DSR a technology that's commonly used for 1080p monitors?
 
Discussion starter · #4 ·
BUMP!
 
Quote:
Originally Posted by kevindd992002 View Post

I didn't install GeForce Experience, so I simply enabled the 1440p and 4K resolutions in NVCP and kept smoothness at its 33% default. I have a 1080p monitor and was expecting that using DSR would downsample without making the icons/text smaller and without making the screen blurry.

I tried it in CS:GO and it makes everything small and blurry. Am I not supposed to just notice that everything is sharper because it renders at 1440p/4K and still outputs 1080p? Am I doing this wrong?

Please help. Thanks.
Yes, you're wrong about how this works. To be fair, Nvidia marketed it wrong, leading people to expect a linear increase in quality with each step up in resolution. The GeForce performance reviews are especially misleading.
You have to scale to a resolution that divides cleanly back into your current one, or your screen will be a mess. If you are on 1080p, your only clean option is to scale to 4K, since then every native pixel corresponds to exactly four rendered pixels, and rasterizing an image back down from that is easy as pie.

If it is not an integer factor, though, you have to interpolate, which is horrible.* Nvidia solved this rather elegantly with bilinear filtering using a scalable kernel, but even that will always give you either a sharp image with unacceptable aliasing, or the lesser of two evils... blurriness. And a small step up from native resolution is a massive step backwards, since there is no clean way to interpolate it.

So multiply your resolution by at LEAST 2 (or a higher integer) in each direction and use that. For 1080p that is 1080*2=2160, aka 4K. Turn OFF DSR smoothness. That will solve the blurriness and give you increased quality. If you cannot pull off 4K (and DSR is even heavier than true 4K, due to the multiple framebuffers being worked with and the aforementioned interpolation), just stick to 1080p and increase MSAA. If you want more, you can force better algorithms in Nvidia Inspector.

edit: *technically you have to interpolate anyway, but an integer factor solves most of the problems.
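
To picture the clean integer case, here is a minimal Python/numpy sketch of a plain 2x2 box downsample (illustrative only; the real DSR filter is a driver-side resampling kernel tied to the smoothness slider, not this exact code):

Code:
import numpy as np

def box_downsample_2x(frame):
    """Average every 2x2 block of a (H, W, 3) frame into one output pixel.

    This is the clean integer case described above: 3840x2160 -> 1920x1080,
    where each native pixel is the exact average of four rendered pixels and
    no fractional interpolation is needed.
    """
    h, w, c = frame.shape
    assert h % 2 == 0 and w % 2 == 0, "integer 2x factor only"
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# Example: a rendered 4K frame brought back down to a 1080p screen
rendered_4k = np.random.rand(2160, 3840, 3)
native_1080p = box_downsample_2x(rendered_4k)
print(native_1080p.shape)  # (1080, 1920, 3)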
 
Discussion starter · #6 ·
Quote:
Originally Posted by lPizzal View Post

Yes, you're wrong about how this works. To be fair, Nvidia marketed it wrong, leading people to expect a linear increase in quality with each step up in resolution. The GeForce performance reviews are especially misleading.
You have to scale to a resolution that divides cleanly back into your current one, or your screen will be a mess. If you are on 1080p, your only clean option is to scale to 4K, since then every native pixel corresponds to exactly four rendered pixels, and rasterizing an image back down from that is easy as pie.

If it is not an integer factor, though, you have to interpolate, which is horrible.* Nvidia solved this rather elegantly with bilinear filtering using a scalable kernel, but even that will always give you either a sharp image with unacceptable aliasing, or the lesser of two evils... blurriness. And a small step up from native resolution is a massive step backwards, since there is no clean way to interpolate it.

So multiply your resolution by at LEAST 2 (or a higher integer) in each direction and use that. For 1080p that is 1080*2=2160, aka 4K. Turn OFF DSR smoothness. That will solve the blurriness and give you increased quality. If you cannot pull off 4K (and DSR is even heavier than true 4K, due to the multiple framebuffers being worked with and the aforementioned interpolation), just stick to 1080p and increase MSAA. If you want more, you can force better algorithms in Nvidia Inspector.

edit: *technically you have to interpolate anyway, but an integer factor solves most of the problems.
Thanks for the insight.

I thought 4K is 4.00x native resolution (1080p)? At least that's what's shown in NVCP, which I think is the multiplier for the total number of pixels. If you're multiplying the vertical and horizontal resolutions by 2, then you're right that it works out to 3840 x 2160 (4K) for 1080p.

So what can I do about the text and icons being small when I apply 4K DSR in games like CS:GO and Overwatch? Would it also change my "aim" in these games because of the smaller crosshair?

When you say turn off smoothness, does that mean I need to slide the bar all the way to 0%?

And since I'm using DSR, which AA settings can I lower (or even disable), given that the downsampling process in itself produces more or less the same effect as these?
 
Quote:
Originally Posted by kevindd992002 View Post

I thought 4K is 4.00x native resolution (1080p)? At least that's what's in NVCP which is I think is the multiplier for the total number of pixels. If you're multiplying the vertical and horizontal resolution by 2, then you're correct in 3840 x 2160 (4K) for 1080p though.
Doubling the sides quadruples the area. If you "only" want to double the area, you would have to multiply each side by the square root of 2, i.e. 2715.29 x 1527.35. Good luck interpolating that ;D
Having two pixels (or any other INTEGER number of them) to average in each direction makes rasterizing easy and artifact-free, so this is the only real way to improve image quality without messy interpolation.
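Worked out in numbers (illustrative Python, nothing driver-specific):

Code:
import math

native = (1920, 1080)

# Doubling each side quadruples the pixel count (the "4.00x" shown in NVCP):
integer_2x = (native[0] * 2, native[1] * 2)   # (3840, 2160): a clean 2x2 average per native pixel

# "Only" doubling the area means multiplying each side by sqrt(2):
double_area = (native[0] * math.sqrt(2), native[1] * math.sqrt(2))

print(integer_2x)    # (3840, 2160)
print(double_area)   # (~2715.29, ~1527.35) -> fractional, so every output pixel needs interpolation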
Quote:
Originally Posted by kevindd992002 View Post

So what can I do with the text and icons being small when I apply 4K DSR in games like CSGO and Overwatch?
Nothing. You render at 4K, so the game thinks you are at 4K. If the game sizes its UI elements in fixed pixels (which is the easiest way to implement them), you will get a smaller UI at bigger resolutions. Many games have since added UI scaling because of this. If yours hasn't, you are out of luck.

edit: I recall 4K mods for some games, which force a bigger resolution for the UI elements by upscaling them and putting the textures back into the game. You might want to dig for something like that, or mod the games yourself.
Quote:
Originally Posted by kevindd992002 View Post

Would it also change my "aim" in these games because of the smaller crosshair?
Aim is based on movement in degrees, not pixels (unless some bizarre and idiotic developer did it otherwise). So while on the menu screen your mouse moves at half the speed at 4K vs 1080p, in 3D it should stay the same, since the calculation goes from DPI straight to the angle moved. Only visually does the crosshair shrink, and you can easily scale it back up to "normal".
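Quick sanity check of that (a toy Python calculation; the 0.022 degrees-per-count yaw is just the commonly cited CS:GO-style default, used here only as an example):

Code:
# Angle turned in 3D depends only on mouse counts and sensitivity,
# never on the output resolution.
M_YAW_DEG_PER_COUNT = 0.022   # example value (CS:GO-style default)
sensitivity = 2.0
mouse_counts = 400            # counts reported by the mouse for one swipe

degrees_turned = mouse_counts * sensitivity * M_YAW_DEG_PER_COUNT
print(degrees_turned)         # 17.6 degrees, whether you render at 1080p, 1440p or 4K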
Quote:
Originally Posted by kevindd992002 View Post

When you say turn off smoothness, does that mean I need to slide the bar all the way to 0%?
Yes
Quote:
Originally Posted by kevindd992002 View Post

And since I'm using DSR, which AA settings can I lower down (or even disable) as the downsampling process in itself produces more or less the same effect as these?
It definitely produces "more" than MSAA. A doubling in resolution is a bit more than MSAA on edges, plus more information overall. In a perfect world, 4xMSAA would sample vector lines up to a 4K standard. In real life it doesn't, since the samples are not distributed perfectly along the line. MSAA also doesn't sample everything else, like complex shaders. This is the reason developers dropped MSAA in newer deferred engines, since implementing MSAA there is hella hard. So in a perfect mathematical sense, looking only at lines, 4K is the same as 4x MSAA. Since that is nowhere near the real case, it ends up above that (~4xMSAA + shader AA) and differs from game to game. So just switch back and forth and find out what it really amounts to.
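
As a rough back-of-the-envelope comparison of the work involved (illustrative numbers only; real MSAA only adds coverage samples at geometry edges, which is exactly the limitation described above):

Code:
# Rough sample-budget comparison for a 1920x1080 output.
native_pixels = 1920 * 1080

# 4x MSAA: up to 4 coverage samples per pixel on geometry edges,
# but shaders still run roughly once per pixel.
msaa4x_shader_work = native_pixels * 1
msaa4x_edge_samples = native_pixels * 4

# 4K DSR (2x per axis): everything - geometry AND shaders - is evaluated 4x.
dsr4k_shader_work = (1920 * 2) * (1080 * 2)

print(msaa4x_shader_work, msaa4x_edge_samples, dsr4k_shader_work)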

As you can see, tricking the game into believing you run at double the resolution brings many problems. I usually end up just increasing quality through Nvidia Inspector.

edit: mixed up sample counts for MSAA
 
Quote:
Originally Posted by lPizzal View Post

Yes, you're wrong about how this works. To be fair, Nvidia marketed it wrong, leading people to expect a linear increase in quality with each step up in resolution. The GeForce performance reviews are especially misleading.
You have to scale to a resolution that divides cleanly back into your current one, or your screen will be a mess. If you are on 1080p, your only clean option is to scale to 4K, since then every native pixel corresponds to exactly four rendered pixels, and rasterizing an image back down from that is easy as pie.

If it is not an integer factor, though, you have to interpolate, which is horrible.* Nvidia solved this rather elegantly with bilinear filtering using a scalable kernel, but even that will always give you either a sharp image with unacceptable aliasing, or the lesser of two evils... blurriness. And a small step up from native resolution is a massive step backwards, since there is no clean way to interpolate it.

So multiply your resolution by at LEAST 2 (or a higher integer) in each direction and use that. For 1080p that is 1080*2=2160, aka 4K. Turn OFF DSR smoothness. That will solve the blurriness and give you increased quality. If you cannot pull off 4K (and DSR is even heavier than true 4K, due to the multiple framebuffers being worked with and the aforementioned interpolation), just stick to 1080p and increase MSAA. If you want more, you can force better algorithms in Nvidia Inspector.

edit: *technically you have to interpolate anyway, but an integer factor solves most of the problems.
Yep, put DSR smoothness at 0%. Some games it just doesn't work well in at all; Dungeon Siege III is one game it works really well in, though, for some reason, lol. Just FYI.
 
Discussion starter · #9 ·
Quote:
Originally Posted by lPizzal View Post

Doubling the sides quadruples the area. If you "only" want to double the area, you would have to multiply each side by the square root of 2, i.e. 2715.29 x 1527.35. Good luck interpolating that ;D
Having two pixels (or any other INTEGER number of them) to average in each direction makes rasterizing easy and artifact-free, so this is the only real way to improve image quality without messy interpolation.
Nothing. You render at 4K, so the game thinks you are at 4K. If the game sizes its UI elements in fixed pixels (which is the easiest way to implement them), you will get a smaller UI at bigger resolutions. Many games have since added UI scaling because of this. If yours hasn't, you are out of luck.

edit: I recall 4K mods for some games, which force a bigger resolution for the UI elements by upscaling them and putting the textures back into the game. You might want to dig for something like that, or mod the games yourself.
Aim is based on movement in degrees, not pixels (unless some bizarre and idiotic developer did it otherwise). So while on the menu screen your mouse moves at half the speed at 4K vs 1080p, in 3D it should stay the same, since the calculation goes from DPI straight to the angle moved. Only visually does the crosshair shrink, and you can easily scale it back up to "normal".
Yes
It definitely produces "more" than MSAA. A doubling in resolution is a bit more than MSAA on edges, plus more information overall. In a perfect world, 4xMSAA would sample vector lines up to a 4K standard. In real life it doesn't, since the samples are not distributed perfectly along the line. MSAA also doesn't sample everything else, like complex shaders. This is the reason developers dropped MSAA in newer deferred engines, since implementing MSAA there is hella hard. So in a perfect mathematical sense, looking only at lines, 4K is the same as 4x MSAA. Since that is nowhere near the real case, it ends up above that (~4xMSAA + shader AA) and differs from game to game. So just switch back and forth and find out what it really amounts to.

As you can see, tricking the game into believing you run at double the resolution brings many problems. I usually end up just increasing quality through Nvidia Inspector.

edit: mixed up sample counts for MSAA
I'm beginning to understand this better. Regarding the type of AA you use, do you consider MSAA the most practical (or even the best) to use? I thought FXAA was even better?

So if I understand correctly, DSR is always better than increasing the AA setting, but because of the numerous problems it presents, sticking to the native resolution with increased AA will do, correct?

On another note, I'm playing GOW4 now and have another question. If you use DSR together with Dynamic Resolution Scaling on a 1080p monitor, do you get as smooth an FPS as possible? With this mode, do you still need to enable G-SYNC if the game tries to maintain a smooth FPS by dynamically changing the resolution?
 
Quote:
Originally Posted by kevindd992002 View Post

I'm beginning to understand this better. Regarding the type of AA you use, do you consider MSAA the most practical (or even the best) to use? I thought FXAA was even better?

So if I understand correctly, DSR is always better than increasing the AA setting, but because of the numerous problems it presents, sticking to the native resolution with increased AA will do, correct?
The best AA is the one that gives you the most samples per pixel. It so happens that MSAA used to be the best at that, since it works well at a given resolution and the industry spent years figuring out how to puke samples all over your image. With newer shading techniques and deferred rendering this unfortunately stopped being efficient, as every new shader effect that produces flickering needs to be sampled on its own again. You cannot simply start puking samples for geometry lines, then for shader A, then for shader B; it becomes too much. So the industry abandoned the idea, and the only way in newer engines to get more samples for a cleaner, more refined image is to supersample, aka downsample.
So MSAA is unfortunately not practical anymore in newer games (although it still is in the Source engine).

FXAA does not increase the sample count at all. It basically does the following: Photoshop filter -> find edges; where an edge is found, blur the screen by 2 pixels. It literally just blurs the lines and is thus not a very smart thing to do. It basically lowers the entropy and thus your perception of aliasing. If the image is small, you won't notice the difference and the aliased lines will be gone, so win-win. On a big screen, or if you are close to the monitor, you will notice the blurring, aka the "vaseline" effect, as some have dubbed it. So FXAA does not "increase" image quality; it just decreases information in the parts where you would notice aliasing, by blurring them.
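
A toy numpy sketch of that "find edges, then blur only there" idea (this is NOT NVIDIA's FXAA shader, just an illustration of why no new information is added):

Code:
import numpy as np

def toy_post_process_aa(img, threshold=0.1):
    """Detect luminance edges, then blur only those pixels.

    Illustrative stand-in for FXAA-style post-process AA: no extra samples
    are rendered, information is only smeared where aliasing would show.
    """
    luma = img @ np.array([0.299, 0.587, 0.114])            # per-pixel luminance
    gx = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1]))
    gy = np.abs(np.diff(luma, axis=0, prepend=luma[:1, :]))
    edges = (gx + gy) > threshold                           # crude edge mask

    # 3x3 box blur of the whole frame (cheap stand-in for FXAA's directional blur)
    padded = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    h, w = img.shape[:2]
    blurred = sum(padded[dy:dy + h, dx:dx + w]
                  for dy in range(3) for dx in range(3)) / 9.0

    out = img.copy()
    out[edges] = blurred[edges]                             # blur only at detected edges
    return out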

Supersampling, or DSR, is the best way of doing things, as you simply put more information into the final image, so it comes out cleaner. But rendering everything 4 times over for 4K is simply too taxing, and UI scaling sucks across the board, so prepare for small text. Hell, even Windows still has not figured it out, with blurred menus here and there. There are other algorithms which do AA in smarter ways, like SMAA, which classifies aliased lines based on their shapes and blurs accordingly, instead of the "brute force" blur-it-all approach of FXAA.
A blue L-shaped pixel arrangement will receive more blur than a red step-shaped pixel arrangement, since the perceived aliasing in an L shape is bigger; fun stuff like this. (SMAA is obviously smarter than that, but this is the gist of it; read here for more detail.) So SMAA, for instance, is even better than FXAA.

The newest trend is to average across multiple frames, aka temporal AA, with many flavors here as well. Everyone tries to somehow cheat more samples into the image, since actually computing them is so taxing. So the smarter the algorithms get, the better they are at hiding aliasing instead of removing it.
Quote:
Originally Posted by kevindd992002 View Post

On another note, I'm playing GOW4 now and have another question. If you use DSR together with Dynamic Resolution Scaling on a 1080p monitor, do you get as smooth FPS as possible? With this mode, do you still need to enable G-SYNC if the game tries to maintain a smooth FPS by dynamically changing the resolution?
Enabling G-Sync is always a good idea, since it removes the tearing vs. delay+stutter-with-V-Sync problem. So always enable it. It makes things look just as smooth at 59 fps as at 60, whereas 59 fps not being an integer divisor of a 60 Hz refresh rate used to be ****ty, resulting either in stutter from dropping to an integer divisor (aka a drop to 30) or in tearing. G-Sync cheats this by lowering the refresh rate of the monitor to 59 as well, win-win.
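
The timing difference, roughly (illustrative arithmetic only):

Code:
# Frame pacing at ~59 fps on a fixed 60 Hz display vs. G-Sync (illustrative).
render_time_ms = 1000 / 59    # ~16.95 ms per frame from the GPU
vsync_tick_ms = 1000 / 60     # fixed 60 Hz: frames can only be shown on 16.67 ms ticks

# With V-Sync, a frame that misses its tick is held for a whole extra refresh,
# so pacing alternates between 16.67 ms and 33.33 ms -> visible stutter
# (or the driver falls back to an even divisor like 30 fps).
# With G-Sync the monitor refreshes whenever the frame is ready,
# so every frame is shown after ~16.95 ms and pacing stays even.
print(round(render_time_ms, 2), round(vsync_tick_ms, 2))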

As for DSR vs. dynamic resolution, see for yourself how it acts and what it does. If you know it will drop the resolution, then don't do it, as consistently gaming at 1080p is better than jumping back and forth between 4K and 1080p.
 
Discussion starter · #11 ·
Quote:
Originally Posted by lPizzal View Post

The best AA is the one that gives you the most samples per pixel. It so happens that MSAA used to be the best at that, since it works well at a given resolution and the industry spent years figuring out how to puke samples all over your image. With newer shading techniques and deferred rendering this unfortunately stopped being efficient, as every new shader effect that produces flickering needs to be sampled on its own again. You cannot simply start puking samples for geometry lines, then for shader A, then for shader B; it becomes too much. So the industry abandoned the idea, and the only way in newer engines to get more samples for a cleaner, more refined image is to supersample, aka downsample.
So MSAA is unfortunately not practical anymore in newer games (although it still is in the Source engine).

FXAA does not increase the sample count at all. It basically does the following: Photoshop filter -> find edges; where an edge is found, blur the screen by 2 pixels. It literally just blurs the lines and is thus not a very smart thing to do. It basically lowers the entropy and thus your perception of aliasing. If the image is small, you won't notice the difference and the aliased lines will be gone, so win-win. On a big screen, or if you are close to the monitor, you will notice the blurring, aka the "vaseline" effect, as some have dubbed it. So FXAA does not "increase" image quality; it just decreases information in the parts where you would notice aliasing, by blurring them.

Supersampling, or DSR, is the best way of doing things, as you simply put more information into the final image, so it comes out cleaner. But rendering everything 4 times over for 4K is simply too taxing, and UI scaling sucks across the board, so prepare for small text. Hell, even Windows still has not figured it out, with blurred menus here and there. There are other algorithms which do AA in smarter ways, like SMAA, which classifies aliased lines based on their shapes and blurs accordingly, instead of the "brute force" blur-it-all approach of FXAA.
A blue L-shaped pixel arrangement will receive more blur than a red step-shaped pixel arrangement, since the perceived aliasing in an L shape is bigger; fun stuff like this. (SMAA is obviously smarter than that, but this is the gist of it; read here for more detail.) So SMAA, for instance, is even better than FXAA.

The newest trend is to average across multiple frames, aka temporal AA, with many flavors here as well. Everyone tries to somehow cheat more samples into the image, since actually computing them is so taxing. So the smarter the algorithms get, the better they are at hiding aliasing instead of removing it.
Ok. So which is better between TXAA and SMAA? And is SMAA natively an option in most modern games now? What does Nvidia Inspector do in these AA settings (I haven't used it before)?
Quote:
Originally Posted by lPizzal View Post

Enabling G-Sync is always a good idea, since it removes the tearing vs. delay+stutter-with-V-Sync problem. So always enable it. It makes things look just as smooth at 59 fps as at 60, whereas 59 fps not being an integer divisor of a 60 Hz refresh rate used to be ****ty, resulting either in stutter from dropping to an integer divisor (aka a drop to 30) or in tearing. G-Sync cheats this by lowering the refresh rate of the monitor to 59 as well, win-win.

As for DSR vs. dynamic resolution, see for yourself how it acts and what it does. If you know it will drop the resolution, then don't do it, as consistently gaming at 1080p is better than jumping back and forth between 4K and 1080p.
Yeah, I get what you mean.

So what you're essentially saying is that a fixed resolution (w/ variable FPS but G-SYNC enabled) is better than a dynamic resolution (w/ fixed FPS and G-SYNC enabled)?

The reason I was asking about this is because of what I read about Dynamic Resolution Scaling in Nvidia's GOW 4 performance guide here: http://www.geforce.com/whats-new/guides/gears-of-war-4-graphics-and-performance-guide#gears-of-war-4-dynamic-resolution-scaling . They always market these features as "better" (better to use DSR together with Dynamic Resolution Scaling), and I'm not sure if that's true.
 
Quote:
Originally Posted by kevindd992002 View Post

Ok. So which is better between TXAA and SMAA? And is SMAA natively an option in most modern games now?
TXAA does some fun hybrid trickery. It takes MSAA information and handles edges that way, then takes temporal information from the previous frame to AA shaders and to smooth out edges even further than MSAA already does, to prevent crawling lines even more.
It is the best AA solution for removing crawling lines, but it juggles multiple inputs and interpolates quite hard. To hide that, it blurs a lot. It's very taxing and sacrifices kind of every best part of each individual technique while combining all of their cons, just to remove crawling lines. Although it IS better than SMAA, as it DOES introduce more information.
This AA algorithm was very ambitious, trying to combine the best of both worlds, but ended up creating a blurry mess.
The sharpness of more samples is lost due to the hard temporal blurring, and the quickness of fast blur methods is lost due to sampling from MSAA.

SMAA shows up here and there; I recall Crysis 3 and some other games. You can use an SMAA injector to force a game to use that algorithm: just google the game plus "SMAA injector" and you may or may not find a program that adds this feature. Some developers don't bother, since the difference between a properly implemented, well-tuned FXAA and SMAA is small; many gamers argue against that and just call it lazy design, because although SMAA has more dials and settings that you have to customize per game, it is objectively better and smarter than FXAA. Inspector gives you access to the hidden front end of NVIDIA's driver, where you can tweak tons of stuff. There are a lot of driver-side AA implementations with different tweaks here and there. Just google all that is possible. You can do shenanigans like 32xMSAA, where you just puke tons of samples into the scene and get diminishing returns in image quality.
Quote:
Originally Posted by kevindd992002 View Post

So what you're essentially saying is that a fixed resolution (w/ variable FPS but G-SYNC enabled) is better than a dynamic resolution (w/ fixed FPS and G-SYNC enabled)?

The reason I was asking about this is because of what I read about Dynamic Resolution Scaling in Nvidia's GOW 4 performance guide here: http://www.geforce.com/whats-new/guides/gears-of-war-4-graphics-and-performance-guide#gears-of-war-4-dynamic-resolution-scaling . They always market these features as "better" (better to use DSR together with Dynamic Resolution Scaling), and I'm not sure if that's true.
First of all, FPS is never fixed. If it were, we would have solved 90% of the problems in realtime computing.
FPS is always variable. Sometimes games run faster, sometimes slower.
What G-Sync ON does is make the refresh rate of the monitor match the current fps. This solves the problems that arise when the FPS is not a common divisor of the refresh rate, along with everything that brings, like tearing. That can also be solved by locking the fps to a common divisor: if you cannot hold 60 fps anymore, you can drop down to 30 and display a frame every other monitor refresh. This sucks, though, and stutters like crazy.

As for the Nvidia claim:
It makes sense that if you game at 1080p above 60 fps, you can sometimes get 4K in low-demand scenes and thus improve the image quality when the scene is not demanding.
I argue against that.
The jumping between two quality levels is immersion-breaking for me. The counter-argument is that during fight scenes you don't notice the drop, since there is so much action. I would retort that a consistent experience is more important than switching back and forth, as such a process is not perfect.
The opinion on this is subjective. Nvidia obviously has a bias here, to praise their own technology. What you think is best depends in the end on you.
My opinion stands: consistent is better than variable, for immersion reasons.
 
I use TXAA if the game supports it. TXAA eliminates crawling aliasing when characters move in-game, but the image seems to become more blurred.

For MSAA, I combine MSAA with MFAA in NVCP; try MSAA 4x/2x + MFAA On.
MSAA 4x + MFAA On = MSAA 8x quality, with MSAA 4x performance.
MSAA 2x + MFAA On = MSAA 4x quality, with MSAA 2x performance.

SMAA SSAA will increase your internal resolution and smooth the aliasing naturally, like when you increase your resolution. But it will likely cripple performance if set too high.

Edit: yeah, I mean SSAA.
 
Discussion starter · #15 ·
Quote:
Originally Posted by lPizzal View Post

TXAA does some fun hybrid trickery. It takes MSAA information and handles edges that way, then takes temporal information from the previous frame to AA shaders and to smooth out edges even further than MSAA already does, to prevent crawling lines even more.
It is the best AA solution for removing crawling lines, but it juggles multiple inputs and interpolates quite hard. To hide that, it blurs a lot. It's very taxing and sacrifices kind of every best part of each individual technique while combining all of their cons, just to remove crawling lines. Although it IS better than SMAA, as it DOES introduce more information.
This AA algorithm was very ambitious, trying to combine the best of both worlds, but ended up creating a blurry mess.
The sharpness of more samples is lost due to the hard temporal blurring, and the quickness of fast blur methods is lost due to sampling from MSAA.

SMAA shows up here and there; I recall Crysis 3 and some other games. You can use an SMAA injector to force a game to use that algorithm: just google the game plus "SMAA injector" and you may or may not find a program that adds this feature. Some developers don't bother, since the difference between a properly implemented, well-tuned FXAA and SMAA is small; many gamers argue against that and just call it lazy design, because although SMAA has more dials and settings that you have to customize per game, it is objectively better and smarter than FXAA. Inspector gives you access to the hidden front end of NVIDIA's driver, where you can tweak tons of stuff. There are a lot of driver-side AA implementations with different tweaks here and there. Just google all that is possible. You can do shenanigans like 32xMSAA, where you just puke tons of samples into the scene and get diminishing returns in image quality.
First of all, FPS is never fixed. If it were, we would have solved 90% of the problems in realtime computing.
FPS is always variable. Sometimes games run faster, sometimes slower.
What G-Sync ON does is make the refresh rate of the monitor match the current fps. This solves the problems that arise when the FPS is not a common divisor of the refresh rate, along with everything that brings, like tearing. That can also be solved by locking the fps to a common divisor: if you cannot hold 60 fps anymore, you can drop down to 30 and display a frame every other monitor refresh. This sucks, though, and stutters like crazy.

As for the Nvidia claim:
It makes sense that if you game at 1080p above 60 fps, you can sometimes get 4K in low-demand scenes and thus improve the image quality when the scene is not demanding.
I argue against that.
The jumping between two quality levels is immersion-breaking for me. The counter-argument is that during fight scenes you don't notice the drop, since there is so much action. I would retort that a consistent experience is more important than switching back and forth, as such a process is not perfect.
The opinion on this is subjective. Nvidia obviously has a bias here, to praise their own technology. What you think is best depends in the end on you.
My opinion stands: consistent is better than variable, for immersion reasons.
Yeah, I shouldn't have used the term "fixed FPS", my bad. Does a game usually only have one type of AA available to it? And if I enable 4K DSR, would enabling any type of AA do more harm than good?

I see. I guess it's really a "to each his own" kinda thing regarding the dynamic resolution topic.
 
Quote:
Originally Posted by kevindd992002 View Post

Does a game usually only have one type of AA available to it? And if I enable 4K DSR, would enabling any type of AA do more harm than good?
Usually games have a big chunk of AA techniques available, with many flavors and performance impacts. In newer engines, the techniques that throw more samples at the problem tend not to exist, because they are too difficult to manage properly. So we see many games shipping with only the post-process ones, like MLAA, SMAA, FXAA, etc...
Some engines have actually managed both, like CryEngine and Frostbite, but there the performance impact is very painful. Crysis 3 with MSAA... good luck.
But some ****ty console ports don't bother adding these options. When a game only has "AA on" and "AA off", it is a really lazy port.

Enabling AA with DSR should always be an improvement, unless blur comes into play, and even then the post-process blurring methods don't blur as much. At your native screen res, FXAA might blur 4 pixels (a 2x2 kernel) together. At 4K it still blurs 4 rendered pixels, but after the downsample you only see about 1 native pixel being "blurred", so hardly any blur is produced, since you have 4 times the res. However, this differs from case to case, and the biggest problem is obviously performance: just as you quadruple the res, the AA now has quadruple the performance impact, since it works on 4 times as many pixels.
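
In numbers, the blur part of that argument looks like this (assuming a fixed blur footprint in rendered pixels; purely illustrative):

Code:
# A post-process AA pass blurs over a fixed footprint in *rendered* pixels.
blur_width_rendered_px = 2

# At native 1080p, 2 rendered pixels are 2 pixels on your screen.
blur_on_screen_native = blur_width_rendered_px            # ~2 screen pixels

# At 4K DSR the image is averaged down by 2x per axis afterwards,
# so the same 2-rendered-pixel blur covers only ~1 pixel of the final 1080p image.
blur_on_screen_dsr4k = blur_width_rendered_px / 2          # ~1 screen pixel

print(blur_on_screen_native, blur_on_screen_dsr4k)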
Quote:
Originally Posted by kevindd992002 View Post

I guess it's really a "to each his own" kinda thing regarding the dynamic resolution topic.
So is all of PC graphics; some prefer resolution over quality over framerate, and some balance the three differently. Which is the reason flamewars over this have started ;D
The best is what you like most; don't get dragged along by someone else's opinion.
 
Discussion starter · #18 ·
Quote:
Originally Posted by lPizzal View Post

Usually games have a big chunk of AA techniques available, with many flavors and performance impacts. In newer engines, the techniques that throw more samples at the problem tend not to exist, because they are too difficult to manage properly. So we see many games shipping with only the post-process ones, like MLAA, SMAA, FXAA, etc...
Some engines have actually managed both, like CryEngine and Frostbite, but there the performance impact is very painful. Crysis 3 with MSAA... good luck.
But some ****ty console ports don't bother adding these options. When a game only has "AA on" and "AA off", it is a really lazy port.

Enabling AA with DSR should always be an improvement, unless blur comes into play, and even then the post-process blurring methods don't blur as much. At your native screen res, FXAA might blur 4 pixels (a 2x2 kernel) together. At 4K it still blurs 4 rendered pixels, but after the downsample you only see about 1 native pixel being "blurred", so hardly any blur is produced, since you have 4 times the res. However, this differs from case to case, and the biggest problem is obviously performance: just as you quadruple the res, the AA now has quadruple the performance impact, since it works on 4 times as many pixels.
So is all of PC graphics; some prefer resolution over quality over framerate, and some balance the three differently. Which is the reason flamewars over this have started ;D
The best is what you like most; don't get dragged along by someone else's opinion.
Alright. Is Texture Filtering different from AA? And what does Motion Blur do?

What do I get if I DSR at 1440p instead of 4K? Isn't 1440p DSR better than no DSR at all?
 
Texture filtering is different. So that textures don't flicker, games use a technique called mip-mapping, where each texture always has smaller versions, each half the size of the previous one, all the way down to the last pixel. Blending between those levels is what texture filtering does. If it blends purely by pixel coverage, you get dumb things like textures losing sharpness when viewed at an angle. Anisotropic filtering creates angled versions on the fly and applies them. Modern cards do this with essentially no performance impact, so you can set it to max and you're fine; you can rarely even measure a performance hit. Below 4x aniso, things start to look ugly when looking along flat floors and walls.
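
For reference, the mip chain described above (each level half the previous one, down to 1x1) can be listed like this (simple Python sketch; the actual blending and anisotropic sampling happen in hardware):

Code:
def mip_chain(width, height):
    """List every mip level of a texture: each level halves both sides
    (rounding down, minimum 1) until a 1x1 level is reached."""
    levels = [(width, height)]
    while width > 1 or height > 1:
        width, height = max(width // 2, 1), max(height // 2, 1)
        levels.append((width, height))
    return levels

print(mip_chain(1024, 1024))
# [(1024, 1024), (512, 512), (256, 256), ..., (2, 2), (1, 1)]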
Motion blur, google it; it simply blurs when you move, to create more immersion with movement. But it's usually horribly overdone, so many people just turn it off. It also helps at low framerates, selling a movement and hiding low frame rates when moving the camera (where they are most apparent). This is the reason consoles use tons of it, since it can hide bad performance when using a velocity-based control input like a controller. At high framerates it tends to detract from the experience for some, while for others it adds immersion. Technically it is blur trailing behind the motion, which is not how it works in real life; post-process motion blur is biased towards one side of time. So it's not the real thing from real life.

1440p DSR... if you can live with the lower framerate and only a marginal increase in image quality, while dealing with a small UI and a possible loss in sharpness, go ahead. It is a subjective thing. I would not do it; it has many drawbacks.
 
Discussion starter · #20 ·
Quote:
Originally Posted by lPizzal View Post

Texture filtering is different. So that textures don't flicker, games use a technique called mip-mapping, where each texture always has smaller versions, each half the size of the previous one, all the way down to the last pixel. Blending between those levels is what texture filtering does. If it blends purely by pixel coverage, you get dumb things like textures losing sharpness when viewed at an angle. Anisotropic filtering creates angled versions on the fly and applies them. Modern cards do this with essentially no performance impact, so you can set it to max and you're fine; you can rarely even measure a performance hit. Below 4x aniso, things start to look ugly when looking along flat floors and walls.
Motion blur, google it; it simply blurs when you move, to create more immersion with movement. But it's usually horribly overdone, so many people just turn it off. It also helps at low framerates, selling a movement and hiding low frame rates when moving the camera (where they are most apparent). This is the reason consoles use tons of it, since it can hide bad performance when using a velocity-based control input like a controller. At high framerates it tends to detract from the experience for some, while for others it adds immersion. Technically it is blur trailing behind the motion, which is not how it works in real life; post-process motion blur is biased towards one side of time. So it's not the real thing from real life.
Ok. I just find it hard to notice all of these differences when you're already immersed in the game (especially FPS games). That's just me though; I'm sure a lot of you guys really do notice them. So all is good. I mean, it's not something like tearing or FPS dips, which are very easy to recognize in-game.
 