
·
Registered
Joined
·
268 Posts
Discussion Starter · #1 · (Edited)
I swapped out my 2x16 B-die kit for a new 2x8 B-die kit (to test it before putting it into my daily) and I lost 18 fps in the Assassin's Creed Odyssey benchmark.

I'm not sure why this would happen.

Is there a reason why different B-die kits running at the same speed would give a different FPS count?

It's running with the same settings, and the figures in the BIOS have trained identically.

Is my 2x16 kit just that much better?

I need to swap my 2x16 kit back in to test for user error; this differential seems extreme. I'm even questioning whether the System Requirements published for ACO are correct, and whether it actually benefits from 32GB rather than 8GB.

I'm just worried I'm not going to get those 18 fps back. :rolleyes: Is it a training issue?

[screenshot]
 

·
Registered
Joined
·
268 Posts
Discussion Starter · #3 ·
DR vs SR. DR helps in games. To compare, all timings and frequency need to be the same.
It's just that CPU-Z is showing it as dual rank? I was hoping that's what it was. Do you think CPU-Z might be outputting an incorrect value?

Edit: Hang on... is CPU-Z saying it's dual channel, not dual rank?!

...so the reason for the drop in performance is that it's single rank?
 


·
Registered
Joined
·
612 Posts
I swapped out my 2x16 B-die kit for a new 2x8 B-die kit (to test it before putting it into my daily) and I lost 18 fps in the Assassin's Creed Odyssey benchmark.

I'm not sure why this would happen.

Is there a reason why different B-die kits running at the same speed would give a different FPS count?

It's running with the same settings, and the figures in the BIOS have trained identically.

Is my 2x16 kit just that much better?

I need to swap my 2x16 kit back in to test for user error; this differential seems extreme. I'm even questioning whether the System Requirements published for ACO are correct, and whether it actually benefits from 32GB rather than 8GB.

I'm just worried I'm not going to get those 18 fps back. :rolleyes: Is it a training issue?

View attachment 2544343
Very unlikely, given that your fps indicates that you are very strongly GPU bound! I used to get 90-110 average fps at 1440p with "sensible" settings in AC: Odyssey with a 10600K @ 4.9 GHz all-core and a 1080Ti (then a 10900KF and a 3080).

Seems more likely that your in-game settings changed/reset. Perhaps settings were changed by Nvidia GeForce Experience automatically? For example, was resolution scaling set to 1.4x before? That's going to absolutely eat fps.

Also, if you have the latest drivers installed that support the new DLDSR feature: I have played with it, and DLDSR definitely helps with image quality. Enable the 3413 x 1920 DLDSR resolution in Nvidia control panel, then try the settings below and report back:

3413 x 1920 resolution (DLDSR for a 1440p screen, really nice image quality)
Resolution scaling x 1.0 (not 1.4x)
Anti-aliasing: low (DLDSR makes AA pointless, but do not use "off" in this game as it kills fps for some reason)
Volumetric clouds: low (higher eats fps for little benefit)

DLDSR should offer the best image quality for the lowest performance impact, but try 2560 x 1440 resolution (with lower resolution scaling) to see how much DLDSR costs in terms of fps on your Titan RTX.

Edit: the fact that you get the same performance AGAIN with the new vs. old kit means that the performance change you initially reported was indeed due to a change in the game settings, and NOT the change of RAM.
 

·
Registered
Joined
·
268 Posts
Discussion Starter · #7 ·
Very unlikely, given that your fps indicates that you are very strongly GPU bound! I used to get 90-110 average fps at 1440p with "sensible" settings in AC: Odyssey with a 10600K @ 4.9 GHz all-core and a 1080Ti (then a 10900KF and a 3080).

Seems more likely that your in-game settings changed/reset. Perhaps settings were changed by Nvidia GeForce Experience automatically? For example, was resolution scaling set to 1.4x before? That's going to absolutely eat fps.

Also, if you have the latest drivers installed that support the new DLDSR feature: I have played with it, and DLDSR definitely helps with image quality. Enable the 3413 x 1920 DLDSR resolution in Nvidia control panel, then try the settings below and report back:

3413 x 1920 resolution (DLDSR for a 1440p screen, really nice image quality)
Resolution scaling x 1.0 (not 1.4x)
Anti-aliasing: low (DLDSR makes AA pointless, but do not use "off" in this game as it kills fps for some reason)
Volumetric clouds: low (higher eats fps for little benefit)

DLDSR should offer the best image quality for the lowest performance impact, but try 2560 x 1440 resolution (with lower resolution scaling) to see how much DLDSR costs in terms of fps on your Titan RTX.
I think it has got to be a settings problem; I'll have to run some more tests and make sure the settings stay the same. (y)

Edit: Also, thanks for the advice re. settings; I still don't have a first-hand, experiential understanding of how all of that stuff affects the image & frame rate.
 

·
Registered
Joined
·
612 Posts
I think it has got to be a settings problem; I'll have to run some more tests and make sure the settings stay the same. (y)

Edit: Also, thanks for the advice re. settings; I still don't have a first-hand, experiential understanding of how all of that stuff affects the image & frame rate.
Out of curiosity, I just tried 1440p (1.4x resolution scaling) vs. 3413 x 1920 (1.0x in-game resolution scaling and DLDSR). The first setting gave 96 fps, and the DLDSR setting gave 94 fps. However, the DLDSR setting looked a LOT crisper for near-identical performance.

Also, you need driver 511.X or later to enable DLDSR. I see you are using an old driver.
 

·
Cheesebumps!!
Joined
·
3,452 Posts
I swapped my 2x16 kit back in and... more odd results?

View attachment 2544350
Dual channel differs from dual rank.

Dual and single RANK refer to the rows of chips on a DDR4 stick. Single rank means only one side of your DDR4 stick is populated with memory dies; dual rank means it's populated with memory dies on both sides.

Dual rank performs, or rather WILL always perform, better in gaming (especially in scenarios/workloads that can make use of memory interleaving).
 

·
Registered
Joined
·
612 Posts
Dual channel differs from dual rank.

Dual and single RANK refer to the rows of chips on a DDR4 stick. Single rank means only one side of your DDR4 stick is populated with memory dies; dual rank means it's populated with memory dies on both sides.

Dual rank performs, or rather WILL always perform, better in gaming (especially in scenarios/workloads that can make use of memory interleaving).
Not when 100% GPU bound... 55 fps in this game with a 10600K is 100% GPU bound.
 

·
Cheesebumps!!
Joined
·
3,452 Posts
Not when 100% GPU bound... 55 fps in this game with a 10600K is 100% GPU bound.
That is why I said "especially in scenarios/workloads that can make USE of memory interleaving" - no guarantees whatsoever; it would really depend on the application you run.

Running 4x8GB single-rank DDR4 sticks would also work much the same as 2x16GB dual-rank sticks, though it can put extra load on your IMC, especially if you aim to overclock (which simply means you may need to add more IMC/SA voltage and VDIMM, and adjust some other areas as well).
 

·
Registered
Joined
·
268 Posts
Discussion Starter · #12 · (Edited)
Very unlikely, given that your fps indicates that you are very strongly GPU bound! I used to get 90-110 average fps at 1440p with "sensible" settings in AC: Odyssey with a 10600K @ 4.9 GHz all-core and a 1080Ti (then a 10900KF and a 3080).

Seems more likely that your in-game settings changed/reset. Perhaps settings were changed by Nvidia GeForce Experience automatically? For example, was resolution scaling set to 1.4x before? That's going to absolutely eat fps.

Also, if you have the latest drivers installed that support the new DLDSR feature: I have played with it, and DLDSR definitely helps with image quality. Enable the 3413 x 1920 DLDSR resolution in Nvidia control panel, then try the settings below and report back:

3413 x 1920 resolution (DLDSR for a 1440p screen, really nice image quality)
Resolution scaling x 1.0 (not 1.4x)
Anti-aliasing: low (DLDSR makes AA pointless, but do not use "off" in this game as it kills fps for some reason)
Volumetric clouds: low (higher eats fps for little benefit)

DLDSR should offer the best image quality for the lowest performance impact, but try 2560 x 1440 resolution (with lower resolution scaling) to see how much DLDSR costs in terms of fps on your Titan RTX.

Edit: the fact that you get the same performance AGAIN with the new vs. old kit means that the performance change you initially reported was indeed due to a change in the game settings, and NOT the change of RAM.
Thanks :0)

It turns out the resolution scaling had been turned back on somehow, so panic over. :0)

Out of curiosity, I just tried 1440p (1.4x resolution scaling) vs. 3413 x 1920 (1.0x in-game resolution scaling and DLDSR). The first setting gave 96 fps, and the DLDSR setting gave 94 fps. However, the DLDSR setting looked a LOT crisper for near-identical performance.

Also, you need driver 511.X or later to enable DLDSR. I see you are using an old driver.
Re. DLDSR: will that be added to the game menu when I update the video driver? Is it something different from resolution scaling?

How did you work out the increase for the resolution scaling?

I just dropped my resolution to 1080p and boosted resolution scaling to 1.8 (the 3,686,400 pixels of 1440p are about 177.78% of the 2,073,600 pixels of 1080p). The detail looked okay, but there was quite a lot of what I can only assume is what people call micro-stutter... it was a bit juddery.
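
To sanity-check that arithmetic, here's a quick sketch (plain Python; it assumes the in-game scaler simply multiplies the rendered pixel count, and uses the standard 2560x1440 and 1920x1080 pixel counts):

```python
# Sanity check of the scaling arithmetic above. Assumes the in-game
# resolution scale simply multiplies the rendered pixel count.
def pixels(width: int, height: int) -> int:
    return width * height

p1080 = pixels(1920, 1080)  # 2,073,600 pixels
p1440 = pixels(2560, 1440)  # 3,686,400 pixels

scale = p1440 / p1080       # ~1.7778, hence the ~1.8 in-game setting
print(f"{p1440:,} / {p1080:,} = {scale:.4f} ({scale:.2%} of 1080p)")
```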

That having been said, I did some benchmarking with AC Odyssey, and I'm getting about a 14 fps increase from my RAM overclock, but adding in my CPU overclock just seems to make things worse: it gives me +1 to the average frame rate but reduces my minimum and maximum frame rates by 8 fps. That's at 1080p on low settings with no resolution scaling. Is it possible that I'm still GPU-bound at those settings?

It's made me wonder if there's a gaming scenario in which having overclocked RAM is actually going to make a difference. Firstly, I have to be gaming at 1080p, which is likely, because I game in stereoscopic 3D, so I need a minimum of 120 fps; but if I'm playing at 1080p because I'm already GPU-bound, then is having fast RAM really going to help? I guess if the GPU is fast enough to retain some overhead at 1080p, but slow enough to have hit its limits at 1440p, then it might be beneficial to have fast/overclocked RAM... but what are the odds that this set of circumstances will ever occur?

Also, yes, re. the driver: I hadn't realised it hadn't been updated since my last Windows install; I'm going to update it now and run a few more benchmarks. (y)
 

·
Registered
Joined
·
612 Posts
Thanks :0)

It turns out the resolution scaling had been turned back on somehow, so panic over. :0)

Re. DLDSR: will that be added to the game menu when I update the video driver? Is it something different from resolution scaling?

How did you work out the increase for the resolution scaling?

I just dropped my resolution to 1080p and boosted resolution scaling to 1.8 (the 3,686,400 pixels of 1440p are about 177.78% of the 2,073,600 pixels of 1080p). The detail looked okay, but there was quite a lot of what I can only assume is what people call micro-stutter... it was a bit juddery.

That having been said, I did some benchmarking with AC Odyssey, and I'm getting about a 14 fps increase from my RAM overclock, but adding in my CPU overclock just seems to make things worse: it gives me +1 to the average frame rate but reduces my minimum and maximum frame rates by 8 fps. That's at 1080p on low settings with no resolution scaling. Is it possible that I'm still GPU-bound at those settings?

It's made me wonder if there's a gaming scenario in which having overclocked RAM is actually going to make a difference. Firstly, I have to be gaming at 1080p, which is likely, because I game in stereoscopic 3D, so I need a minimum of 120 fps; but if I'm playing at 1080p because I'm already GPU-bound, then is having fast RAM really going to help? I guess if the GPU is fast enough to retain some overhead at 1080p, but slow enough to have hit its limits at 1440p, then it might be beneficial to have fast/overclocked RAM... but what are the odds that this set of circumstances will ever occur?

Also, yes, re. the driver: I hadn't realised it hadn't been updated since my last Windows install; I'm going to update it now and run a few more benchmarks. (y)
How to use DLDSR:
Step 1. Install driver 511.X or later.
Step 2. Open Nvidia control panel > Manage 3D settings > DSR Factors. Under DL scaling, tick 1.78x (3413 x 1920) and 2.25x (3840 x 2160).
Step 3. Load your game of choice, and select either of the DLDSR resolutions above from within the game. (For fullscreen borderless games, i.e. DX12 games, you have to change the desktop to your desired DLDSR resolution before opening the game.)
Step 4. Enjoy the enhanced quality.

How to tell if you are GPU or CPU bound in AC: Odyssey
1. Press F1 until you see GPU usage in the HUD (works in-game and during the benchmark).
2. If GPU usage is >97%, you are GPU bound. Otherwise you are CPU bound.

How to tell if you are GPU or CPU bound in any game
1. Install and run Afterburner.
2. Ensure that the GPU usage monitoring graph is enabled in Afterburner.
3. If GPU usage is >97%, you are GPU bound. Otherwise you are CPU bound (a scriptable version of this check is sketched below).
NB/ It is unlikely that you would be CPU bound at 120 fps in any currently released game, but there are examples of badly coded / single-threaded games. The original Crysis comes to mind, where I couldn't break 40 fps once explosions started happening, even on a 10600K.
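
If you'd rather script that check than eyeball a graph, here's a minimal sketch (Python; it assumes an Nvidia card with nvidia-smi on the PATH and uses the same 97% rule of thumb):

```python
# Poll GPU utilization via nvidia-smi while the game runs, then apply the
# 97% rule of thumb from the steps above. Assumes an Nvidia GPU with
# nvidia-smi available on the PATH.
import subprocess
import time

def gpu_utilization() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])  # first GPU only

samples = []
for _ in range(10):                # one sample per second for 10 seconds
    samples.append(gpu_utilization())
    time.sleep(1)

avg = sum(samples) / len(samples)
print(f"average GPU usage: {avg:.0f}% -> "
      + ("GPU bound" if avg > 97 else "CPU bound"))
```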

How to find your maximum CPU-bound framerate / quality in any game
1. Run with low graphics settings at 720p.
2. Increase graphics resolution to your desired resolution.
3. Increase graphics settings until you start to dip below your desired framerate.
NB/ I have sometimes found gaming to be more stuttery when CPU-bound (vs. GPU-bound). In those cases, increasing graphics settings to get into a GPU-bound state is beneficial to smoothness.
 

·
Registered
Joined
·
268 Posts
Discussion Starter · #14 ·
How to use DLDSR:
Step 1. Install driver 511.X or later.
Step 2. Open Nvidia control panel > Manage 3D settings > DSR Factors. Under DL scaling, tick 1.78x (3413 x 1920) and 2.25x (3840 x 2160).
Step 3. Load your game of choice, and select either of the DLDSR resolutions above from within the game. (For fullscreen borderless games, i.e. DX12 games, you have to change the desktop to your desired DLDSR resolution before opening the game.)
Step 4. Enjoy the enhanced quality.
I'm having trouble getting the DLDSR options to display.

I've downloaded the driver and configured Nvidia control panel as per the above instructions, but I'm not seeing an option to choose DLDSR... do I just select the resolution with the resolution slider?
 

·
Registered
Joined
·
612 Posts
I'm having trouble getting the DLDSR options to display.

I've downloaded the driver and configured Nvidia control panel as per the above instructions, but I'm not seeing an option to choose DLDSR... do I just select the resolution with the resolution slider?
It never says "DLDSR" anywhere; it just works when you select one of the DLDSR resolutions in any game. If the resolution sticks, DLDSR is working. I noticed an immediate jump in crispness. However, if you are targeting 120 fps, you will likely not be able to hit that without an RTX 4090 or RTX 5080!
 

·
Registered
Joined
·
268 Posts
Discussion Starter · #16 · (Edited)
It never says "DLDSR" anywhere; it just works when you select one of the DLDSR resolutions in any game. If the resolution sticks, DLDSR is working. I noticed an immediate jump in crispness. However, if you are targeting 120 fps, you will likely not be able to hit that without an RTX 4090 or RTX 5080!
😊 AC Odyssey doesn't have a 3D fix yet; I have to wait until the 3D community makes one for it. 😊

Really I'm just using it for benchmarking, but yes, it does look really good. I followed the above steps and selected 2160p with the resolution slider, and it seems to have worked. The judder is still there towards the beginning of the benchmark, but I have to check whether it would be there regardless of the resolution I have it set to.

I'm getting just over 60 fps average at that resolution too... it's pretty much just within the limits of my hardware.

For stereoscopic 3D, I'd have to drop the resolution and the graphics settings, but DLDSR would probably really help to bring the quality back up (I think it's compatible with 3D?).

Edit: I'm going to run the benchmark with my RAM frequency dropped to stock, to see if it makes any difference to frame rate now that I'm GPU-bound.

Edit: Also, thanks for showing me that; I almost certainly would never have found that. :rolleyes:
 

·
Registered
Joined
·
612 Posts
😊 AC Odyssey doesn't have a 3D fix yet; I have to wait until the 3D community makes one for it. 😊

Really I'm just using it for benchmarking, but yes, it does look really good. I followed the above steps and selected 2160p with the resolution slider, and it seems to have worked. The judder is still there towards the beginning of the benchmark, but I have to check whether it would be there regardless of the resolution I have it set to.

I'm getting just over 60 fps average at that resolution too... it's pretty much just within the limits of my hardware.

For stereoscopic 3D, I'd have to drop the resolution and the graphics settings, but DLDSR would probably really help to bring the quality back up (I think it's compatible with 3D?).

Edit: I'm going to run the benchmark with my RAM frequency dropped to stock, to see if it makes any difference to frame rate now that I'm GPU-bound.

Edit: Also, thanks for showing me that; I almost certainly would never have found that. :rolleyes:
The judder at the start of the benchmark, and in many other games, happens because your CPU is trying to run the game AND load assets at the same time (you see the same thing in the Batman: Arkham Knight benchmark, for example).

Also, for AC: Odyssey and any other CPU-intensive games, make sure that CPU priority is set to "high": Task Manager > Details > right-click the .exe and set CPU priority to "high". I installed "Prio (Process Priority Saver)" to save my priority settings so I only need to do this once per game. It can help with stutter.
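
If clicking through Task Manager every launch gets old, here's a minimal sketch of the same idea (Python with the psutil package; the .exe name below is a placeholder, check the game's real process name in the Details tab):

```python
# Set a game's CPU priority to "high" (Windows). Requires the psutil
# package; "ACOdyssey.exe" is a hypothetical placeholder - use whatever
# the game's process is called in Task Manager's Details tab.
import psutil

TARGET = "ACOdyssey.exe"  # placeholder process name

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        proc.nice(psutil.HIGH_PRIORITY_CLASS)  # Windows-only priority class
        print(f"Set {TARGET} (pid {proc.pid}) to high priority")
```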

I used to play with 3D Vision for a while, then found that I much prefer playing games at ~100 fps or higher. I also noticed that 3D Vision 2 kits were selling for >£230 on eBay, and sold mine (back when I had a 1080Ti and the drivers stopped supporting 3D Vision). They seem to be selling for less these days. Are you forced to use an old driver to get 3D Vision to work on your card?
 

·
Registered
Joined
·
268 Posts
Discussion Starter · #18 ·
The judder at the start of the benchmark, and in many other games, happens because your CPU is trying to run the game AND load assets at the same time (you see the same thing in the Batman: Arkham Knight benchmark, for example).

Also, for AC: Odyssey and any other CPU-intensive games, make sure that CPU priority is set to "high": Task Manager > Details > right-click the .exe and set CPU priority to "high". I installed "Prio (Process Priority Saver)" to save my priority settings so I only need to do this once per game. It can help with stutter.
Thanks! Again... no clue that even existed. :rolleyes: :( (y)
 

·
Registered
Joined
·
268 Posts
Discussion Starter · #19 ·
The judder at the start of the benchmark, and in many other games, happens because your CPU is trying to run the game AND load assets at the same time (you see the same thing in the Batman: Arkham Knight benchmark, for example).

Also, for AC: Odyssey and any other CPU-intensive games, make sure that CPU priority is set to "high": Task Manager > Details > right-click the .exe and set CPU priority to "high". I installed "Prio (Process Priority Saver)" to save my priority settings so I only need to do this once per game. It can help with stutter.

I used to play with 3D Vision for a while, then found that I much prefer playing games at ~100 fps or higher. I also noticed that 3D Vision 2 kits were selling for >£275 on eBay, and sold mine (back when I had a 1080Ti and the drivers stopped supporting 3D Vision). Are you forced to use an old driver to get 3D Vision to work on your card?
I've got a "Set Priority" option... is that the one? I'm not seeing a "Set CPU Priority" option.
 

·
Registered
Joined
·
612 Posts
I've got a "Set Priority" option... is that the one? I'm not seeing a "Set CPU Priority" option.
That should be it. The label might have changed after installing "Prio" (I also have a "save priority" option, which you won't have unless you've installed Prio).
 