Hey everyone. I compared the performance of manually overclocked and tuned DDR4 vs DDR5 to see if the $1180 CAD cost difference and the 32GB reduction in memory capacity were justified. I got somewhat lucky with my CPU and could OC my Crucial 2x32GB 3200 C16 Micron Rev.B kit to 4400 C16 1T Gear 1 stable. First and foremost, this comparison is for my personal use case, to evaluate whether going with DDR5 is worth it; I just thought I'd share my results with everyone else. I used CapFrameX to capture the data in-game and compare results, except for MW2, where I used the built-in benchmark. Also note these results are all at 1440p, as I don't have a 4K monitor.
To start, here are the system specs. I used the exact same CPU, GPU, NVMe drive, and power supply; the only differences are the motherboard and memory kit.
The CPU was cooled with a Heatkiller IV block on a MO-RA radiator, and the GPU is currently on air. Both were left stock to normalize everything except the memory. Click the links in the RAM descriptions to see screenshots of the stable profiles.
General Specs
- CPU: 13900K, completely stock with power limits disabled
- GPU: RTX 4090 FE, stock
- NVMe drive: WD SN850 2TB
- PSU: Corsair HX1200
- OS: Windows 11 Pro 22H2, fresh install with respective drivers installed
- GPU driver: Nvidia 527.56
- Motherboard (DDR4): MSI Z790 Edge WiFi DDR4
- Motherboard (DDR5): ASUS Z790 Apex
- RAM (DDR4): Crucial 2x32GB 3200 C16 Micron Rev.B, OC'd to 4400 C16 1T Gear 1 stable
- RAM (DDR5): G.Skill 2x16GB 7200 C34 Hynix A-die, OC'd to 8000 C36 stable
Modern Warfare 2 (2022) on High/Ultra settings without DLSS or upscaling
8000 c36 DDR5 results

4400 c16 DDR4 results

No real difference here; the gap is within error and run-to-run variance.
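For anyone wondering how I judge whether a gap is "within run-to-run variance": a simple rule of thumb is to compare the gap between the two setups' averages against the spread of repeated runs on the same setup. A minimal sketch (the FPS numbers below are made-up placeholders, not my actual results):

```python
# Sketch: decide whether a DDR4-vs-DDR5 gap exceeds run-to-run noise.
# The run values are hypothetical placeholders for illustration only.
from statistics import mean, stdev

ddr4_runs = [142.1, 143.5, 141.8]   # avg FPS per repeated run
ddr5_runs = [143.0, 142.2, 144.1]

# Gap between the two setups' mean FPS.
gap = abs(mean(ddr4_runs) - mean(ddr5_runs))

# Run-to-run noise: the larger of the two sample standard deviations.
noise = max(stdev(ddr4_runs), stdev(ddr5_runs))

# Crude rule of thumb: a gap smaller than the noise is not meaningful.
within_variance = gap <= noise
```

With these placeholder numbers the gap (~0.6 FPS) is smaller than the run-to-run spread (~1 FPS), so the difference would not be meaningful.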
Cyberpunk 2077 on Ultra with RT, DLSS on Quality, Crowd density set to High
Results from the built-in benchmark:
8000 c36 DDR5

4400 c16 DDR4

Results from actual gameplay testing, in which I performed the exact same run on both setups:
The Witcher 3 v1.32 - Maxed out
I took these readings on my old 5900X and 3080 system for comparison before the RT update. Since I was evaluating these for personal use, I stuck with version 1.32 for comparative purposes.
F1 22 - Bahrain Dry in-game benchmark, everything maxed out with the relevant RT options enabled
Fallout 4 - Life in the Ruins modlist from Wabbajack (heavily modded)
SkyrimSE - Elysium Remastered modlist from Wabbajack (also very heavily modded)
I find the results interesting in that there's a negligible performance difference between my DDR4 and DDR5 setups. Surprisingly, the 0.1% lows are consistently much better on the DDR4 setup than on the DDR5 setup across various games. I honestly have no idea why that is.
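For anyone unfamiliar with the 0.1% low metric: tools like CapFrameX derive it from the captured frametimes, typically by averaging the slowest 0.1% of frames. A minimal sketch of that idea (one common convention, not necessarily CapFrameX's exact formula; the frametime list is a hypothetical example):

```python
# Sketch: derive average FPS and a "0.1% low" FPS from frametimes
# in milliseconds. The 0.1% low here averages the slowest 0.1% of
# frames, which is one common definition of the metric.

def fps_metrics(frametimes_ms):
    n = len(frametimes_ms)
    # Average FPS: total frames over total elapsed time.
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    # Take the slowest 0.1% of frames (at least one frame).
    worst = sorted(frametimes_ms, reverse=True)
    k = max(1, n // 1000)
    slowest = worst[:k]
    low_fps = 1000.0 * len(slowest) / sum(slowest)
    return avg_fps, low_fps

# Hypothetical capture: mostly ~7 ms frames (~143 FPS) with a few
# 25 ms stutter frames. The stutters barely move the average but
# dominate the 0.1% low.
times = [7.0] * 997 + [25.0] * 3
avg, low = fps_metrics(times)
```

This is why the 0.1% lows can diverge between setups even when the averages are nearly identical: a handful of stutter frames drags the lows down while leaving the average almost untouched.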
I wish I had more time to bench other games, but I only looked at the ones relevant to me that I would actually be playing. The DDR5 setup absolutely did not justify its hefty cost premium and the 32GB decrease in memory capacity. I think there's a case to be made that if you have good DDR4 and can manually OC it, you should stick with that instead of early-adopting DDR5, unless you enjoy tinkering with the latest hardware. I will personally stay on my current DDR4 setup for the next 2-3 years, until better, cheaper, and more mature DDR5 is out, along with better boards and memory controllers. Thank you for reading.