Game Benchmarks on the 3840x1600 Widescreen Monitor
Today I am going to share benchmarks of eleven different games across two different monitor resolutions. The first is the quite popular 27” 2560x1440, like the Acer Predator XB271HU I’ll be using here. It has the “overclock” feature turned on so that it can get up to 165 Hz.
The second resolution being tested is from my recently purchased LG UltraGear 38GL950G-B 38" Curved 144 Hz G-Sync IPS Gaming Monitor. This monitor will overclock up to 175 Hz, but then you have to use chroma subsampling, so I set it to 160 Hz, where I can use 8-bit color in RGB. If you drop it down to 120 Hz you can use 10-bit color.
I love testing and benchmarking new hardware, and there are not many game benchmarks published in this 3840x1600 widescreen format. Most reviews only show 1080p, 1440p and of course 4k resolutions, so this will be interesting.
This 38” 1600p widescreen has 67% more pixels than the 27” 1440p monitor: 6,144,000 vs. 3,686,400 pixels.
A 4k display has 35% more pixels than the 38” 1600p widescreen: 8,294,400 vs. 6,144,000 pixels.
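If you want to double-check the pixel comparisons above yourself, here is a quick Python sketch of the math (the dictionary labels are just shorthand for the three monitors):

```python
# Sanity check of the pixel-count comparisons above.
resolutions = {
    "1440p": (2560, 1440),        # the 27" monitor
    "widescreen": (3840, 1600),   # the 38" monitor
    "4k": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(pixels)  # {'1440p': 3686400, 'widescreen': 6144000, '4k': 8294400}

# Percentage more pixels, rounded the same way as in the text.
print(f"{pixels['widescreen'] / pixels['1440p'] - 1:.0%}")  # 67%
print(f"{pixels['4k'] / pixels['widescreen'] - 1:.0%}")     # 35%
```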
The 3840x1600 widescreen is not a very popular resolution yet, but it is a HUGE monitor that I believe is the perfect “gaming” size, and pixel-wise it slots in nicely between a standard 1440p monitor and a 4k monitor. Screen real estate is more important than more pixels, in my opinion.
I ran a suite of ten game benchmarks on my 27” 1440p monitor back in this post: Game Benchmarks. For this round of testing I am dropping the two oldest titles, Bioshock Infinite and Hitman Absolution. Those two games aren’t nearly as demanding as most of the newer titles, and besides, I’ve uninstalled them to make room for new ones.
Now I’m adding three new games that I have not benchmarked previously: The Division 2, Wolfenstein: Youngblood, and Red Dead Redemption 2, which makes a total of 11 games to look at here. I’m not going to show a screenshot of every setting, just the Ultra settings on the widescreen with a chart of the results.
Only three of these games do not support SLI or NVLink, and I’ll benchmark those three games first. I have disabled SLI in the Nvidia Control Panel, as I found previously that I get significantly better performance, at least in the Assassin’s Creed games, with SLI disabled.
For these tests I’m using my 24/7 overclock of 4.8 GHz on the 7900x CPU, and my normal “game overclock” on the video cards which is +100 on the core clock and +1040 on the memory of the two 2080Ti video cards. During all of this testing with the ambient temperatures in the upper 60s F, the max GPU temperature was 40°C and the max CPU core temperature was 65°C.
I am also using the latest Nvidia drivers, and I ran each benchmark test at least twice for each setting. Almost all of the runs were very close, and I used the best result. I start with the settings that GeForce Experience recommends, which are listed as GFE in the charts below, and I’ll use GFE from this point forward when referring to the GeForce Experience software.
In most games the GFE puts the settings nearly maxed out. I’ll show the “Optimal” settings that GFE is recommending, and the red boxes in the GFE screen shots will show what changes if you move the slider all the way to the “Quality” side.
Then I run the benchmarks using the in-game Ultra, High, and Medium settings, or whatever each game calls its settings.
When I am playing games I rarely use the Ultra setting because I can’t really tell any difference between High and Ultra most of the time. Of course, I’m not usually standing there admiring the scenery or textures; I’m running around trying to kick some butt and not get killed.
I can see a difference at Medium settings, and I never play on Medium, but I included it in these benchmarks to see how much difference it makes FPS-wise.
Assassin’s Creed Origins – October 2017
The GFE maxes the settings out on this game and the GFE scores were the same as the Ultra High setting scores. This benchmark does not list minimum and maximum FPS but does have a Score so I included that in the chart. This benchmark runs just over two minutes in length.
The “WS” in all these charts stands for WideScreen with the 1440p resolution to the right with the Average Frames per Second.
Assassin’s Creed Odyssey – October 2018
This is the most demanding game of all the games tested here, with only a 57 FPS average at Ultra settings. If you drag the slider all the way to “Quality”, you can see it changes the Ambient Occlusion from Medium to High, the Environment Texture Detail from High to Ultra High, and the Resolution Modifier from 100% to 200%. The Resolution Modifier is just an FPS killer, and I would never use it personally.
I would use the High setting here with a very playable 76 FPS average.
Wolfenstein: Youngblood - July 2019
One reason I wanted to benchmark Wolfenstein: Youngblood is because it uses the new, updated DLSS. Here is an interesting article on the subject where they call it DLSS 2.0: Nvidia DLSS in 2020: Stunning Results
Here is a quote from that article:
“DLSS now works with all RTX GPUs, at all resolutions and quality settings, and delivers effectively native image quality when upscaling, while actually rendering at a lower resolution. It's mind blowing. It's also exactly what Nvidia promised at launch. We're just glad we’re finally getting to see that now.”
You can see that the “Optimal” setting in GFE has the Anti-aliasing controlled by DLSS. The in-game default setting for DLSS is “Quality”, and Ray Tracing is also on by default, so those are the settings I use here.
I wasn’t able to capture a screen shot of the completed benchmark for some reason, so there is just a chart here showing both of the two different built-in benchmarks. The second benchmark results do not have much variance between the different settings, which makes the second benchmark fairly meaningless.
Also, this game has two settings above the standard Ultra, which are Über and Mein Leben! Interestingly, using the GFE turns DLSS off, and the FPS drops significantly at the 1440p resolution.
With five different levels of settings the difference between the highest setting and the lowest setting is only seven FPS! I suspect that is related to the DLSS somehow. I haven’t played this game yet, but I know I’ll be getting plenty of FPS even on this big 3840 x 1600 monitor no matter what setting I choose.
8 Games that support SLI
The rest of the games all support SLI so I’ve turned the SLI on again. I start with the oldest of the games remaining and move to the newer ones. I have rounded off all the FPS scores in the charts to the nearest whole number.
Middle-earth: Shadow of Mordor - September 2014
GFE maxes this game out, and it gets an average of 214 FPS. The first time I ran this I had forgotten to turn SLI back on, and it only got 120 FPS, which is still a lot, but that shows excellent SLI scaling in this game. This game gets a 247 FPS average at the 1440p resolution, and with that kind of frame rate there is no reason to even benchmark anything less than the highest settings.
Rise of the Tomb Raider - November 2015
The only difference in the GFE settings from “Optimal” to max “Quality” is the Anti-aliasing.
This benchmark shows the average, minimum, and maximum FPS in three different sections of the game so I’m just going to chart the overall score in FPS rounded off. This benchmark runs about 1 ½ minutes.
Notice only five FPS difference between the two screen resolutions at the “High” setting.
Tom Clancy's Ghost Recon Wildlands - March 2017
For this game the only thing GFE would change going fully to the “Quality” side of the slider is the Long Range Shadows from On to Ultra and a higher Resolution Scaling.
This is the second most demanding game in this suite of benchmarks, getting only 66 FPS in Ultra. However, moving down to Very High bumps the average up to 94 FPS, a 28 FPS increase!
Just six FPS difference at the Very High setting between the two resolutions. This benchmark runs about a minute and ten seconds.
Far Cry 5 - March 2018
GFE maxes everything except the Resolution Scaling. This game calls the Medium setting “Normal”. The benchmark runs right about one minute in length.
No matter the setting you choose getting over 100 FPS on average is awesome!
Shadow of the Tomb Raider - September 2018
Looking at the GFE there are four settings that can be increased by moving the slider all the way to “Quality”.
With the “eye candy” turned on at the “Highest” in-game setting I’m getting 137 FPS! That is only seven FPS less than the 1440p monitor gets.
This game has DLSS which tested out really well in my previous testing at 1440p, but if you turn it on with the widescreen monitor it takes away the 3840 x 1600 resolution option and only gives you a 2560 x 1600 option, so I won’t be testing DLSS here.
There are plenty of FPS here no matter what setting you choose, as there is not much difference between the “Highest” setting and the “Normal” setting. This benchmark runs three minutes long.
Far Cry New Dawn - February 2019
As in Far Cry 5 this game also sees the GFE maxing everything out except the Resolution Scaling. This benchmark runs about one minute in length.
Once again the “High” setting would be my choice of settings here.
The Division 2 – March 2019
Most all of the “SLI compatible” games will have a chart like this one. This is the MSI Afterburner monitor showing each card in a separate column. From the top, the rows are GPU temperature, GPU usage, GPU power, and GPU core clock, with the framerate in the lower right. This one is from Far Cry New Dawn. See how in both columns the pattern is almost exactly the same; this shows three benchmark runs back to back.
For the Division 2, I’ll call it “Partially SLI compatible” because it only uses 20-30% of the second video card as seen in this Afterburner chart. I did run these benchmarks with SLI disabled and the scores were quite a bit lower, so the second card definitely helps but does not scale as well as the rest of the games here.
If you maxed out the GFE slider here it changes the Ambient Occlusion from High to Very High, Extra Streaming Distance from 8 to 10, and the Object Detail setting from 80 to 100.
This is the third most taxing game here as far as FPS go with 71 FPS in Ultra with a nice bump up to 97 FPS by merely dropping down to the High setting.
This benchmark runs about one minute 40 seconds, and this game does not have minimum or maximum FPS in the benchmark, but does have a score.
Red Dead Redemption 2 - December 2019
The built in benchmark in this game is the longest of all of these games tested going four minutes in length. That doesn’t sound like much, but if you are testing out many different settings several times each it takes much longer than the games with shorter benchmarks.
This game also has more settings that you can tweak than any game I’ve ever seen! Plus, the game does not have “global” Ultra, High, and Medium settings. In most games, changing from Ultra to High changes numerous video settings at once.
Here are the default settings; you have to go down the line and turn each setting down from Ultra individually.
The “Optimal” settings in GFE are lower on the slider than for any other game tested here, and sliding it all the way to the “Quality” side changes nearly every setting, and there are twice this many settings if I scrolled down.
This is one game that makes using GFE great, because you don’t have to figure out which settings to change; you just drag the slider up or down to get more “Quality” or more FPS.
The “End of Benchmark” screen isn’t very exciting, but you can see that even at Ultra settings I’m getting nearly 105 FPS on the widescreen.
I haven’t even played this game yet but in Steam it says I’ve played for five hours, and that is just from running benchmarks. The benchmark looks so awesome that I’ve decided that this will be the next game I play.
As usual, this ended up being way more work to put together than I thought; I spent weeks benchmarking and compiling all the data. I was originally planning to do all the benchmarks on the widescreen and then just use the previous data I had from benchmarking the games at 1440p.
But I decided to redo all of those benchmarks previously done on the 1440p monitor because the video drivers have been updated, and this way I could use all the same in-game settings for a better comparison between the two resolutions.
I must admit that after running all the benchmarks on the widescreen and then going back to the 27” monitor to retest all the benchmarks, that 27” monitor seems dinky compared to the widescreen. Also the additional height of the 38” widescreen makes a HUGE difference!
The widescreen is so much more immersive in games, and it serves as two monitors for general productivity use. I have two windows open side by side almost always.
Here is one final chart with all eleven games, showing both Ultra (or whatever the highest setting is called) and High (or the second highest). The average FPS of the widescreen 3840x1600 monitor is in the left two columns, and the 2560x1440 monitor’s Ultra and High results are in the right two columns.
In conclusion, using the GFE sometimes gets more FPS than Ultra settings, and sometimes less. It is certainly not necessary to use, but it can come in handy in games with many different adjustable settings, like Red Dead Redemption 2.
Otherwise, unless you are playing a game with crazy high FPS like Shadow of Mordor, it is best to skip the Ultra setting and use the in-game High settings to get more FPS. I have found that if you can keep your average FPS at least in the 80 to 90 range, you should have a smooth gaming experience with G-Sync, and anything over 100 FPS is totally awesome!
Hope you enjoyed this!