Introduction Borderlands 3 is finally here, and with it come questions about what graphics card you'll need to run it, so we've run a bunch of graphics cards through three resolutions: 1080p, 1440p and, of course, 4K.
Gearbox Software upgraded to Unreal Engine 4 with Borderlands 3, and while the game does support DX12, that mode has issues, so our benchmarks exclusively use DX11. The game looks great for its art style, and it's genuinely impressive with everything cranked up to maximum.
There's a built-in benchmark, which makes things much easier on us benchmarkers, so all of the results here were captured using Borderlands 3's built-in benchmark.
AMD has its own special sauce inside of Borderlands 3 with optimizations for Radeon graphics cards, but those apply mostly on the DX12 side of things, so under DX11 neither vendor gets a special performance advantage.
I've been using DX12 even though it takes a bit of time to load. I get around 70-80 FPS in game at 1440p with my card at 1800/1200MHz, since core overclocking is still broken on the 19.9.2 driver. I'm enjoying the game so far and just finished beating it again in True Vault Hunter mode.
How does a developer benefit from selling more GPUs?
They get a check from Nvidia or something. I'm not saying it's real, but it is odd when a game comes out with basic graphics and manages to hit almost exactly 60 FPS on a 2080 Ti maxed out. Just seems like an odd coincidence.
i7-7700K @ 4.2GHz
16GB DDR4 3200MHz
GeForce GTX 1080 Ti