Let's put it this way: what other choice do you have today?

Really it's the 6800 XT or the 3080.

Though with recent rumours, you may want to wait anyway, since the 3080 Ti is around the corner. Estimated "February" launch, that is, if you can even find one...

None of these cards will guarantee 100 FPS in all games at your resolution, but these are your only options today. Ignore the 3090 / 6900 XT unless you have money to blow.
 
Real-world gaming, not YouTube, dufus:

A 2080 Ti at 1440p was barely enough to guarantee 60 FPS minimums. The 3080 is the best GPU for 1440p at the moment, generally 60-120 FPS.

The 3090 is 20-30% faster, and that's at 480W with 1950-1980 MHz minimum clocks.

So yeah, you are gonna need a 3090.

I have both; the 3080 at 2160/21500 is too slow.
Depends on the games played. I upgraded from a 1440p monitor to 4K since the Titan RTX easily handles Destiny 2, Doom Eternal, Wolfenstein: Youngblood, Death Stranding, Shadow of the Tomb Raider, and Call of Duty: Black Ops Cold War at 4K max settings without RT/DLSS. Even with RT/DLSS enabled, the Titan RTX can handle SOTTR, Wolfenstein, and COD at 60+ FPS maxed. If a Titan RTX can handle these titles, an RTX 3080 obviously can.

Then we have the compute-heavy games that the RTX 20/30 series can't run above 60 FPS at 4K native with max settings: open-world titles, and Control, which even without DXR enabled still uses a form of ray tracing called voxel cone tracing, along with heavy use of SSR.

If I had an RTX 3090, I would just end up upscaling these games like I was doing with my old 1440p monitor.
 
That's ridiculous. A 3080 is more than enough for a less-than-4K resolution. You won't get super high FPS with ray tracing enabled, but that's also true for a 3090. For all practical purposes, a 3080 or even a 3070 would be fine.
Listen to UltraMega here. They are right on.

5120x1440 (UW) = ~7.4 MP
3840x2160 (4K) = ~8.3 MP

That puts your resolution ~11% lower than 4K. However, with render variation between resolutions, that percentage won't directly correlate to the performance difference.
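Quick sanity check on that math, as a minimal Python sketch (nothing assumed beyond the two resolutions):

uw = 5120 * 1440   # super ultrawide: 7,372,800 px (~7.4 MP)
uhd = 3840 * 2160  # 4K UHD: 8,294,400 px (~8.3 MP)
print(f"UW renders {100 * (1 - uw / uhd):.1f}% fewer pixels than 4K")  # ~11.1%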

Look at the performance of the 3080 at 4K in the games you want to play; that will give you an idea for the resolution your new monitor will be pumping.

Donny D
 
3080 owner here, using it at 4K120.

It IS a 4K GPU, but it is NOT a 4K max-everything GPU. Case in point: Borderlands 3 at 4K max settings will struggle to maintain 60 FPS; drop a few settings, however, and it can do it easily. Cyberpunk 2077, arguably the most GPU-punishing game on the market, is only set up for 30 FPS on Nvidia's "max" settings in GFE... and that's with DLSS enabled. No GPU configuration currently available can run Cyberpunk at 4K maxed without DLSS; it would be unbearably painful. However, I used Digital Foundry's optimized settings at 4K, and the game runs like a dream.

Because you are using super ultrawide 1440p, I'd say you're probably going to be in the same boat. However, even a 3090 would be in utter agony in the scenarios listed above, so to everyone claiming that a 3080 Ti with 20GB will magically fix the 3080 10GB's issues: it won't. No chance. That said, if you're planning on keeping the GPU for 4+ years, the 10GB of VRAM might become an issue, and I would recommend a 3090 or the (soon-to-be) 3080 Ti.

If you can temper your expectations to high/very-high settings at super ultrawide 1440p, then I'd say go for it. The 3080 is a beastly GPU, and 10GB of VRAM is more than enough for that.
 
A 3080 has absolutely no chance of hitting 100 FPS at very high or ultra settings in all games at 5120x1440, per the OP's question. Also, the OP has Titans in SLI, so I'm leaning toward recommending a card that is SLI-compatible in case he happens upon a second card sometime in the future. Perhaps a 3080 is fine for "all practical purposes", but that's not the question.
SLI is dead. Only the 3090 has the connectors for it anymore, so buying a card expecting any multi-GPU support in the future is a bad idea and a huge waste of time.
 
Currently I have 2 Titan X Pascals in SLI. Would it be worth upgrading to a 3080, and would I get over 100 FPS in all games on very high or ultra settings? And when will they start being in stock again?
I doubt it. If you're going to blow cash on a new card, make the smart choice and wait until the Ti variants come out. For the past, what, seven years, everyone jumps on the initial cards and then bitches when Nvidia drops the Tis later, as if it's surprising.

The Tis will likely still have issues meeting your requirements but it will be the wiser purchase in the long term. Concerns about availability are valid though, so you'll have to remain vigilant so you can get one before they sell out, perhaps even pre-order if you find a place that does that.
 
SLI is dead. Only the 3090 has the connectors for it anymore, so buying a card expecting any multi-GPU support in the future is a bad idea and a huge waste of time.
This argument is like flies on a turd whenever someone mentions SLI, or better yet, NVLink. Not that I watch turds a lot, but... here goes.

SLI, or its improved form NVLink, is dead because people keep buying Nvidia's hugely overpriced cards. SLI ≠ NVLink. NVLink is a great technology that works flawlessly on Quadros; Nvidia butchered that interface on RTX cards, the gamer market, because they make more profit on overpriced top-end single-card sales. Looking at the numbers (Nvidia is a public company; the numbers are readily available): each 3090 nets ~$600 profit per card sold, the 3080 ~$200, and the 3060 ~$60. If performance gains for customers were Nvidia's driving factor, they would have implemented NVLink to run badass on all RTX cards, and still made a profit, albeit a bit less.

Something to think about.

FWIW I'm selling my 2070s and leaving Nvidia based on these numbers.

As for the OP's question: NO, a 3080 won't kick ass at 100 FPS on ultra with that monitor. You're gonna have to sell a kidney and buy the extremely overpriced 3090 for that.
 
This argument is like flies on a turd whenever someone mentions SLI, or better yet, NVLink. [...] You're gonna have to sell a kidney and buy the extremely overpriced 3090 for that.
SLI is dead because Nvidia has stopped supporting it.

 
  • Rep+
Reactions: masterdev
each 3090 nets ~$600 profit per card sold, the 3080 ~$200, and the 3060 ~$60.
Nvidia has been making good use of those profits. Nobody realizes how far these boards have come within their various price points. I don't mind spending $2K on a card to support their R&D. After all, first-class tickets are what keep coach prices down. You're welcome.

$500 Nvidia in 2004:
[attached image]

$500 Nvidia in 2020:
[attached image]
And if you adjust for inflation, that would be a picture of a 3080, not a 3070.
 
This argument is like flies on a turd whenever someone mentions SLI, or better yet, NVLink. [...] You're gonna have to sell a kidney and buy the extremely overpriced 3090 for that.
SLI (no longer supported, like CrossFire) hasn't changed, just the bandwidth, due to the switch to the NVLink interconnect. Nvidia removed NVLink from their other consumer cards because professional applications like V-Ray benefit from it. Clearly the RTX 3090's price is due to professional use; look at the number of blower cards with front power connectors. You can complain about the price, but professionals can now buy a CUDA-enabled GPU with 24GB of memory for $1,500, while AMD completely dropped ROCm support and increased the price of their GPUs.

Implicit multi-GPU solutions like CrossFire/SLI died because developers wanted more access to the GPUs, which they now have with DX12/Vulkan.
 
I knew that post would trigger some fanboys. I'm not interested in bickering back and forth on this; that's a waste of time, and nobody can post a rebuttal to my post to prove it's not true. So carry on by yourselves if you wish, ladies.
 
I knew that post would trigger some fanboys. I'm not interested in bickering back and forth on this; that's a waste of time, and nobody can post a rebuttal to my post to prove it's not true. So carry on by yourselves if you wish, ladies.
You started your post calling people turds and then complained about card prices. I completely rebutted your nonsense by teaching you about the garbage cards we dealt with back in the day, and how you can now get a 3080 for the same price we paid in 2004 for a green PCB with a blower made out of apple cores and newspaper.
 
I knew that post would trigger some fanboys. I'm not interested in bickering back and forth on this; that's a waste of time, and nobody can post a rebuttal to my post to prove it's not true. So carry on by yourselves if you wish, ladies.
First of all, your post doesn't pass the smell test. Nvidia is a publicly owned company, so if they sell more product, they make more money. Period. Second, the reason Nvidia stopped supporting SLI is that DirectX 12 gives developers close-to-metal control over how a GPU runs code, which means developers can implement their own multi-GPU support at the hardware level instead of having to rely on Nvidia's middleware-like approach. Third, the problem with SLI is simple: two cards alternating frame renders causes lag, and Nvidia has been making a big push recently to reduce lag and input latency. Thus, SLI runs counter to their current model for performance.

Therefore, it is in Nvidia's best interest moving forward to concentrate on raw performance per core as opposed to multi-GPU performance. I had two SLI setups before this (2x780 and 2x1070), and in over half of the situations I used them for, frame pacing was terrible. I moved away from multi-GPU and never looked back.

Any other questions, or are you going to continue with your baseless insults towards members of this community?
 
I have that monitor and an EVGA 3090 FTW3 Ultra. In COD with everything maxed and DLSS on Quality, I get 80-90 FPS. With DLSS off and RT maxed, it drops to 50-60 FPS. Doom Eternal is well over 100. Cyberpunk 2077 is pretty low; I can't remember exactly, as I was very disappointed in that game and didn't play it for long.

No. Going by what I've seen with a 3090, a 3080 will not be enough to drive that monitor at 100+ FPS with high/ultra settings. A 3090 can barely do it in most AAA games.

When discussing gaming, there really isn't much point in even mentioning NVLink. What new games support it? I've had two SLI rigs before and I loved them: when they worked, it was a decent boost to FPS, and they just looked amazing. I don't even think the new COD supports it, and COD was one of the AAA games that usually supported SLI. I take that as a pretty good sign that gaming on NVLink is dead in the water. Believe me, I wish it weren't.
 
For reference...
I have a stock 3080 TUF OC, air-cooled, and play BFV with all settings on High except mesh, which is Ultra. I get ~130 FPS, which I cap at 120 because I run a 3440x1440 120Hz monitor. I tried Ultra, but the frame rate was dipping below 110 and gameplay wasn't as smooth. Using DX11.
 
I doubt it will be enough for all AAA titles (I have a 3090 FE paired with 2560x1440, running AC Valhalla on all ultra), but not every game even supports the double-wide 1440p resolution.
 
That's ridiculous. A 3080 is more than enough for a less-than-4K resolution. You won't get super high FPS with ray tracing enabled, but that's also true for a 3090. For all practical purposes, a 3080 or even a 3070 would be fine.
This is soooo not true! The bandwidth is definitely more than a 4K@60 monitor's; it's a fact. This monitor is almost as demanding as a 4K@144 monitor.
How is 5120x1440 @ 100Hz 10-bit less than 4K bandwidth? You're not even taking into account that this is a 10-bit monitor.

The 3080 would be the perfect card for the OP's monitor for under 90 FPS in AAA games.
I would consider the RTX 3080 the minimum. :D
I would get the 3090 or wait for the 3080 Ti.

The Samsung CRG9 can run at 100Hz at 10-bit, or at 120Hz at 10-bit but with chroma subsampling (4:2:2), which is not an option if quality is mandatory.
So (the arithmetic behind these figures is sketched in code below):
4K@60 8-bit: 14.93 Gbps
4K@60 10-bit: 17.92 Gbps (do you get the feeling now?)
4K@60 12-bit: 20.9 Gbps (the new ASUS Dolby Vision monitors: the ProArt PQ22UC and ProArt PA32UCX)
4K@120 8-bit: 29.86 Gbps
4K@144 8-bit: 35.83 Gbps

NOW CRG9!
5120x1440@100 10-bit: 26.54 Gbps (max refresh at 10-bit 4:4:4)
5120x1440@120 10-bit: 31.85 Gbps
5120x1440@120 8-bit: 26.54 Gbps
5120x1440@144 8-bit: 31.85 Gbps (OCed)
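For anyone who wants to reproduce these figures, here's a rough Python sketch. It assumes uncompressed RGB with DisplayPort's 8b/10b line encoding (10 link bits per 8 data bits) and ignores blanking intervals, so treat it as a lower bound; it matches the 8-bit rows above, while the 10-bit rows look like they were computed with a slightly smaller overhead factor.

def display_rate_gbps(width, height, refresh_hz, bits_per_channel):
    # Uncompressed RGB payload plus DisplayPort 8b/10b encoding overhead (x1.25).
    # Blanking intervals are ignored, so the real link requirement is a bit higher.
    bits_per_pixel = 3 * bits_per_channel  # one sample each for R, G, B
    return width * height * refresh_hz * bits_per_pixel * 1.25 / 1e9

print(display_rate_gbps(3840, 2160, 60, 8))    # ~14.93 (4K@60 8-bit)
print(display_rate_gbps(3840, 2160, 144, 8))   # ~35.83 (4K@144 8-bit)
print(display_rate_gbps(5120, 1440, 120, 8))   # ~26.54 (CRG9 @120 8-bit)
print(display_rate_gbps(5120, 1440, 100, 10))  # ~27.65 (CRG9 @100 10-bit)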

For that resolution you DO NEED at least 32GB of RAM, so the system won't start swapping to your SSD in AAA games, and to avoid any stutter.

The RTX 3080 is a beast, but not for such a monstrous monitor!
It would max out your monitor's FPS in older titles, but it may not be enough for AAA games, and it MAY NOT hold up for upcoming titles over the next few years.
I doubt that even a 3090 can max that monitor out.
Why don't you buy the Radeon RX 6900 XT instead?

A couple of links to a comparable setup (4K@144 on an RTX 3080) with similar bandwidth:
 