
1 - 20 of 66 Posts

Registered · 214 Posts · Discussion Starter #1
i7 5820K
2x Asus Strix 1080 SLI
64GB Crucial DDR4-2400 DIMM
Asus Rampage V Extreme ATX (LGA 2011-v3)
Crucial M500 960GB SSD
Corsair AX1200i Digital ATX PSU

Not only will it work, but would selling my 1080s and buying one 2080 Ti be a big enough jump to justify it? Also, would I get the full power of it, or is my other hardware too old to bring it out? Like, is my CPU, mobo, etc. too old for this?
 

Registered · 142 Posts
i7 5820K
2x Asus Strix 1080 SLI
64GB Crucial DDR4-2400 DIMM
Asus Rampage V Extreme ATX (LGA 2011-v3)
Crucial M500 960GB SSD
Corsair AX1200i Digital ATX PSU

Not only will it work, but would selling my 1080s and buying one 2080 Ti be a big enough jump to justify it? Also, would I get the full power of it, or is my other hardware too old to bring it out? Like, is my CPU, mobo, etc. too old for this?
64GB of CPU/GPU-choking 2400MHz DDR4 RAM.
 

Avid Memer · 5,953 Posts
Of course it would work. I'm not sure why you think it wouldn't. As for whether or not the performance leap will be enough to justify the purchase, we can't know yet. I'm leaning toward thinking it will, especially if you will be playing games that leverage the RT and Tensor cores. For games that don't leverage ray tracing, it might still be enough of a performance jump. We won't know until we see reviews.
 

Da Boss · 1,894 Posts
i7 5820K
2x Asus Strix 1080 SLI
64GB Crucial DDR4-2400 DIMM
Asus Rampage V Extreme ATX (LGA 2011-v3)
Crucial M500 960GB SSD
Corsair AX1200i Digital ATX PSU

Not only will it work, but would selling my 1080s and buying one 2080 Ti be a big enough jump to justify it? Also, would I get the full power of it, or is my other hardware too old to bring it out? Like, is my CPU, mobo, etc. too old for this?
I'm going to be blunt: this is a silly question to ask when we have no idea what the performance of the 2080 Ti is going to be.
 

Registered · 214 Posts · Discussion Starter #5
64GB of CPU/GPU-choking 2400MHz DDR4 RAM.
What does this mean? Are you saying my RAM isn't fast enough for it?

Of course it would work. I'm not sure why you think it wouldn't. As for whether or not the performance leap will be enough to justify the purchase, we can't know yet. I'm leaning toward thinking it will, especially if you will be playing games that leverage the RT and Tensor cores. For games that don't leverage ray tracing, it might still be enough of a performance jump. We won't know until we see reviews.
I ask because I haven't been keeping up with CPUs and mobos since I got my machine, which was already four years ago... I know the new CPUs won't work on my mobo, so I wondered if these new cards might have different slots or only work with new mobos. I know my PSU is overkill, but who knows, maybe it wouldn't be enough. In the past I have bought video cards for hardware that was too old, and due to bottlenecking I wasn't really getting the full benefit. It would have been better to just build a new rig.

Have CPUs and mobos gotten any faster, or are they pretty much the same?

Also, I know reviews aren't in yet, but have you guys seen the numbers? Supposedly, on paper, this is a huge jump... but could that possibly be all marketing BS?
 

Registered · 142 Posts
What does this mean? Are you saying my RAM isn't fast enough for it?



The higher-end the GPU, the more it matters. This is with a lowly 1080. Yes, some of these are at 2133MHz, but you'll get the point.

Assassin's Creed Origins RAM 2133 MHz vs. 3000 MHz

The Witcher 3 RAM 2133MHz CL12 vs. 3000MHz CL16

Far Cry 5 2133MHz vs. 3000MHz (RAM Speed Comparison)

The Elder Scrolls Online 2133MHz vs. 3000MHz (RAM Speed Comparison)

8GB RAM 3000MHz vs. 16GB RAM 2133MHz (Test in 7 Games)

Ghost Recon Wildlands RAM 2133 MHz vs. 2400 MHz vs. 2666 MHz vs. 3000 MHz

Rise of the Tomb Raider RAM 2133 MHz vs. 2400 MHz vs. 2666 MHz vs. 3000 MHz
 

Registered · 214 Posts · Discussion Starter #7
The higher-end the GPU, the more it matters. This is with a lowly 1080. Yes, some of these are at 2133MHz, but you'll get the point.

Well, I guess that explains why I get poor AC Origins performance... I mean, it's only a 10-frame difference, but this is really why I am asking this question. If I am not even getting the full benefit of my 1080s, then I definitely won't with the 2080s once you factor in my four-year-old CPU and mobo, right? So would it be smarter to just wait a year or two and build a new system?
 

Registered · 142 Posts
The higher-end the GPU, the more it matters. This is with a lowly 1080. Yes, some of these are at 2133MHz, but you'll get the point.

Well, I guess that explains why I get poor AC Origins performance... I mean, it's only a 10-frame difference, but this is really why I am asking this question. If I am not even getting the full benefit of my 1080s, then I definitely won't with the 2080s once you factor in my four-year-old CPU and mobo, right? So would it be smarter to just wait a year or two and build a new system?

AC Origins is more than 10 frames; it's 20 or higher in some spots. But yeah, you should wait and upgrade your whole platform so you can get the full benefit of the newer GPUs. I'm not trying to be a smart butt or anything; I just wanted to help you out and show you something a lot of people are not aware of.
 

Premium Member · 3,688 Posts
i7 5820K
2x Asus Strix 1080 SLI
64GB Crucial DDR4-2400 DIMM
Asus Rampage V Extreme ATX (LGA 2011-v3)
Crucial M500 960GB SSD
Corsair AX1200i Digital ATX PSU

Not only will it work, but would selling my 1080s and buying one 2080 Ti be a big enough jump to justify it? Also, would I get the full power of it, or is my other hardware too old to bring it out? Like, is my CPU, mobo, etc. too old for this?
You are running two 1080s in SLI... That should be more than enough performance to max everything at 4K, let alone anything lower.

Having said that, I have a 1080 Ti and I can do it, so obviously the 2080 Ti is going to be able to do the same, or Nvidia is going to be handling a TON of returns.
 

Registered · 214 Posts · Discussion Starter #10
AC Origins is more than 10 frames; it's 20 or higher in some spots. But yeah, you should wait and upgrade your whole platform so you can get the full benefit of the newer GPUs. I'm not trying to be a smart butt or anything; I just wanted to help you out and show you something a lot of people are not aware of.
Saving me a grand is being a smart butt? Please, smart-butt away... I do a lot of video work, and for my next rig I will get the best processor I can afford, since most apps are processor-intensive these days. Based on what I have, when would it be worth it to do a full build? Is what I have too old? Would it be a huge jump if I built something like a hex-core i7, a 2080 Ti, and 32-64GB of 3000MHz RAM? I know bleeding edge is often overkill. Like, at one of the places I work they have the old 5K iMacs and brand-new $20,000 iMac Pros, and due to software issues the older iMacs run better at the moment.
 

Registered · 214 Posts · Discussion Starter #11
You are running two 1080s in SLI... That should be more than enough performance to max everything at 4K, let alone anything lower.

Having said that, I have a 1080 Ti and I can do it, so obviously the 2080 Ti is going to be able to do the same, or Nvidia is going to be handling a TON of returns.
Huh? Max everything out at 4K??? Which games are you playing? Most games don't even support SLI lately... I rue the day I even bought two video cards... I WISH I had bought one and then bought a 2080 Ti a year later. I have a 3440 x 1440 G-Sync monitor but also a 4K monitor. At 4K, the only modern games where I get decent performance are Destiny, Doom, and Wolfenstein. I know it's poorly optimized, but in Monster Hunter I get like 20 FPS at 4K on low.
 

Registered · 142 Posts
Based on what I have, when would it be worth it to do a full build?
Wait until Intel's and AMD's next round of CPUs; by then we should have this Spectre and Meltdown crap dealt with at the hardware level instead of with software patches and BIOS updates, which rob performance over time.

In the meantime, if you want more consistent performance, I would sell those two 1080s (SLI honestly isn't worth it and is hit or miss) and grab a nice second-hand 1080 Ti. I have one myself and it handles 4K nicely. Also, my MHW performance is around 35 FPS at maxed settings in 4K, if that helps you any, but my 1080 Ti is running at a constant 2050MHz core clock.

I'm also currently still on a 4790K, but at 4.7GHz, and I have about the best DDR3 RAM I can get, which is DDR3 2400MHz. I'm waiting for the newer CPUs from Intel and AMD before I upgrade my platform, but for the time being it's serving me OK.
 

9 Cans of Ravioli · 19,788 Posts
15 posts, all about linking one YouTube channel blabbing about RAM speeds being more impactful than they really are.

:rolleyes:

Look at any reputable channel like GamersNexus and the difference between 2400 and something like 3600 is less than 5 FPS.

Regardless - OP is asking about swapping two 1080 Tis for a 2080 Ti - he's not asking about his RAM speed.
 

Registered · 214 Posts · Discussion Starter #14
15 posts, all about linking one YouTube channel blabbing about RAM speeds being more impactful than they really are.

:rolleyes:

Look at any reputable channel like GamersNexus and the difference between 2400 and something like 3600 is less than 5 FPS.

Regardless - OP is asking about swapping two 1080 Tis for a 2080 Ti - he's not asking about his RAM speed.
It's 1080s, not Tis. Also, it is relevant, because I asked if my current hardware will bring out the full power of it. But what no one is answering is: what about my processor and mobo? Is there any bottlenecking there?
 

Overclocker · 11,634 Posts
Unless you run an Nvidia-branded/certified CPU, mobo, RAM, PSU, monitor, and cables, it won't work. Now, I'm just kidding... well, unfortunately not 100%: some Nvidia features really are locked and only supported on their own hardware, such as G-Sync-module-hindered, overpriced monitors.

Other than that, your CPU and your RAM speed and timings are what to consider. RAM speed is not that relevant; latencies are.

No one knows what software-stack changes or additions Nvidia made for their new GPUs; it is always possible they made things even worse than they are. It's a software and design company, after all, and for ages they have tried to offload whatever they can to the CPU. Did they finally bring back a hardware scheduler? No one knows. I didn't see them mention anything except trying to hype up ray tracing, which only a few expensive games/engines will support, and only through the proprietary GameWorks/hinderworks API for Nvidia RTX GPUs. Who knows if they will allow compute-unit processing of it on other GPUs, and you can be sure they will try to block it altogether on competitors' products, because they control that proprietary API of theirs, which often tanks performance even on their own GPUs when enabled/used.
 

Registered · 142 Posts
But what no one is answering is: what about my processor and mobo? Is there any bottlenecking there?



RAM frequency has a large, direct impact on CPU performance. As you can see from the videos I posted, all of those games hammer the CPU, so there's no simple answer regarding your CPU. Let's take an 8700K, for example, like you see in the videos I posted: there is an obvious CPU bottleneck with DDR4 2133 vs. DDR4 3000+MHz RAM. The CPU and RAM do not work as two separate entities in your computer; they work together, and as such RAM frequency impacts CPU performance.
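For a rough sense of how frequency combines with channel count into raw memory bandwidth, here is a back-of-envelope sketch (assuming the standard 64-bit-per-channel DDR4 bus; the platform pairings are illustrative, not benchmarks):

```python
# Theoretical peak DDR4 bandwidth: transfers/s * 8 bytes/transfer * channels.
def peak_bandwidth_gbs(mt_per_s: int, channels: int) -> float:
    """Peak bandwidth in GB/s for 64-bit (8-byte) DDR channels."""
    return mt_per_s * 8 * channels / 1000

# Quad-channel DDR4-2400 (e.g. an X99 board like the OP's):
print(peak_bandwidth_gbs(2400, 4))  # 76.8 GB/s

# Dual-channel DDR4-3000 (a typical mainstream test bench):
print(peak_bandwidth_gbs(3000, 2))  # 48.0 GB/s
```

One caveat this suggests: the OP's X99 platform is quad-channel, so its raw bandwidth at 2400MHz already exceeds a dual-channel bench at 3000MHz, and dual-channel RAM-speed comparisons don't map directly onto it.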
 

Registered · 142 Posts
RAM speed is not that relevant, latencies are.
Completely incorrect. This is what I am talking about: people are still clinging to the way things were a decade ago, when RAM frequency did not play as large a role in performance. Things have changed; today RAM frequency matters more, and increased RAM frequency increases memory bandwidth, which again matters a lot.

The videos I posted compare some of the tightest timings you can get on DDR4 2133 against a DDR4 3000 kit with mediocre timings, and the faster 3000MHz kit walks all over the tighter-timed 2133MHz kit.






2133MHz CL12 vs. 3000MHz CL16 (RAM Test in 10 Games)

The Witcher 3 RAM 2133MHz CL12 vs. 3000MHz CL16
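The CL12-vs-CL16 comparison above can be sanity-checked with quick arithmetic: absolute first-word latency is CAS cycles divided by the memory I/O clock. A minimal sketch (kit numbers taken from the video titles):

```python
# Absolute CAS latency in nanoseconds.
# DDR transfers twice per I/O clock, so clock (MHz) = MT/s / 2.
def true_latency_ns(mt_per_s: int, cas_cycles: int) -> float:
    io_clock_mhz = mt_per_s / 2
    return cas_cycles / io_clock_mhz * 1000  # cycles / MHz -> ns

print(round(true_latency_ns(2133, 12), 2))  # 11.25 ns
print(round(true_latency_ns(3000, 16), 2))  # 10.67 ns
```

So the DDR4-3000 CL16 kit is not just higher bandwidth; its absolute latency is also slightly lower than the "tight" 2133 CL12 kit, which is consistent with it winning those tests.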
 

Registered · 214 Posts · Discussion Starter #18
RAM frequency has a large, direct impact on CPU performance. As you can see from the videos I posted, all of those games hammer the CPU, so there's no simple answer regarding your CPU. Let's take an 8700K, for example, like you see in the videos I posted: there is an obvious CPU bottleneck with DDR4 2133 vs. DDR4 3000+MHz RAM. The CPU and RAM do not work as two separate entities in your computer; they work together, and as such RAM frequency impacts CPU performance.
So you are basically saying that getting better RAM answers my question, since I would need a newer processor that can handle higher RAM speeds?

Does this go for ALL other applications, or is the speed only applicable to gaming? Like, would this make Adobe apps or Maya run better as well?
 

Registered · 142 Posts
So you are basically saying that getting better RAM answers my question, since I would need a newer processor that can handle higher RAM speeds?
I am simply showing you how RAM frequency impacts CPU performance and, as such, can give a pretty large FPS boost when you are using a higher-end GPU. I am not sure what the max RAM speed is that your CPU and board can handle. Remember why it is important to have a fast CPU with a high-end GPU in the first place: to remove bottlenecks. But what many people do not understand when they buy a nice new shiny 8700K is that pairing it with DDR4 2133-2400MHz RAM holds back a lot of the CPU performance they would see with faster RAM.
 