Overclock.net - An Overclocking Community
Thread: CPU related issue or GPU

  Topic Review (Newest First)
10-29-2019 05:54 AM
cssorkinman
Quote: Originally Posted by Alastair View Post
I don't. The best I can do is BF4 when it comes to BF. Crysis 3 as well. I am not playing very new titles; I wait for them to go down in price a year or two after launch. I would say maybe it MIGHT have something to do with AMD's DX12 implementation?

Oh, I should have labeled those more precisely - the numbers shown were for DX 11, apologies.

I don't have the Fury any more so I can't generate any DX 12 numbers for it; I'll give it a go on the 2080.

I'll have to get DX 11 numbers for the 2600/570 combo.

I did a similar project with the FX/780 Ti in BF4 - not sure I still have the Shadowplay recording of the benchmarking route I used, but I think I should have the spreadsheet with the numbers - just gotta find it... lol
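
For anyone wanting to reproduce that kind of spreadsheet: a minimal sketch, not the route used in the thread, of turning a frame-time capture into average, minimum, and 1% low FPS in Python. The file name and one-value-per-line format are assumptions; adapt the parsing to whatever your capture tool writes.

    # Hypothetical helper: assumes a plain text file with one frame time
    # per line, in milliseconds. Adjust the parsing for your capture tool.
    def fps_stats(path):
        with open(path) as f:
            frame_times_ms = [float(line) for line in f if line.strip()]
        fps = [1000.0 / t for t in frame_times_ms if t > 0]
        fps_sorted = sorted(fps)                       # ascending: slowest frames first
        one_percent = fps_sorted[: max(1, len(fps_sorted) // 100)]
        return {
            "avg_fps": sum(fps) / len(fps),
            "min_fps": fps_sorted[0],
            "1%_low_fps": sum(one_percent) / len(one_percent),
        }

    print(fps_stats("bf4_run1_frametimes.txt"))  # hypothetical file name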
10-29-2019 04:33 AM
Alastair
Quote: Originally Posted by cssorkinman View Post
That's good to know.

Do you happen to own BF1?

EDIT: Revisiting the BF1 presets on 1080 DX11 - same route shown here: https://youtu.be/YfhL08s4IMw - notice that the Fury is kicking the 2080's butt (at low settings)?
I don't. The best I can do is BF4 when it comes to BF. Crysis 3 as well. I am not playing very new titles; I wait for them to go down in price a year or two after launch. I would say maybe it MIGHT have something to do with AMD's DX12 implementation?
10-28-2019 04:04 PM
cssorkinman
Quote: Originally Posted by Alastair View Post
Don't stress. Now that I have a new PSU we can consider this thread solved. But I love the conversation. Currently playing Ark on EPIC settings, 1080P, and the Vega is pegged right at 100%. But I know this game is really poorly optimized.
That's good to know.

Do you happen to own BF1?

EDIT: Revisiting the BF1 presets on 1080 DX11 - same route shown here: https://youtu.be/YfhL08s4IMw
Notice that the Fury is kicking the 2080's butt (at low settings)?
10-28-2019 03:08 PM
Alastair
Quote: Originally Posted by cssorkinman View Post
Not long ago games didn't look very good at less than the best graphics settings at a given resolution. They've improved (generally) to the point where it's not as important to max out the settings to have a really good visual experience. In some instances, going from high to ultra settings the frame rate penalty is significant while the change in quality is almost undetectable. Whether it's worth it is a question answerable only by each individual user.

The last time I did some comparisons *before the exploit mitigations within the Windows OS* the frame rate gain on the AMD cards (and my 780 Ti) from dialing back graphics settings just a tiny bit from max was disproportionately large compared to the effect on the RTX 2080 at 1080p.

I mainly used BF1 single player to minimize any CPU limits and maximize load on the graphics card.

I'll revisit that if I get time - maybe YouTube the results.


Sorry OP for the off-topic replies.
Don't stress. Now that I have a new PSU we can consider this thread solved. But I love the conversation. Currently playing Ark on EPIC settings, 1080P, and the Vega is pegged right at 100%. But I know this game is really poorly optimized.
10-28-2019 06:20 AM
cssorkinman
Quote: Originally Posted by BroadPwns View Post
1080p, but how many Hz - 60? Even with a V56 there are a lot of titles where you can't get close to 60 FPS at the highest settings.

Not long ago games didn't look very good at less than the best graphics settings at a given resolution. They've improved (generally) to the point where it's not as important to max out the settings to have a really good visual experience. In some instances, going from high to ultra settings the frame rate penalty is significant while the change in quality is almost undetectable. Whether it's worth it is a question answerable only by each individual user.

The last time I did some comparisons *before the exploit mitigations within the Windows OS* the frame rate gain on the AMD cards (and my 780 Ti) from dialing back graphics settings just a tiny bit from max was disproportionately large compared to the effect on the RTX 2080 at 1080p.

I mainly used BF1 single player to minimize any CPU limits and maximize load on the graphics card.

I'll revisit that if I get time - maybe YouTube the results.


Sorry OP for the off-topic replies.
10-28-2019 04:35 AM
BroadPwns
1080p, but how many Hz - 60? Even with a V56 there are a lot of titles where you can't get close to 60 FPS at the highest settings.
10-27-2019 07:34 AM
Redwoodz
Quote: Originally Posted by cssorkinman View Post
At 1080p anything above an RX 570 isn't really necessary or a good value proposition.

I have a 2600/570 rig and the RTX 2080 is in the 1800X - both have half of the same 32 GB set of RAM - I should do a comparison of them sometime.
I've got a Thuban 1100T/MSI 890FX/RX 470 combo, an FX8320E/FX6300/970AuroPro/RX 570 combo, and an R5 2600/X470 Taichi/RX 580 combo. Would be interesting.
10-27-2019 07:00 AM
cssorkinman
At 1080p anything above an RX 570 isn't really necessary or a good value proposition.

I have a 2600/570 rig and the RTX 2080 is in the 1800X - both have half of the same 32 GB set of RAM - I should do a comparison of them sometime.
10-27-2019 02:29 AM
Alastair
Quote: Originally Posted by ShrimpBrime View Post
Miss my old FX rigs. Never had any issues gaming, really. Almost all games were pretty smooth back then using 7-series GPUs like a Titan and GTX 770, HOF cards, etc... All gamed very well; CPU usage was only as demanding as the game required anyway.
Nowadays a lot of games list FX as a minimum spec, like CODMW and CODBO. It may have more to do with core count vs CPU/GPU bottlenecking, or per-core IPC.
I have no doubt at stock it will bottleneck, but I think at 4.8 and above, with a decent memory speed of 1866 or more and a 2600 NB, it should still manage in games with everything up to a single V64 or 1080. Anything more than that I think is asking too much.
10-26-2019 09:58 PM
ShrimpBrime
Quote: Originally Posted by cssorkinman View Post
You can demonstrate a CPU bottleneck on any CPU made. I had a CPU bottleneck in World of Tanks with a 4790K pushing the ancient 7970 *though it's probably more accurate to call it a software bottleneck*.
However, it's not nearly as pervasive as you might think. For example, in BF1 64-player multiplayer maps at 1080p low settings, a 4.9 GHz FX with the Fury would average 130 fps (90s min), where the overclocked 4790K would be around 155 (100 min) and the overclocked 1800X would be around 180 (120 min). Turn off HT on the 4790K and it would fall flat on its butt with minimums in the mid 40s. I experimented with downclocking my FX to the point where its minimums were that low... it was at 2.4 GHz.

Believe it or not, the RTX 2080 bottlenecks the FX a surprising amount of the time in BF1 and Crysis 3 (GPU usage above 95% with CPU usage in the 60s on Crysis 3), even at 1080p (albeit at the highest settings on C3 and at settings above high in the BF series).

There is something strange about the way the 10- and 20-series Nvidia cards work with the FX series - I've managed to have better performance with my 780 Ti in some cases (when both are at lower graphics settings). The kicker is that CPU usage was lower with the RTX 2080 - that's a head-scratcher.
Miss my old FX rigs. Never had any issues gaming, really. Almost all games were pretty smooth back then using 7-series GPUs like a Titan and GTX 770, HOF cards, etc... All gamed very well; CPU usage was only as demanding as the game required anyway.
Nowadays a lot of games list FX as a minimum spec, like CODMW and CODBO. It may have more to do with core count vs CPU/GPU bottlenecking, or per-core IPC.
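
As an aside on the usage readings quoted above ("GPU usage above 95% with CPU usage in the 60s"): a minimal sketch of how one might log those numbers while a game runs, assuming an Nvidia card and the psutil and nvidia-ml-py (pynvml) Python packages. The one-second interval and 60-sample duration are arbitrary choices, not anything used in the thread.

    # Hedged sketch: samples overall CPU utilization and GPU utilization once
    # per second so you can see which side is pegged during a benchmark run.
    # Assumes an Nvidia GPU and the psutil + nvidia-ml-py (pynvml) packages.
    import psutil
    import pynvml

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    try:
        for _ in range(60):  # log for roughly 60 seconds
            cpu_pct = psutil.cpu_percent(interval=1.0)  # averaged across all cores
            gpu_pct = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu
            print(f"CPU {cpu_pct:5.1f}%   GPU {gpu_pct:3d}%")
    finally:
        pynvml.nvmlShutdown()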