Overclock.net - An Overclocking Community
Thread: [Official] AMD Radeon VII Owner's Club
  Topic Review (Newest First)
07-07-2020 05:01 PM
thomasck
Quote: Originally Posted by MSIMAX View Post
I'm getting stuttering and game freezing; I'll try to revert back after work.
That was after installing a more recent driver like 19.11.3+? I faced that; it seemed like the driver was pulling more from the hardware (i.e. being more efficient) and my system was not stable anymore. Turned out the RAM needed some more voltage. I spoke about it a few pages ago; I think it's worth reading.

Good luck
07-07-2020 07:32 AM
MSIMAX
Quote: Originally Posted by Dasa View Post
In-game AA doesn't noticeably affect performance in The Witcher; it is just a post-process blur effect which, at 3440x1440 34", does lessen the noise on trees and such, but foreground objects do lose some detail. It is a balancing act between the sharpening filter and AA.
HairWorks AA does have a performance impact: 0x = 80 FPS, 4x = 78 FPS, 8x = 76 FPS.


For me (The Windows 10 May 2020 Update is on its way. Once it's ready for your device, you'll see the update available on this page):
I don't see any need to use the new beta, but I can test how the latest WHQL driver performs before the Windows update and after, if it allows, in the not too distant future.

Edit:
Seems to perform much the same as it used to with WHQL and pre-Win10-2004: ~80 FPS at 3440x1440 with tweaked settings like medium fog/reflections, with most other settings maxed.
I'm getting stuttering and game freezing; I'll try to revert back after work.
07-06-2020 04:38 PM
Dasa
Quote: Originally Posted by Offler View Post
Turn off AA completely on 4k.
In-game AA doesn't noticeably affect performance in The Witcher; it is just a post-process blur effect which, at 3440x1440 34", does lessen the noise on trees and such, but foreground objects do lose some detail. It is a balancing act between the sharpening filter and AA.
HairWorks AA does have a performance impact: 0x = 80 FPS, 4x = 78 FPS, 8x = 76 FPS.
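A quick way to read those HairWorks numbers is in frame-time rather than FPS; a throwaway sketch using only the 80/78/76 FPS figures above:

```python
# Frame-time cost of each HairWorks AA level, from the FPS figures above.
fps = {"0x": 80, "4x": 78, "8x": 76}

def frame_time_ms(f):
    """Milliseconds spent on one frame at f frames per second."""
    return 1000.0 / f

base = frame_time_ms(fps["0x"])
for level, f in fps.items():
    extra = frame_time_ms(f) - base
    print(f"{level}: {f} FPS = {frame_time_ms(f):.2f} ms/frame (+{extra:.2f} ms)")
```

So 8x AA costs roughly 0.66 ms per frame over 0x: a small but measurable hit.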

Quote: Originally Posted by MSIMAX View Post
anyone having poor performance with division 2 recent drivers and 2004 win update.
For me (The Windows 10 May 2020 Update is on its way. Once it's ready for your device, you'll see the update available on this page):
I don't see any need to use the new beta, but I can test how the latest WHQL driver performs before the Windows update and after, if it allows, in the not too distant future.

Edit:
Seems to perform much the same as it used to with WHQL and pre-Win10-2004: ~80 FPS at 3440x1440 with tweaked settings like medium fog/reflections, with most other settings maxed.
07-06-2020 10:09 AM
MSIMAX Is anyone having poor performance with Division 2 on recent drivers and the 2004 Windows update?

Also, all my known-good overclocks are gone; going back to 19.11.1. Must be the Windows update, I don't know.
07-01-2020 01:59 AM
Offler
Quote: Originally Posted by ZealotKi11er View Post
Man, I can't believe how much I have been missing by not using the Radeon 7. Just tried Witcher 3 @ 4K max settings and was getting 57 fps at stock, 69 fps at 2150/1200.
At these clock speeds it's basically on par with my 2080 Ti FE at stock. 1200MHz did make a difference here. I got about 3-4 fps before adding the core OC.
Turn off AA completely at 4K.
1. Check if you see any difference in image quality.
2. What are the FPS then?
3. What version of the driver are you on, and how does the game perform/stutter with regard to shader settings?

Good to know that it can compete with the 2080 Ti...
06-30-2020 07:44 PM
ZealotKi11er Man, I can't believe how much I have been missing by not using the Radeon 7. Just tried Witcher 3 @ 4K max settings and was getting 57 fps at stock, 69 fps at 2150/1200.
At these clock speeds it's basically on par with my 2080 Ti FE at stock. 1200MHz did make a difference here. I got about 3-4 fps before adding the core OC.
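For anyone curious about the relative gains, the figures above work out as follows (a quick sketch, assuming the 57/69 FPS result and taking the midpoint of the "about 3-4 fps" memory-only gain):

```python
# Percentage uplift from the Witcher 3 @ 4K figures above.
stock_fps = 57          # stock clocks
oc_fps = 69             # 2150 MHz core / 1200 MHz HBM2
mem_only_gain = 3.5     # "about 3-4 fps" from memory OC alone (midpoint)

total_uplift = (oc_fps - stock_fps) / stock_fps * 100
mem_uplift = mem_only_gain / stock_fps * 100
print(f"total OC uplift: {total_uplift:.1f}%")   # ~21.1%
print(f"memory OC alone: ~{mem_uplift:.1f}%")    # ~6.1%
```

Roughly a fifth of extra performance from the combined OC, with the HBM2 bump alone contributing about 6%.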
06-30-2020 04:30 AM
Offler
Quote: Originally Posted by WannaBeOCer View Post
That doesn't make sense; lower resolutions run at higher FPS because they're less demanding. I just tested Dynamic Super Resolution on my Titan, and games that would only use 280-300w are pegged at 320w with 4k set, and the GPU frequency starts to throttle.

This is the only testing I could find online:

https://www.youtube.com/watch?v=1z7h_ObUh-0
Quote: Originally Posted by Dasa View Post
I always assumed that 100% GPU usage meant just that, and that increasing resolution would increase the load on different parts of the GPU, like the memory bus, which may make it draw more power while other parts of the GPU draw less (which is why we get coil whine at higher FPS), so it may come down to game and card differences.
But it seems you are correct; I was basing that comment on my system power use increasing at lower resolution due to increased CPU load, at least until the point where the CPU becomes the bottleneck, after which power use starts to decrease again as GPU usage drops.

I just did a quick test with Firestrike

1080p, loop test 1, max settings:
~350w average, 377.8w peak from wall
GPU 207w average, 238w peak

4k:
~335w average, 373w peak from wall
GPU 221w average, 268w peak

GPU was at stock +77% powertune.
Power usage, recorded either with a physical wattmeter or with software, is for me the way to measure real utilization. And temperature, of course.

For some time now, 100% GPU utilization does not necessarily mean maximum possible utilization; in some cases I have found the GPU to be stalled at a low frequency due to a bug.

Also, higher resolution again does not mean higher utilization. In some cases the GPU might really run at 100% with maxed-out power draw at 1080p (without a frame cap), and when you increase the resolution, FPS will decrease while GPU utilization and power draw remain the same, but CPU utilization will decrease. Total power draw of the system may even go down by 100 watts...

That's why I prefer to tune game performance precisely to the display in use, using Vsync, a frame cap, and native resolution. Watts go way down, but it requires a lot of tuning if you want to remove stutters as well.
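The Firestrike numbers quoted above also show where the rest of the wall draw goes; a rough sketch subtracting the GPU's reported draw from the wall reading (this ignores PSU efficiency losses, so the remainder lumps CPU, board, and conversion loss together):

```python
# Wall draw minus reported GPU draw ~= CPU + board + PSU losses,
# using the Firestrike averages quoted above.
readings = {
    "1080p": {"wall": 350, "gpu": 207},
    "4k":    {"wall": 335, "gpu": 221},
}
for res, r in readings.items():
    rest = r["wall"] - r["gpu"]
    print(f"{res}: GPU {r['gpu']} W, rest of system ~{rest} W")
```

The non-GPU share drops from ~143 W at 1080p to ~114 W at 4k, which is consistent with the point that CPU load (and its power draw) falls once the GPU becomes the bottleneck.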
06-29-2020 04:28 PM
SoloCamo
Quote: Originally Posted by ZealotKi11er View Post
Anthem at 4K. I was getting 42 fps stock 1800MHz, 51 fps 2150MHz. I applied 1200MHz for memory and still 51-52 fps.
What if you do stock clock + 1200mhz HBM2?

When I noted 5-10 FPS, it was usually in games where I'm at or near 100 FPS as-is (BFV 4K ultra, many areas in 64-player maps).
06-29-2020 09:09 AM
WannaBeOCer
Quote: Originally Posted by thomasck View Post
That's the thing: if the card can use less power, does that mean it is not delivering all it could? I've run some FF at 4K and the power draw sticks to 244W almost all the time, even if 300W is set in MPT. The higher resolution kinda answers everything, but not really when I think about some games at max 210W.
No clue what AMD did with their drivers starting with 19.7.1 regarding power usage, but ever since that driver release the card only hits 300w when it is at around 1190mV+.

If you undervolted, you surely won't see 300w usage.
06-29-2020 06:40 AM
thomasck
Quote: Originally Posted by SoloCamo View Post
16gb card... but no max textures? Legitimately curious. Also, what driver? Power consumption seems low for that voltage and clock. What does it say in the Radeon overlay?
Yeah, I can't play COD MW with max textures; otherwise the game crashes here and there. Driver 20.5.1 is running with no issues. All data comes from HWiNFO64; I'm going to play a couple of matches to check that.

EDIT

Radeon Overlay says the same about power draw.

Quote: Originally Posted by WannaBeOCer View Post
1440p isn't as stressful as 4k, so the card uses less power. I noticed that when I used Virtual Super Resolution set to 4k the card used more power.
That's the thing: if the card can use less power, does that mean it is not delivering all it could? I've run some FF at 4K and the power draw sticks to 244W almost all the time, even if 300W is set in MPT. The higher resolution kinda answers everything, but not really when I think about some games at max 210W.

Quote: Originally Posted by ZealotKi11er View Post
Did you check temps? I only hit 250w if I don't hit 110C junction. With 1090mV you are probably hitting 110C junction temp if air-cooled.

Also, I was testing 1000 vs 1200 memory and I don't see any fps increase. This was at 4K.
Temperatures are low here because I'm on water; max junction is around 70-72 with sustained 63-68 all the time. Maybe that's the power-draw ceiling for 1950MHz at 1090mV.
