Looks like they don't care about testing on Ryzen, which is disappointing. Maybe in the future?
Quote:
"Ideally I would have liked to include some more tests such as quality scaling results for say the GTX 1060 and RX 480, but I simply ran out of time thanks to Origin constantly locking me out for 24 hours after swapping components. That being the case, I will aim to add the GeForce GTX 700 series and Radeon 200 series later this week followed by some CPU testing."
Do you think review sites will just test new games for Intel and just say that Ryzen is 5% slower and call it a day? Some games will scale differently with different processors.
7700K at 4.9
Quote:
"Do you think review sites will just test new games for Intel and just say that Ryzen is 5% slower and call it a day? Some games will scale differently with different processors."
I'd be more interested to understand whether 8 or 10 physical cores make any difference. I don't use Hyper-Threading, so I'll have to see what results I get tonight when the game is released.
Pretty poor timing; dunno why they didn't wait for updated Nvidia drivers for the game. Looking at the benchmarks, Pascal isn't performing where you'd expect it to relative to Maxwell.
Quote:
"Pretty poor timing; dunno why they didn't wait for updated Nvidia drivers for the game. Looking at the benchmarks, Pascal isn't performing where you'd expect it to relative to Maxwell."
Seems like one of the few newer titles that favors Nvidia cards. Interesting that it's a Frostbite game as well; usually that engine plays nicer with AMD cards.
Quote:
"Seems like one of the few newer titles that favors Nvidia cards. Interesting that it's a Frostbite game as well; usually that engine plays nicer with AMD cards."
Quote:
"Pretty poor timing; dunno why they didn't wait for updated Nvidia drivers for the game. Looking at the benchmarks, Pascal isn't performing where you'd expect it to relative to Maxwell."
"For testing we used the AMD Radeon Crimson Edition Graphics Driver 17.3.2 Hotfix for testing the Radeon GPUs and the Nvidia based GPUs were tested using the GeForce Game Ready Driver 378.78."
Quote:
"I'd be more interested to understand whether 8 or 10 physical cores make any difference. I don't use Hyper-Threading, so I'll have to see what results I get tonight when the game is released."
HT has been used by games for going on several years now.
Sorry about your loss! Turn that back on, dog, you're holding yourself to i5 performance.
Even if you take a small clock-speed hit, it may be worth it to have it running.
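Whether SMT/Hyper-Threading helps depends on the workload, so the "see what results I get tonight" approach is reasonable. A rough way to eyeball it yourself is to time a CPU-bound task at half the logical CPUs (roughly the physical core count when SMT is on) versus all of them. This is a minimal sketch, not a rigorous benchmark; the `burn`/`timed_run` names are made up for illustration:

```python
import os
import time
from concurrent.futures import ProcessPoolExecutor

def burn(n: int) -> int:
    # CPU-bound busywork standing in for a game's worker-thread load
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed_run(workers: int, tasks: int = 8, n: int = 2_000_000) -> float:
    # Time `tasks` chunks of work spread across `workers` processes
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        list(pool.map(burn, [n] * tasks))
    return time.perf_counter() - start

if __name__ == "__main__":
    logical = os.cpu_count() or 1
    # Compare half the logical CPUs against all of them; if the gap is
    # small, the extra SMT threads aren't buying much for this workload.
    for workers in (max(1, logical // 2), logical):
        print(f"{workers} workers: {timed_run(workers):.2f}s")
```

Real games are far more mixed (render thread, asset streaming, audio) than pure number crunching, so treat any result as a hint, not a verdict.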
The game doesn't seem to be as heavy on tessellation as others, but let's pretend it is so some people will have an excuse.
After all, god forbid Nvidia be better than AMD in a game. The sky will fall. Pigs will fly. And climate change will be found to be the work of Scientology.
They've already gotten up to a 35% boost on ALL AMD cards by turning down tessellation... with an unnoticeable visual difference. What do you get out of being so butthurt about this? It's only the hundredth time this has happened...
Quote:
"The game doesn't seem to be as heavy on tessellation as others, but let's pretend it is so some people will have an excuse. After all, god forbid Nvidia be better than AMD in a game. The sky will fall. Pigs will fly. And climate change will be found to be the work of Scientology."
I would enjoy it very much if this became true. This path would make an excellent final chapter for Part 1 of the Climate Change Saga and also be one hell of a cliffhanger for Part 2.
Quote:
"The game doesn't seem to be as heavy on tessellation as others, but let's pretend it is so some people will have an excuse. After all, god forbid Nvidia be better than AMD in a game. The sky will fall. Pigs will fly. And climate change will be found to be the work of Scientology."
OK guys, in all honesty, there is no circumstance in which a GTX 1060 should be doing better than an R9 390X or an RX 480; there just isn't. AMD drivers are junk/whatever, but that is still a scenario that should never happen.
What this tells me is that this game is horribly optimized. It's not hard to believe, with all the janky facial animations and robotic movement/lack of physics. I've seen a few streams of this game already; this isn't the same Mass Effect that you remember, it's complete trash.
Quote:
"They've already gotten up to a 35% boost on ALL AMD cards by turning down tessellation... with an unnoticeable visual difference. What do you get out of being so butthurt about this? It's only the hundredth time this has happened..."
Oh, I have seen the "35%" claim. How strange that it doesn't hold. www.pcgameshardware.de, who claimed they got a tessellation-fixed driver from AMD that reduces tessellation at the expense of image quality (AMD's claim, so yeah...), still showed those GPUs performing a lot worse than Nvidia's.
They gained 12% by turning off tessellation compared to the fixed AMD driver, and 26% compared to the default driver without the fix. Not "35%".
And I wonder how much Nvidia gains from removing all tessellation? Also 30-40%, I expect? So it's the same performance gap again?
Yet they still got lower FPS than Nvidia even after the "AMD fix". They have an AIB 1070 running loops around the Fury X at 3440x1440, the 970 doing almost the same as a 390 at 4K, and the 980 Ti (AIB, though) doing 12% faster than the Fury X.
Sorry, quoting a 35% boost from turning down tessellation while not stating Nvidia's gain at the same settings is a cheap trick to try to prove a point. Try. Not successful yet.
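The 12%/26% figures quoted above are internally consistent, and a bit of arithmetic shows what the fixed driver alone is worth. This sketch assumes a hypothetical 100 fps baseline purely to check that the percentages line up; the actual numbers are in the linked review:

```python
# Assumed baseline of 100 fps on the default driver (hypothetical figure,
# used only to check that the quoted percentages are mutually consistent).
default_fps = 100.0

fps_tess_off = default_fps * 1.26       # tessellation off: +26% over default
fps_fixed_driver = fps_tess_off / 1.12  # tess-off was +12% over the fixed driver

gain_fixed_over_default = fps_fixed_driver / default_fps - 1
print(f"Fixed driver over default: {gain_fixed_over_default:.1%}")
```

So the fixed driver alone recovers roughly 12-13% over the default, which is consistent with the review's numbers and well short of the headline "35%" claim.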
Quote:
"OK guys, in all honesty, there is no circumstance in which a GTX 1060 should be doing better than an R9 390X or an RX 480; there just isn't. AMD drivers are junk/whatever, but that is still a scenario that should never happen. What this tells me is that this game is horribly optimized. It's not hard to believe, with all the janky facial animations and robotic movement/lack of physics. I've seen a few streams of this game already; this isn't the same Mass Effect that you remember, it's complete trash."
You can look here for a review with AMD's fixed drivers specifically for this game.
While they are using AIB cards, the 1060 does better than the Fury X at 1080p, it starts to match at 1440p, and at 4K, while both are unplayable at those settings, the 480 is a bit better.
It could definitely be that the game's code paths are very Nvidia-friendly; the game was developed on Nvidia hardware and runs better on Nvidia hardware. This is the same as games developed on AMD hardware always running better on AMD hardware.
Part of the game between the two manufacturers. Each one plays to its own strengths. If Nvidia can do more tessellation, they will definitely overuse it. If AMD can do more async compute, they will definitely overuse it. But if the first happens it's Nvidia's fault, and if the second happens it's also Nvidia's fault? Kind of cheating on perspective, isn't it?
Overclock.net