Originally Posted by parityboy
I also saw that HU video, and while people are still saying "drivers need to mature", I don't think drivers will make much difference. Drivers simply translate from common APIs into hardware-specific instructions, and hardware is hardware: it's not going to change. To my eyes it's painfully obvious that the majority of the games themselves simply favour vendor-specific (read: NVIDIA) features/extensions; they may as well say, "for best results install CUDA". It could be that many of these APIs have "options" parameters which can be used to pass in vendor-specific code, but I'm not a graphics developer so I don't know.
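To illustrate the vendor-extension point, here's a minimal sketch of how you might spot vendor-specific extensions in a GPU driver's extension list. The list below is a hypothetical sample for illustration; a real application would query the driver itself (e.g. via glGetStringi in OpenGL):

```python
# Map vendor-specific OpenGL extension prefixes to vendor names.
VENDOR_PREFIXES = {"GL_NV_": "NVIDIA", "GL_AMD_": "AMD", "GL_INTEL_": "Intel"}

def vendor_specific(extensions):
    """Group extension names by the vendor prefix they carry."""
    found = {}
    for ext in extensions:
        for prefix, vendor in VENDOR_PREFIXES.items():
            if ext.startswith(prefix):
                found.setdefault(vendor, []).append(ext)
    return found

# Hypothetical sample list -- not pulled from any real driver report.
sample = [
    "GL_ARB_direct_state_access",    # cross-vendor ARB extension
    "GL_NV_mesh_shader",             # NVIDIA-only
    "GL_AMD_gpu_shader_half_float",  # AMD-only
]
print(vendor_specific(sample))
```

A game that codes against the `GL_NV_` entries gets a fast path only on NVIDIA hardware, which is exactly the kind of tilt that no amount of AMD driver maturity can fix.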
For things to change, AMD/RTG clearly need to engage more closely with game and/or game-engine developers in order to give themselves a fair chance. Ironically, performance relative to the RTX 2080 is far closer with the open-source Linux driver than with the closed-source Windows driver.
Having said that, the Phoronix results tell a slightly different story. The OpenGL renderer seems to be more mature and performant than the Vulkan one overall, but one specific title (Rise of the Tomb Raider) runs much better on NVIDIA hardware no matter what.
Over the past few months I have seen countless comments along the lines of 'GCN needs to die, it can't compete with Nvidia', but after this launch, looking at how the Radeon 7 stacks up, it's pretty clear to me GCN is not the problem. In BF5 and Prey, games that both driver teams have focused on, the Radeon 7 is nearly 15% faster (in DX12, of course). AMD and GCN's problem is a mix of software-side things:
- Games targeting Nvidia hardware, as well as Nvidia's proprietary features, as you say
- Poor drivers in smaller/older titles (an issue stemming from their resource-stretched driver team)
- Reviewers benching the majority of titles in DX11, which compounds the above issue
They could change their architecture completely, but unless they reverse-engineer Turing and its successor to create a mirror-image arch of their own, these problems will remain just as they are with GCN. GCN is compute-focused, but it wouldn't make sense to abandon that when the coming PS5 has an APU with heavy compute customization (speculation, but it does have Navi-based GPU cores).
Originally Posted by Majin SSJ Eric
As I said, this is simply one example of the many varied reasons why I don't put much stock in day-one tech reviews and prefer to make my own determinations about the total performance of any hardware by looking at the results end-users here on OCN achieve after launch (and this goes for any brand). There are clear biases among media tech reviewers (some of which are absolutely glaring) which are well documented; they are often using pre-release drivers whose issues are fixed almost immediately in the release drivers; they are usually rushed in order to get their results out first for all of those tasty and delicious clicks; and they often do little if anything to optimize their performance results beyond simply plugging the card into the slot and running the benches they always run (many of which are well outdated and somewhat irrelevant).
Hardware Unboxed is (IMO) one of the more anti-AMD tech channels out there on YT, and even their numbers put the RVII only 7% behind the 2080 in overall average performance, and of course that is with no attempt whatsoever to find any hidden performance through simple and well-known methods such as undervolting.
So there you go, another MASSIVE victory by Almighty Nvidia, whose "Revolutionary" RTX 2080 manages to utterly ANNIHILATE the sad-sack Radeon VII by a whole SEVEN PERCENT!!!! In all seriousness, I still maintain that the 2080 is probably the better card for most gamers right now, but it's not like you'd have to be a total idiot to buy a Radeon VII, since the performance and feature deltas between the two cards are not very significant at all (basically, if the 2080 is averaging 60 FPS in a game, the "lowly" RVII will be struggling all the way down at an utterly peasant-like 56 FPS).
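For anyone who wants to sanity-check that 7% framing, here's the back-of-envelope arithmetic (my own illustrative numbers, not from any review):

```python
# A card 7% slower than a 60 FPS reference lands at 60 * (1 - 0.07) = 55.8,
# i.e. roughly 56 FPS -- the "peasant-like" figure above.
def slower_fps(reference_fps, deficit):
    """FPS of a card that is `deficit` (fraction) slower than the reference."""
    return reference_fps * (1 - deficit)

print(round(slower_fps(60, 0.07), 1))  # 55.8
```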
The devil is often in the details, which get lost in the react-first-think-later culture we have today. So the Radeon 7 is 7% or 15% slower than the 2080, and that's that, apparently. That's the headline, and then comes the typical toxic fanboy reaction: 'AMD have failed again', 'this card is useless', '3 years late', etc. Any questioning of these figures is shot down as desperate or as making excuses!
But I like the details and specifics - what resolution was used to get that percentage? What games? It turns out Hardware Unboxed is using Warframe, World of Tanks, World of Tanks HD and other games that have absolutely horrible performance on AMD cards. That's worth considering. So this isn't a terrible card at all - it's faster than or neck and neck with a 2080 in new DX12 titles, consumes slightly more power and is a bit noisy, but comes with 3 free games.