

·
Registered
Joined
·
3,412 Posts
Discussion Starter · #1 ·
The 390 and the 970 were neck and neck in DX11 before, but in newer games the 390 is pulling ahead significantly. Can we expect the same to happen with the 480 vs the 1060 in a year's time (when NV releases its new series)? Probably.
 


·
Gamer
Joined
·
3,625 Posts
Just like the GTX 1060 vs the RX 480. I'm probably never going to buy Nvidia again; I took a gamble and totally regret it.
When I first bought this card it was doing better than the RX 480 8GB in DX11 titles. Now even the GTX 1060 6GB loses most of the time.
Nvidia's drivers have been trash lately as well. Smh, should have stuck with AMD.
 

·
Simpleton
Joined
·
1,895 Posts
AMD FineWine ™
 

·
Waiting for 7nm EUV
Joined
·
11,527 Posts
Quote:
Originally Posted by GamerusMaximus View Post

It's a shame AMD can never get this performance out at launch, when everyone is reviewing their hardware and comparing it to Nvidia.
The Hardware Canucks article linked above is from December 5th, which gives time for the round of Holiday season purchases.
 

·
Premium Member
Joined
·
6,181 Posts
I love GCN GPUs; AMD has supported them incredibly well, and they just keep on improving.

Just so everyone knows, the GTX 970 was running at 1290MHz and the R9 390 at 1000MHz, so there's a decent bit of overclocking headroom left for both cards, especially the GTX 970.
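As a rough back-of-the-envelope illustration of that headroom, here's a quick sketch; the "max stable" clocks below are assumptions for illustration, not measured values:

```python
# Rough clock-headroom estimate. The assumed_max values are hypothetical
# "typical max stable OC" clocks, not measured results.
tested = {"GTX 970": 1290, "R9 390": 1000}       # MHz, as run in the benchmarks
assumed_max = {"GTX 970": 1500, "R9 390": 1100}  # MHz, assumed for illustration

for card, clock in tested.items():
    headroom = (assumed_max[card] - clock) / clock * 100
    print(f"{card}: ~{headroom:.0f}% clock headroom left")
# -> GTX 970: ~16% clock headroom left
# -> R9 390: ~10% clock headroom left
```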
Quote:
Originally Posted by scorch062 View Post

Inclusion of frame times, 1%/0.1% lows or, at the very least, FPS minimums would give a bigger picture here.
Yeah, minimums are very important; the averages are quite promising though.
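For anyone unfamiliar with those metrics, here's a minimal sketch of how 1%/0.1% lows are typically computed from raw frame times; the frame-time data below is made up for illustration:

```python
# Computing average FPS and 1% / 0.1% lows from a list of frame times.
# The frame_times_ms list is made-up sample data, not real benchmark output.

def percentile_low_fps(frame_times_ms, fraction):
    """Average FPS over the slowest `fraction` of frames (0.01 -> 1% lows)."""
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, int(len(worst) * fraction))
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Mostly 60 FPS frames, plus a handful of stutters:
frame_times_ms = [16.7] * 950 + [33.3] * 45 + [100.0] * 5

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
print(f"average FPS: {avg_fps:.1f}")                                    # ~56.0
print(f"1% lows:     {percentile_low_fps(frame_times_ms, 0.01):.1f}")   # ~15.0
print(f"0.1% lows:   {percentile_low_fps(frame_times_ms, 0.001):.1f}")  # ~10.0
# The average looks fine while the lows expose the stutter, which is
# exactly why averages alone don't tell the whole story.
```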
 

·
Registered
Joined
·
384 Posts
Quote:
Originally Posted by tpi2007 View Post

The Hardware Canucks article linked above is from December 5th, which gives time for the round of Holiday season purchases.
And the majority of 480 reviews were done months ago. The sites doing them now don't have anywhere near the level of useful information that the launch reviews did.

And the launch reviews left a negative impression, especially once the 1060 reviews came out and everybody said the 1060 was faster. Heck, people on this forum still say the 1060 is faster, despite the new information.

AMD FineWine is great and all, but the first impressions from their early drivers are still there, negatively impacting them.

Hopefully, now that their drivers are good, they will have high-quality performance out of the gate with Vega for once.
 

·
Waiting for 7nm EUV
Joined
·
11,527 Posts
Quote:
Originally Posted by GamerusMaximus View Post

And the majority of 480 reviews were done months ago. The sites doing them now don't have anywhere near the level of useful information that the launch reviews did.

And the launch reviews left a negative impression, especially once the 1060 reviews came out and everybody said the 1060 was faster. Heck, people on this forum still say the 1060 is faster, despite the new information.

AMD FineWine is great and all, but the first impressions from their early drivers are still there, negatively impacting them.

Hopefully, now that their drivers are good, they will have high-quality performance out of the gate with Vega for once.
If people read those reviews back then, they probably also read the part about AMD's drivers getting better with time, so I'd expect they'd search for updated reviews when buying for the Holiday season.

Anyway, seems like Vega is on track:

http://www.universityherald.com/amp/articles/56607/20161222/amd-vega-10-outscores-nvidia-geforce-gtx-1080-ti-at-rra-gpu-certification-teases-launch-at-gdc-2017-video.htm
Quote:
AMD Vega 10 was the star during RRA GPU certification, its solid performance outscoring its expected rival, the NVIDIA GeForce GTX 1080 Ti, which underwent the same regulatory evaluation. Reports say that AMD Vega 10 did very well in all aspects.

RRA is South Korea's National Radio Research Agency, a regulatory board that tests and approves products from Silicon Valley-based companies before they are sold on the electronics market, Segment Next noted.
 

·
Registered
Joined
·
3,412 Posts
Discussion Starter · #12 ·
Quote:
Originally Posted by eTheBlack View Post

Is this just because AMD's drivers get better over time, or does Nvidia purposely make cards worse over time, or do they not actually properly test newer drivers with older GPUs?
From what I've heard/read in several places, AMD designed their cards so they don't have to optimize as much in the drivers. NV, on the other hand, has to optimize more, and when a new series comes out NV leaves the old architecture behind, giving it less attention.
 

·
Registered
Joined
·
3,240 Posts
From what I have observed of Nvidia drivers, unless there is a major issue with a particular game, they don't rework the drivers for games like AMD does to eke out more performance... Either they think their drivers are already optimized to the max, or it's a way to ensure obsolescence so everyone will buy the latest and greatest card rather than holding onto their old GPU...
 

·
Registered
Joined
·
384 Posts
Quote:
Originally Posted by eTheBlack View Post

Is this just because AMD's drivers get better over time, or does Nvidia purposely make cards worse over time, or do they not actually properly test newer drivers with older GPUs?
The first one.

The rumor that Nvidia harms previous generations of cards, "gimping" them, has been proven false by Hardware Canucks. Nvidia simply doesn't optimize their drivers for anything other than their current generation of GPUs. This used to be the norm in the GPU industry.

Nvidia cards also perform at near 100% on day 1. By the end of the first month, driver optimization for their architecture is pretty much complete.

AMD, OTOH, needs months to properly optimize, largely due to their smaller driver teams and more complicated uArch. AMD typically has more powerful GPUs in a given segment as a result.

Today, however, AMD has bucked the trend in two ways. The first is the age of GCN. They have tweaked the same arch since late 2011, in a very Intel-like fashion. This has resulted in them spending years eking out every last drop of GCN performance, which, when combined with Nvidia not optimizing for older uArches, leads to AMD products aging much better than Nvidia ones. They also still optimize for all GCN products, including their original ones, which is not normal in the GPU world. This was originally due to them using GCN 1.0 parts over and over again (Pitcairn was in three different product generations), but even with GCN 1.4 it seems AMD is still optimizing for GCN as a whole.
 

·
Banned
Joined
·
4,190 Posts
Quote:
Originally Posted by GamerusMaximus View Post

The first one.

The rumor that Nvidia harms previous generations of cards, "gimping" them, has been proven false by Hardware Canucks. Nvidia simply doesn't optimize their drivers for anything other than their current generation of GPUs. This used to be the norm in the GPU industry.

Nvidia cards also perform at near 100% on day 1. By the end of the first month, driver optimization for their architecture is pretty much complete.

AMD, OTOH, needs months to properly optimize, largely due to their smaller driver teams and more complicated uArch. AMD typically has more powerful GPUs in a given segment as a result.

Today, however, AMD has bucked the trend in two ways. The first is the age of GCN. They have tweaked the same arch since late 2011, in a very Intel-like fashion. This has resulted in them spending years eking out every last drop of GCN performance, which, when combined with Nvidia not optimizing for older uArches, leads to AMD products aging much better than Nvidia ones. They also still optimize for all GCN products, including their original ones, which is not normal in the GPU world. This was originally due to them using GCN 1.0 parts over and over again (Pitcairn was in three different product generations), but even with GCN 1.4 it seems AMD is still optimizing for GCN as a whole.
Did you just make all of this up?

What did Nvidia change about their architecture since Fermi, other than removing components to reduce power consumption?
 

·
Waiting for 7nm EUV
Joined
·
11,527 Posts
Quote:
Originally Posted by GamerusMaximus View Post

Nvidia cards also perform at near 100% on day 1. By the end of the first month, driver optimization for their architecture is pretty much complete.
That's not really true.

Remember the DX11 performance driver?

 

·
Registered
Joined
·
384 Posts
Quote:
Originally Posted by budgetgamer120 View Post

Did you just make all of this up?

What did Nvidia change about their architecture since Fermi, other than removing components to reduce power consumption?
Well, first of all, I didn't say that Nvidia changed their uArch; I said they don't OPTIMIZE for their older uArches, which is completely true. They no longer optimize for Kepler, Fermi, or Maxwell, only Pascal. And AMD did, certainly, use Pitcairn in multiple GPUs (7870, 270, 370).

As I said earlier in the comment, Hardware Canucks already tested the Nvidia gimping rumor as well. Performance in older games was steady, but in newer games Maxwell pulled ahead, suggesting that Nvidia was only optimizing their drivers for Maxwell. The older games came out before Maxwell, which is why the older cards were not affected.
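To make that test concrete, here's a minimal sketch of its logic; all FPS numbers below are invented for illustration. A flat result on a Kepler card across driver versions in older games points to simple neglect rather than active gimping, since sabotage would show up as a regression:

```python
# Sketch of the "gimping vs. simple neglect" check described above.
# All FPS numbers are invented for illustration, not real benchmark data.

kepler_fps = {
    # game: (old driver FPS, new driver FPS) on the same Kepler card
    "older game A (pre-Maxwell)":  (45.0, 45.2),
    "older game B (pre-Maxwell)":  (60.0, 59.8),
    "newer game C (post-Maxwell)": (40.0, 40.1),
}

for game, (old, new) in kepler_fps.items():
    delta = (new - old) / old * 100
    # A clear drop on the newer driver would suggest active gimping;
    # flat numbers suggest Nvidia simply stopped optimizing for Kepler,
    # so only the gap to Maxwell in new titles grows.
    verdict = "regression -> possible gimping" if delta < -3.0 else "steady -> no new optimization"
    print(f"{game}: {delta:+.1f}% {verdict}")
```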

And if you would like a list, here is what they changed for Maxwell vs Kepler: they made modifications to the ROPs, the TMUs, and the overall core design.
http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/2

They certainly did a lot more than "take stuff out to reduce power consumption".

Kepler was much closer to Fermi, although it still had significant changes.

http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/2

These are not huge alterations, but they are enough to make each generation different from the last. Maxwell and Kepler were quite different beasts, despite one being a heavily tweaked version of the other. And Nvidia doesn't seem to optimize for any of them other than the current one.
 

·
Banned
Joined
·
4,190 Posts
Quote:
Originally Posted by GamerusMaximus View Post

Well, first of all, I didn't say that Nvidia changed their uArch; I said they don't OPTIMIZE for their older uArches, which is completely true. They no longer optimize for Kepler, Fermi, or Maxwell, only Pascal. And AMD did, certainly, use Pitcairn in multiple GPUs (7870, 270, 370).

As I said earlier in the comment, Hardware Canucks already tested the Nvidia gimping rumor as well. Performance in older games was steady, but in newer games Maxwell pulled ahead, suggesting that Nvidia was only optimizing their drivers for Maxwell. The older games came out before Maxwell, which is why the older cards were not affected.

And if you would like a list, here is what they changed for Maxwell vs Kepler: they made modifications to the ROPs, the TMUs, and the overall core design.
http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/2

They certainly did a lot more than "take stuff out to reduce power consumption".

Kepler was much closer to Fermi, although it still had significant changes.

http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/2
The list of things Nvidia changed is almost exactly what AMD has been changing on GCN over the years, which is why I don't understand your explanation. The GTX 980 Ti is to the GTX 480 as the R9 Fury is to the 7970: the same architecture baseline with tweaks.

So Nvidia cards should be experiencing the same bumps in performance, if I'm supposed to go by your explanation of why Nvidia cards don't get an improvement.

Also, not optimizing so that one generation looks better than the other is gimping, or planned obsolescence. Not illegal, but not good for consumers.
 