If they say nothing, you scream "coverup".
They should not have advertised it as a 4GB card, simple as that. If I had known ahead of time the card was effectively 3.5GB, my purchasing decision would have been far different. Saying that they wanted the card to have more power by appearing to be 4GB and "thinking of the gamers" is just a flat-out lie; they made it seem like a slightly slower 980 with the same memory size, but it simply isn't true.
To be fair, in the past they were upfront about this; see the 660 Ti. Nobody had a clue this was how the GTX 970 was set up. Let's just call this what it is and move on.
except when it is used and starts stuttering like crazy

Quote:
GTX 970 is a 4GB card. However, the upper 512MB of the additional 1GB is segmented and has reduced bandwidth. This is a good design because we were able to add an additional 1GB for GTX 970 and our software engineers can keep less frequently used data in the 512MB segment.
so in short, it's not the arrow, it's the Indian.

Quote:
We won't let this happen again. We'll do a better job next time.
Except the card DOES have 4GB and you can FULLY USE ALL 4GB. You saying that it only has 3.5GB just shows that you have no idea what you are talking about.

Quote: Originally Posted by clerick
They should not have advertised it as a 4GB card, simple as that. If I had known ahead of time the card was effectively 3.5GB, my purchasing decision would have been far different. Saying that they wanted the card to have more power by appearing to be 4GB and "thinking of the gamers" is just a flat-out lie; they made it seem like a slightly slower 980 with the same memory size, but it simply isn't true.
" new feature" - we got caught, quick spin it spin it spin it!
That may not be the answer anyone wants - consumers, gamers, NVIDIA, etc. - but it actually melds with where I thought this whole process would fall. Others in the media that I know and trust, including HardwareCanucks and Guru3D, have shown similar benchmarks and come to similar conclusions. Is it possible that the 3.5GB/0.5GB memory pools are causing issues with games today at very specific settings and resolutions? Yes. Is it possible that it might do so for more games in the future? Yes. Do I think it is likely that most gamers will come across those cases? I honestly do not.
If you are an owner of a GTX 970, I totally understand the feelings of betrayal, but realistically I don't see many people with access to a wide array of different GPU hardware changing their opinion on the product itself. NVIDIA definitely screwed up with the release of the GeForce GTX 970 - good communication is important for any relationship, including producer to consumer. However, they just might have built a product that can withstand a PR disaster like this.
See? They just failed to mention the new feature in the 970.
That means absolutely nothing. Once the card goes above 3.5GB, it starts to stutter; average framerate is not a proper judge of that.

Quote: Originally Posted by 47 Knucklehead
Except the card DOES have 4GB and you can FULLY USE ALL 4GB. You saying that it only has 3.5GB just shows that you have no idea what you are talking about.
Thus their advertising that it has 4GB is correct.
The issue is the last 0.5GB is slower and takes a performance hit IF you use it ... which most games don't.
But that has been discussed ad nauseam on the other thread.
http://www.overclock.net/t/1542468/pcw-nvidia-hit-with-false-advertising-suit-over-gtx-970-performance/700_50#post_23586764
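The penalty in that last 0.5GB can be put in rough numbers. Here is a back-of-the-envelope sketch assuming the GTX 970's published interface specs (256-bit bus, 7 Gbps GDDR5) and the widely reported 7+1 controller split; these are illustrative figures, not measurements:

```python
# Rough bandwidth math for the GTX 970's segmented memory.
# Assumptions: 256-bit interface, 8 x 32-bit controllers, 7 Gbps GDDR5,
# fast pool on 7 controllers, slow pool on the remaining 1.

BUS_WIDTH_BITS = 256
CONTROLLERS = 8
DATA_RATE_GBPS = 7  # effective GDDR5 data rate per pin

total_bw = BUS_WIDTH_BITS * DATA_RATE_GBPS / 8  # GB/s across the full bus

fast_pool_bw = total_bw * 7 / CONTROLLERS  # the 3.5GB segment
slow_pool_bw = total_bw * 1 / CONTROLLERS  # the 0.5GB segment

print(f"Full interface: {total_bw:.0f} GB/s")   # 224 GB/s
print(f"3.5GB segment:  {fast_pool_bw:.0f} GB/s")  # 196 GB/s
print(f"0.5GB segment:  {slow_pool_bw:.0f} GB/s")  # 28 GB/s
```

So under these assumptions the last half-gigabyte runs at roughly an eighth of the card's headline bandwidth, which is why spilling hot data into it shows up as stutter rather than a gentle slowdown.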
Yes, games do require more VRAM, but saying the last 512MB holds data that is "less accessed" is doublespeak.

Quote:
This is understandable. But, let me be clear: Our only intention was to create the best GPU for you. We wanted GTX 970 to have 4GB of memory, as games are using more memory than ever
That's why you actually go to the link I provided and read the article, where it talks about stutter and other metrics.
Looking at the frame time variance, a measure of potential stutter, there is no denying that the data indicates the GTX 970 exhibits more. From the 1.30x scaling testing on up to 1.50x scaling, the GTX 970 shows as much as 5ms frame variance for 10% of rendered frames. Testing on the GTX 980 indicates lower than 5ms frame variance on even the 1.40x scaled result for the last 10% of frames.
The visual of the frame variance graphs might be the most telling - notice that the GTX 970 graph clearly has the 1.10x and 1.20x performance results bunched together hugging the lower spectrum of the variance axis. But starting with 1.30x, the results separate off. Looking at the GTX 980 graph, that doesn't occur to the same level until the 1.50x result.
Clearly there is a frame time variance difference between the GTX 970 and GTX 980. How much of that is attributed to the memory pool difference compared to how much is attributed to the SMM / CUDA core difference is debatable but it leaves the debate open.
How are they spinning it? He's directly addressing the issue.
I have read those metrics and they are worthless to me; PCPer is an Nvidia shill, it is beyond obvious at this point. My 970 stutters in some games on the highest settings where a friend's 980 with the same settings does not (with roughly the same average framerate).
They are spinning it by trying to claim that the card does have 4GB as advertised, when clearly it does not and has huge problems. Saying "oh well, it's an extra portion that helps out past 3.5GB" is a load of garbage, since the 980 has a fully working 4GB and the same specs on their page for ROPs/speed.
That's illogical.

Quote: Originally Posted by clerick
They are spinning it by trying to claim that the card does have 4GB as advertised, when clearly it does not and has huge problems. Saying "oh well, it's an extra portion that helps out past 3.5GB" is a load of garbage, since the 980 has a fully working 4GB and the same specs on their page for ROPs/speed.
Yup, it's a PR disaster, and no amount of explaining things to people and showing technical readouts will change their minds.
Quote:
Seems pretty straightforward, with a proper "We messed up, it won't happen again" apology and an explanation. I don't see any spinning...
No.
In the prior generation of Kepler-derived GPUs, Alben explained, any chips with faulty portions of L2 cache would need to have an entire memory partition disabled. For example, the GeForce GTX 660 Ti is based on a GK104 chip with several SMs and an entire memory partition inactive, so it has an aggregate 192-bit connection to memory, down 64 bits from the full chip's capabilities.
Nvidia's engineers built a new feature into Maxwell that allows the company to make fuller use of a less-than-perfect chip. In the event that a memory partition has a bad section of L2 cache, the firm can disable the bad section of cache. The remaining L2 cache in the memory partition can then service both memory controllers in the partition thanks to a "buddy interface" between the L2 and the memory controllers. That "buddy interface" is shown as active, in a dark, horizontal arrow, in the bottom right memory partition on the diagram. In the other three memory partitions, this arrow is grayed out because the "buddy" interface is not used.
Thanks to this provision, Nvidia is able to equip the GeForce GTX 970 with a full 256-bit memory interface and still ship it at an attractive price in high volumes.
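The salvage logic the article describes can be modeled in a few lines. This is only a toy sketch of the Kepler-vs-Maxwell difference as described; the partition counts match GK104/GM204-class chips, but the function and names are illustrative, not Nvidia's actual hardware interface:

```python
# Toy model of die salvage: on Kepler, a bad L2 section takes a whole
# 64-bit memory partition offline; on Maxwell, the "buddy interface"
# lets the good L2 half serve both controllers, keeping full bus width.

PARTITIONS = 4          # GK104/GM204-class: 4 partitions x 64-bit
BITS_PER_PARTITION = 64

def bus_width(bad_l2_partitions, has_buddy_interface):
    """Usable memory bus width given partitions with a faulty L2 section."""
    width = PARTITIONS * BITS_PER_PARTITION
    if not has_buddy_interface:
        # Kepler-style: disable each affected partition entirely,
        # e.g. the GTX 660 Ti shipping with a 192-bit bus.
        width -= len(bad_l2_partitions) * BITS_PER_PARTITION
    # Maxwell-style: full width retained, at the cost of a slower
    # memory segment sitting behind the shared L2 section.
    return width

print(bus_width({0}, has_buddy_interface=False))  # 192 (Kepler salvage)
print(bus_width({0}, has_buddy_interface=True))   # 256 (GTX 970 salvage)
```

The trade-off the thread is arguing about falls out of the second branch: the bus width survives, but the segment behind the disabled L2 section is the slow 0.5GB pool.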
I already understand all that....

Quote: Originally Posted by 47 Knucklehead
No.
It is basically a method in the new Maxwell architecture that lets them keep the full 256-bit memory pipe on chips whose yields don't fully pan out during manufacturing, thus avoiding having to shut down a whole 1GB of memory or reduce the bus to a 192-bit pipe.
Quote:
This is a good design because we were able to add an additional 1GB for GTX 970 and our software engineers can keep less frequently used data in the 512MB segment. (Source: http://blogs.nvidia.com/blog/2015/02/24/gtx-970/)
It is not a mistake, so there is nothing to fix. The GTX 970 was obviously designed this way.

Quote: Originally Posted by ozlay
They probably knew it was a fail; that's probably why they made the price so low instead of fixing their mistake. So everyone that got one to run their multi-screen or 4K setups on the cheap got screwed over. That's the only thing that really bothers me, because Nvidia isn't known for cheap cards, and making the 970 so cheap at launch sounds fishy to me.