
[TT] NVIDIA should launch its next-gen Pascal GPUs with HBM2 in 2H 2016 - Page 59

post #581 of 724
Quote:
Originally Posted by caswow View Post

The fact that AMD doesn't need dedicated hardware in their FreeSync monitors says otherwise.

...Yes, yes it does. Why else would AMD be so proud of their hardware partnerships?

http://www.amd.com/en-us/press-releases/Pages/support-for-freesync-2014sep18.aspx

Learn the facts, please, if you plan to continue this discussion.
post #582 of 724
Quote:
Originally Posted by Mand12 View Post

The demos were presented as functional, working variable refresh. And they weren't. They were outright fabrications, billed as something not even close to the reality. And it requires new hardware, which AMD said it wouldn't and then later said it would, because they were asking the OEMs to develop said hardware.

The arrival date is not what the lies were about, and them not lying about one thing does not mean they didn't lie about others.

Are you really saying the ends justify the means? That getting FreeSync within 18 months justifies a marketing campaign like "Free is better than G" when G-Sync had a functional, working product available at retail and AMD had false demos?

In all seriousness, AMD fans are the ones who should be the most upset by the FreeSync debacle. Yes, AMD got caught flat-footed by G-Sync, but instead of saying "Damn, we'll just have to work hard to catch up" they decided to lie about how much progress they had made, and also lie about G-Sync in order to discourage people from early adoption to buy themselves time. That's a sign of a company that is severely under-resourced.
We went through this in the old threads; your only proof of their lies was that there was no FPS counter on screen. Again, these were demos, not final product showcases. Why would anyone be upset about FreeSync? It's actually one of the rare occasions AMD delivered on time. You want a lie? Try selling a 224-bit bus, 3.5+0.5GB card as a 256-bit bus, 4GB one, then try to deny it at first, then call it a typo/clerical error, only to have your CEO explain it away in some halfa**ed apology as a "new feature", as if they would ever miss out on an opportunity to sell you on "new features".
post #583 of 724
Quote:
Originally Posted by Mand12 View Post

Did I say that I liked the approach they took with 970? No? Okay then.

Yes, they made a mistake on the spec sheet. No, they shouldn't have made it. Yes, they should have brought it up sooner, but I honestly can't fault them for not nitpicking a canned press item that probably got internally approved (mistakenly) months before publication.

But the furor around 970 is, actually, completely overblown. Had the spec sheet been correct, the reviewers would have said "Huh, that's weird, people haven't done that before in a graphics card. Let's see how it performs. Oh, hey, these benchmarks are great, guess it's no big deal." There is literally zero upside to Nvidia "lying" about 970's memory performance. People should never buy based on spec sheet numbers, particularly obscure ones like that. The translation from spec sheet number to real-world performance is murky for any of them, but it's particularly bad for that number.


So you have a year and a half worth of misleading demos, lies about capability, lies about price, lies about progress, lies about how much they're leaving to monitor OEMs, lies about parity with their competitor, and lies about their competitor's performance. And you want to compare that to a misprint that has zero upside for the company, and was only ever going to blow up in their face? Seriously?

Lying requires intent. AMD proved throughout FreeSync's development that they consider active disinformation a viable marketing tactic. 970 was a mistake, but it was not malicious.

Except we don't live in a vacuum, and alternatives to the 970 do exist. As I've said more times than I care to count, the whole thing pissed me off because I didn't buy my 970 on launch day; I waited a good month to make sure I wouldn't be an early beta tester, and so I'd have more reviews to read. At around the same time, AMD started their fire sale on the 290X, and at one point a 290X Lightning could be had for $299. Had I known about the segmented memory BS, I would've been much more inclined to buy the 290X Lightning instead. Or hell, grab a 290 for even cheaper, since it's mostly within 5% of the 290X anyway, so the functional difference is effectively zero in actual games. And yes, I realize power efficiency is very important to some people; I couldn't care less about it.

Every purchase decision incurs an opportunity cost, so it behooves me as the consumer to make the best choice for my money. I did my due diligence to the best of my abilities, yet nVidia hindered my decision-making process by hiding a piece of critical information that would've swayed my decision against their favor.
Edited by magnek - 2/29/16 at 9:19am
post #584 of 724
Quote:
Originally Posted by PlugSeven View Post

We went through this in the old threads; your only proof of their lies was that there was no FPS counter on screen. Again, these were demos, not final product showcases. Why would anyone be upset about FreeSync? It's actually one of the rare occasions AMD delivered on time. You want a lie? Try selling a 224-bit bus, 3.5+0.5GB card as a 256-bit bus, 4GB one, then try to deny it at first, then call it a typo/clerical error, only to have your CEO explain it away in some halfa**ed apology as a "new feature", as if they would ever miss out on an opportunity to sell you on "new features".

It doesn't matter if it's not a final product showcase, they claimed it was showing variable refresh when it wasn't showing variable refresh.
post #585 of 724
Quote:
Originally Posted by HMoneyGrip View Post

I hope that when this thing comes out, it will be able to run the AAA titles of the time at 4K with an average of 60 FPS at max/ultra settings, on one card. I hope that is truly the minimum performance we can expect from whatever Nvidia and AMD come out with later this year.

Hoping 2016 will be the breakthrough year when we can truly enjoy 4K gaming without totally breaking the bank, and 4K television as well (Ultra HD Blu-ray players, and hopefully more streaming options from Apple/Netflix/Amazon/Google Play).

Highly doubt that... maybe in 2018 or so. Two 980 Tis struggle with some games maxed out. GTA V is a killer... Crysis 3 & Metro also do two of them in.
 
"MSI GT70"
(15 items)
 
 
CPUGraphicsRAMHard Drive
Intel Core i7 4700MQ NVIDIA Geforce GTX 780M 32GB 1600MHz 128GB SSD Triple Raid 
Hard DriveHard DriveHard DriveOptical Drive
128GB SSD 128GB SSD 1TB  Blueray 
OSMonitor
Windows 8 Professional  17: 1920x1080 
  hide details  
Reply
 
"MSI GT70"
(15 items)
 
 
CPUGraphicsRAMHard Drive
Intel Core i7 4700MQ NVIDIA Geforce GTX 780M 32GB 1600MHz 128GB SSD Triple Raid 
Hard DriveHard DriveHard DriveOptical Drive
128GB SSD 128GB SSD 1TB  Blueray 
OSMonitor
Windows 8 Professional  17: 1920x1080 
  hide details  
Reply
post #586 of 724
Quote:
Originally Posted by Mand12 View Post

It doesn't matter if it's not a final product showcase, they claimed it was showing variable refresh when it wasn't showing variable refresh.
I would ask for proof of the bolded claim above IF you had any. This is a Pascal thread, so post it in the FreeSync thread if you've got any; we've derailed this enough.
post #587 of 724
Quote:
Originally Posted by PlugSeven View Post

I would ask for proof of the bolded claim above IF you had any. This is a Pascal thread, so post it in the FreeSync thread if you've got any; we've derailed this enough.

You mentioned the prior threads; it was all there. If you fail to believe what your own eyes see, then there's not much I can do about it.
post #588 of 724
Quote:
Originally Posted by magnek View Post

Except we don't live in a vacuum, and alternatives to the 970 do exist. As I've said more times than I care to count, the whole thing pissed me off because I didn't buy my 970 on launch day; I waited a good month to make sure I wouldn't be an early beta tester, and so I'd have more reviews to read. At around the same time, AMD started their fire sale on the 290X, and at one point a 290X Lightning could be had for $299. Had I known about the segmented memory BS, I would've been much more inclined to buy the 290X Lightning instead. Or hell, grab a 290 for even cheaper, since it's mostly within 5% of the 290X anyway, so the functional difference is effectively zero in actual games. And yes, I realize power efficiency is very important to some people; I couldn't care less about it.

Every purchase decision incurs an opportunity cost, so it behooves me as the consumer to make the best choice for my money. I did my due diligence to the best of my abilities, yet nVidia hindered my decision-making process by hiding a piece of critical information that would've swayed my decision against their favor.

Since people are making such a big deal of this 3.5GB memory issue, would the GTX 970 have sold as well as it did if the issue had been made apparent from the beginning?

If it did not sell as well, wouldn't that have had less of an impact on AMD Radeon card sales, and thus left 290X/290 pricing less affected? Hawaii cards were $499-550 for the 290X, and 290 pricing was holding steady at $399. It wasn't until the launch of the GTX 970/980 series that AMD sales collapsed and pricing fell with them. The 970's sales played the greater role, and it took a couple of months for AMD to officially respond.

The fantastic rate of sales of the GTX 970 is what made the fire-sale pricing of the Hawaii cards possible. If the issue was as enormous as you are making it out to be, the fire sale would not have been possible.

You can't say you would have bought a 290X instead at $299 had you been informed of the memory issue ahead of time, when pricing might very well never have sunk to that level.

The value of the 290/290X cards was made possible by the GTX 970 cratering their sales. Take away those sales and you don't see such a price drop in the AMD cards.
post #589 of 724
Quote:
Originally Posted by caswow View Post

The full DX12 slide is funny too. What do you say about this?




Corporations, what else.
post #590 of 724
Quote:
Originally Posted by tajoh111 View Post

Since people are making such a big deal of this 3.5GB memory issue, would the GTX 970 have sold as well as it did if the issue had been made apparent from the beginning?

If it did not sell as well, wouldn't that have had less of an impact on AMD Radeon card sales, and thus left 290X/290 pricing less affected? Hawaii cards were $499-550 for the 290X, and 290 pricing was holding steady at $399. It wasn't until the launch of the GTX 970/980 series that AMD sales collapsed and pricing fell with them. The 970's sales played the greater role, and it took a couple of months for AMD to officially respond.

The fantastic rate of sales of the GTX 970 is what made the fire-sale pricing of the Hawaii cards possible. If the issue was as enormous as you are making it out to be, the fire sale would not have been possible.

You can't say you would have bought a 290X instead at $299 had you been informed of the memory issue ahead of time, when pricing might very well never have sunk to that level.

The value of the 290/290X cards was made possible by the GTX 970 cratering their sales. Take away those sales and you don't see such a price drop in the AMD cards.

You'd have a great point IF the choice was a simple dichotomy (it's not) AND the 980 didn't exist (it did).

If the segmented memory turned out not to be a big deal, then things would've played out more or less as I laid out in my post above, and nothing would've changed.

Things get more interesting if the segmented memory was such a huge deal that the 970 wouldn't have threatened 290(X) sales. Well, that's where the 980 screws things up for AMD. At launch, the 980 was 10-25% faster than the 290X depending on resolution, and it sold for $550. There's just no way the 290X would've kept its $550 price tag, period. But of course the million dollar question now would be: how would AMD have responded in this particular scenario?

I don't know and wouldn't pretend to know, but I'd like to think that, at a minimum, they'd want to at least match the price/perf of the 980. Since the 980 had a 10% lead even at 4K, that means at least a 10% price cut, which would bring the 290X's price down to $500. And the 290, well, again a big question mark. It just traded blows with the 970 even at 4K, and was ~10% slower at 1080p. But since we're talking about the hypothetical situation in which the segmented memory on the 970 was such a huge deal that it wouldn't have affected Hawaii sales much, I'd wager the 290 likely would've kept its $400 price tag, or, if AMD really wanted to be competitive, been priced the same as the 970.
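
For what it's worth, here's that napkin math spelled out (just a rough sketch in Python; the $550 launch price and the ~10% lead are the figures quoted above, not numbers pulled from any review database):

Code:
# Back-of-envelope price/performance parity check.
# Inputs are the rough figures from this post; real review numbers vary by game and resolution.
gtx_980_price = 550        # USD, 980 launch price
gtx_980_lead = 0.10        # 980 roughly 10% faster than the 290X at 4K

# For the 290X to match the 980's price per unit of performance,
# its price needs to drop by about the size of that performance lead:
r9_290x_parity_price = gtx_980_price / (1 + gtx_980_lead)
print(round(r9_290x_parity_price))   # -> 500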

So then the choices become:

  • $550 980
  • $500 290X
  • $400 or $330 290
  • $330 970 with segmented memory

And if those were the choices I faced, I would've either looked to the second-hand market, or just sat it out and not bought anything, since none of the choices would've been particularly appealing.
Edited by magnek - 2/29/16 at 1:19pm