Originally Posted by ESP
Well that's a bit odd, I was expecting it to be all-around faster than AMD. It makes sense though, because Nvidia only seems to be faster than AMD in very particular games, which makes me think it has more to do with them optimizing their cards for those games, or optimizing those games against AMD cards.
Am I the only one that finds it odd that Nvidia performance is so drastically different from game to game?
Also, while Nvidia was the first to hop on GPU compute architecture, it would be a bit premature to sacrifice current game performance for future game performance. That doesn't really make any sense. Unless the reason it will run faster in DX11 games is that they had a hand in developing those games...
Name one DX10/DX9 game that can't be maxed out on current cards. While not all are above 60fps minimum, most new games are going to be DX11. BF3 showed you can make a game DX10/11 only and still sell a lot of copies; you can bet that other companies are thinking hard about that.
Originally Posted by Murlocke
So the GTX 680 is smaller, roughly the same speed, probably requires less power, and is likely to be much cheaper than the 7970.
Way to go NVIDIA. I might buy two depending on power consumption.
According to many sources the card will be around $350. That would put this card at a much better buy than anything ATI has on the market. What are you seeing that I'm not?
This is a score for NVIDIA if they launch at the prices rumored. A big score. It renders the 7950 and 7970 terrible buys unless ATI lowers their prices. Equal, and sometimes better, performance for nearly half the price.
QFT. This chip is tiny for nVidia cards; 28nm is treating them really well. And it'll be like Fermi: a lot of hate for the 6 series, but when nVidia refreshes it into the 7 series, people will love it.
Originally Posted by test tube
Three weeks ago it was destroying the 7970, now it's faster in some cases? ;\
Keep in mind that at $350 it'd be competing against HD6970s/HD7870s, not the HD7970 at current prices. Not to mention, "destroying" doesn't always mean performance only; a HD4870 destroyed a GTX 260 simply because it offered similar performance for a lot less. And finally, a lot of OCN members took "Kepler is better than HD7970" to mean "IT DESTROYS IT".
Originally Posted by Sapientia
If it outclasses Tahiti, don't expect $350. I'd be absolutely shocked if that happened. It'll probably fall in between the 7950/70 in most cases, and it will be priced to match them.
Why not? nVidia might want to take AMD down a peg or two; they've lost a lot of marketshare recently. (It went from something like 70/30 to 50/50 in the past few years.)
The HD4870 also disproves that statement: it beat the original GTX 260 and was priced much lower (around $200 lower).
Originally Posted by ESP
Does that mean that a 7870 is a way better buy? Because it is exactly what you just described. We've got benchmarks and everything for that card.
Except the HD7870 doesn't beat the HD7970... This one is meant to, going by current rumours.
Originally Posted by j0zef
I don't believe the $350 price tag, for multiple reasons:
1) It will undercut the price of GTX 580
2) 7970s are selling out just fine now, why should they price it that much lower?
3) When they have the lead, they never underprice their cards. Why would they start now?
I think someone misheard the $350 for this card, and it was actually meant for GK108 or 106. Most likely they will price it at the same price that GTX 580 debuted at.
But what I really want to know is the OC ability of these cards. 7970 & 7950 are amazing OCers.
1) And? It's practically an EoL card now. nVidia will have stopped/be stopping production of them if Kepler is this close to launching, or will be lowering the RRP dramatically if they plan to use it as a lower-end card like the 9800GTX+ was to the GT200 series. It has happened before; remember the 8800GT that beat an 8800GTS for far less money?
2) They'd sell a lot more, think about it. If I could get near-HD7970 performance in most games, and better performance in some, for half the price with no major negatives, why would I even consider AMD?
3) It might not be a definitive lead, and this is GK104, remember; GK100 is yet to be seen, and that's the real GTX 680.
AMD launched the HD4870 for way less than the GTX 260 it matched, because they were late to the market and needed/wanted to gain market- and mindshare as much as possible. Considering nVidia has gone from 70% discrete marketshare to 50%, they'll be looking to fix that. I doubt $350 too; maybe for a cut-down GTX 670, but for a GTX 680? $400-$450, I think. They'll want to undercut AMD if they can still make a good profit on the cards.
Originally Posted by Dustin1
$1,100 on two 7970's in CFX w/ crappy drivers..
Oh look, it's AMD's apparently crappy drivers again.
Originally Posted by Banzai?
Doesn't the 5/6/7000 series from AMD have the advantage of DX10.1, or was that actually useless unless games 'featured' DX10.1?
DX10.1 was DX10 with the features nVidia asked Microsoft to cut so they could launch G80 sooner, so it includes additional features that developers have to explicitly make use of. Just like the DX10.1 gains only showed up in DX10.1 games, the DX11 performance increase from nVidia would only apply in DX11 games. The difference is that because all major gaming cards support DX11, it will be much more heavily utilized than DX10.1 ever was.
Originally Posted by ZealotKi11er
Fermi was faster then, and stayed faster until the 79xx. This was due to better DX11 performance. I don't see how this will be faster in DX11 when the 79xx is already overkill in terms of tessellation, which is really the only big thing with DX11.
There's more than just the tessellator for tessellation performance.
Originally Posted by Redwoodz
One thing you can be sure of: nVidia will price it where it performs. There will be no "magic" $350 GPU.
The 8800GT never existed, right? It cost $300. Overclocked, it beat nearly everything out there apart from nVidia's absolute top-of-the-line 8800Ultra. Even at stock, it was within spitting distance of an 8800GTX.
Originally Posted by ESP
That sounds quite a bit like something an AMD fan would say, but definitely not something an Nvidia fan would say. Now we know who you normally buy lol
I've had mostly nVidia GPUs by far, yet I'll happily go AMD. If you really want proof of my impartiality towards brands, look at how much I defend AMD's drivers.
Originally Posted by lordikon
1.) Your entire post is "rediculousness". It's spelled "ridiculousness", the word "cant" has an apostrophe in it because it's a contraction, and your use of the word "its" would have an apostrophe in it; "it's" is a contraction for "it is". At least you ended the sentence with a period. My point here is that it's hard for anyone to take you seriously when your grammar sucks.
2.) It has been less than 2 months, not 4.
Not to mention, nVidia still got quite a few sales with Fermi and that was 6 months late.
Originally Posted by jtom320
Hate to say it, but if this is true it's probably an overall loss for Nvidia. The vast, vast, vast majority of games are still DX9 and will continue to be DX9 for the foreseeable future. In fact the only notable DX11 games are Metro, Battlefield, and Crysis 2.
Notice that BF3 sold massive amounts and it's DX10/11 only? Other companies are definitely looking at that. Plus, even if it only matches a HD7970 or is a little slower? Well, I'd rather have a little future-proofing over slightly faster DX9 performance, especially considering my OCed 470 is nearly enough to max out every DX9 game, let alone an OCed 7970.
Originally Posted by jtom320
Yeah, I don't agree. DX9 and DX11 Arkham City look very similar. Brighter lights and some bumpy objects are about the only differences, as you said. If the game were truly built from the ground up using DX11 it would be stunning, no doubt, but we won't be seeing games like that until the 720 and PS4 arrive and it's financially responsible to do it.
You do realise that DX10 and DX11 are mainly about increasing efficiency? You can get higher FPS with the same-sized textures, polygons, etc. The idea is that developers can then use higher-poly meshes and higher-resolution textures in the same performance segment as before.
Originally Posted by Penryn
Yea so this releases and next month we get Tenerife XT so we can watch NVidia dig themselves out of a hole again. It has kinda been over for NVidia since AMD pulled out the price/performance strategy back in the HD 4870/ GTX 280 days. And considering this thing is *10%* faster than a 7970, a mild overclock should overcome that, granted we don't know how Kepler OCs but knowing NVidia I doubt it will get an easy 200-300mhz bump on air like the 7970s unless they're being REALLY conservative with that *rumored*705/1410 mhz clock setup.
Considering a 470 can easily get a 200MHz clock bump... (607MHz stock, 850MHz OCed for my 470, and that's fairly conservative too) I'd think it's fairly possible.
Remember, nVidia has hotclocked shaders, unlike AMD; a 150MHz core speed increase is a 300MHz shader speed increase.
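To spell out that hotclock arithmetic (the 2:1 shader-to-core ratio is how Fermi works; the specific overclock numbers here are just illustrative, not a claim about any particular card):

```python
# Fermi-style hotclocked shaders run at 2x the core clock,
# so any core overclock is doubled in the shader domain.
def shader_clock(core_mhz):
    """Shader-domain clock for a 2:1 hotclock design like Fermi."""
    return core_mhz * 2

stock_core = 607   # GTX 470 reference core clock (MHz)
oc_core = 757      # hypothetical +150 MHz core overclock

core_gain = oc_core - stock_core                                 # 150 MHz on the core
shader_gain = shader_clock(oc_core) - shader_clock(stock_core)   # 300 MHz on the shaders

print(core_gain, shader_gain)  # 150 300
```

So a modest-looking core bump is effectively twice as large where the shader work actually happens, which AMD's single-clock design doesn't get.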
Originally Posted by Zero4549
Am I the only person who couldn't care less whether a card is better or worse at DX11, or doesn't even run it?
Give me a card that runs DX6 through 9 efficiently and supports 10 (even if it's not particularly great at it) and I'd be happy. There are all of, what, 3 games that actually use DX11 for anything important? And only 2 that I give a rat's arse about.
As sad as it is, that isn't going to change until new consoles are launched. I've given up on wanting what is technologically superior. Now I just want what works in the real world.
Your GTX 295 is about 3 years old now; in 3 years' time DX11 will be much more prevalent. If you got a card like that, you'd be upgrading much sooner than with a card that supports DX11 well, even if the DX11 support goes unused for now.