Quote:
Originally Posted by criminal View Post

Yeah, I agree. I so hope Vega is awesome. I am getting so bored of this 980. I think we are both looking for the same performance upgrade.
Oh man, I can't wait.


There had better be HDR monitors out around that time frame too...
 
Quote:
Originally Posted by headd View Post

A stock 980 Ti is 1100 MHz; OC it to 1500 MHz and that's a 36% performance increase.
The 1070 is 10% faster at stock, and after an OC to 2 GHz there's another 5-10% increase.

So a 980 Ti after OC will be 16-21% faster than a 1070 after OC.
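For what it's worth, here's a quick sanity check of that arithmetic (a rough sketch assuming performance scales linearly with core clock, which it only approximately does):

```python
# Quick check of the quoted OC math, assuming performance scales linearly
# with core clock (an approximation -- memory clocks and limits also matter).

ti_stock, ti_oc = 1100, 1500                  # 980 Ti core clocks (MHz)
oc_ti = ti_oc / ti_stock                      # ~1.36, i.e. the quoted ~36%

gtx1070_vs_ti = 1.10                          # 1070 quoted as 10% faster at stock
oc_1070_lo = gtx1070_vs_ti * 1.05             # +5% from an OC to ~2 GHz
oc_1070_hi = gtx1070_vs_ti * 1.10             # +10% from the same OC

print(f"OC'd 980 Ti lead: {oc_ti / oc_1070_hi - 1:.0%} to {oc_ti / oc_1070_lo - 1:.0%}")
# -> roughly 13% to 18%. Multiplying the ratios, rather than subtracting
#    percentages, gives a slightly smaller gap than the quoted 16-21%.
```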
And the wattage goes up! I don't really OC my video cards; when they cost twice as much as a CPU or more, I like a card to last as long as it can, without affecting the warranty.
 
Quote:
Originally Posted by ladcrooks View Post

And the wattage goes up! I don't really OC my video cards; when they cost twice as much as a CPU or more, I like a card to last as long as it can, without affecting the warranty.
Wait... you mean that having the card run cooler and last longer means more to you than going from 93 to 96 fps when you play BattleWitcher 9: Quest for The Holy ePeen?
 
What good is being able to overclock and overheat things if you don't take advantage of it?
 
Quote:
Originally Posted by ladcrooks View Post

-Snip-
On a lot of GPUs you can run faster clocks while still keeping them cool and without adding voltage, so there's no downside whatsoever.
 
"Furthermore, Nvidia told us that all the 1070 GTX would be limited to 3 GPC. As we explained before, several configurations were possible. This implies that the 1070 GTX has a maximum rate of 3 triangles made by cycle (against 4 of the 1080 GTX), and his fillrate is limited to 48 pixels per cycle. Although the memory interface is complete with 64 ROP, each GPC can charge only 16 pixels per cycle. Here are the final table of specifications:"

https://translate.googleusercontent.com/translate_c?depth=1&hl=it&ie=UTF8&prev=_t&rurl=translate.google.it&sl=fr&tl=en&u=http://www.hardware.fr/news/14641/gtx-1070-3-gpc-juste-devant-gtx-980-ti.html&usg=ALkJrhhNka0saLFCJ8rmSiOppg14yYp7eg
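Plugging in the GTX 1070's reference boost clock gives a feel for those per-cycle limits (a rough sketch; the 1683 MHz figure is the reference boost clock, and real cards vary):

```python
# Theoretical pixel fillrate for the GTX 1070 from the figures quoted above.
# 1683 MHz is the reference boost clock (an assumption -- actual clocks vary).

boost_ghz = 1.683

gpcs, px_per_gpc = 3, 16          # 3 GPCs, each feeding 16 pixels per cycle
rops = 64                         # full memory interface with 64 ROPs

raster_gpix = gpcs * px_per_gpc * boost_ghz   # what the GPCs can feed
rop_gpix = rops * boost_ghz                   # what the ROPs could write

print(f"raster-limited: {raster_gpix:.1f} Gpix/s, ROP-limited: {rop_gpix:.1f} Gpix/s")
# ~80.8 vs ~107.7 Gpix/s: the 64 ROPs can never be saturated, because the
# three GPCs hand over at most 48 pixels per cycle.
```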
 
Quote:
Originally Posted by NABBO View Post

I am interested in a GTX 1070 8GB (for €400 max),
but if the VRAM turns out to be 7GB + 1GB,
I won't buy it.
If AMD doesn't offer anything in the price range, it's not the 7GB you should be worried about -- that's plenty on top of a 980 Ti. If you're worried about a repeat of the 970 fiasco, it's the lower full-duplex bandwidth you should be worried about, even for that 7GB. With the 970, most people misunderstood it as only a capacity segmentation; the true problem was the segmented bandwidth performance.
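To illustrate what segmented bandwidth means in practice, here's a toy model using the segment figures widely reported for the 970 at the time (3.5 GB at ~196 GB/s, 0.5 GB at ~28 GB/s, with no simultaneous access between the two):

```python
# Toy model of the GTX 970's segmented memory. Figures widely reported at
# launch: 3.5 GB at ~196 GB/s, 0.5 GB at ~28 GB/s, and the two segments
# cannot be accessed at once, so slow-segment time stalls the fast segment.

FAST_BW, SLOW_BW = 196.0, 28.0    # GB/s

def effective_bw(frac_slow: float) -> float:
    """Weighted-harmonic-mean bandwidth when frac_slow of the traffic
    has to go through the slow segment."""
    return 1.0 / ((1.0 - frac_slow) / FAST_BW + frac_slow / SLOW_BW)

for frac in (0.0, 0.05, 0.125):
    print(f"{frac:6.1%} slow traffic -> {effective_bw(frac):5.1f} GB/s effective")
# 0.0% -> 196.0, 5.0% -> ~150.8, 12.5% -> ~112.0 GB/s: even a small share
# of traffic landing in the last 0.5 GB cuts effective bandwidth sharply.
```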
 
My plan is...

...a single GTX 1070 8GB for September...
...a second GTX 1070 8GB (€300-350) in Q4 2016 / Q1 2017 for SLI.

...and in 2018 an upgrade to Nvidia's new Volta architecture (GV104).

I don't want a video card with less than 8GB of VRAM.
 
Quote:
Originally Posted by looniam View Post

good question and answering that would be guessing.

iirc the mention of the kepler bug was just the day before the release of that driver. (this was in the 352.86 thread if you have the time to look.) whether they didn't have time to document it better or intentionally left it vague to seem... well... better; either way, i assure you the vast majority of people didn't take it as proof of kepler gimping but as evidence that they hadn't stopped optimizing for kepler.

so is the glass half empty or half full?


btw, sorry if i came off a little adversarial.
Sorry for the late reply, it's hard to keep up with these threads (and the GTX 1080 thread is at almost 5k posts!).

No problem.

The part about the glass being half empty or half full explains it well, and it makes sense when you compare with AMD cards. Take the R9 290X / 290: their drivers came from behind but have kept being optimized to this day. Nvidia not optimizing for Kepler at some point, and thus not extracting the potential that is there in the hardware for new games, can be construed as gimping by way of only optimizing for the new cards. Gimping isn't only deliberately making older or newer games perform worse; it's also leaving cards behind despite the potential being there. Under a high-level API the hardware only performs as well as the driver is optimized to make it; you can have all the hardware in the world and not be using it to its full potential. We could then ask whether Kepler would, in certain scenarios, have any optimization room left versus a comparable AMD card from the same era; that's also a factor, but of course Nvidia won't go there.

A lot of these discussions revolve around all of the above, both for Nvidia and AMD.
 
Quote:
Originally Posted by tpi2007 View Post

-Snip-
since you're late replying, it looks like i need to refresh your memory of the discussion; which was your claim that nvidia admitted to gimping kepler via driver release notes, not about comparing them to AMD drivers - which is another can of worms entirely and imo as irrelevant as comparing clock speeds between them.

i recalled nvidia found a bug affecting TW3 and posted it in the driver feedback thread on their forum. though i looked around and didn't find that specific post, i did run across their statement here:
Quote:
Kepler Performance Optimizations

Following end user reports of lower-than-expected performance in The Witcher 3: Wild Hunt when using GeForce GTX 600 and 700 Series Kepler GPUs, we have identified and fixed three bugs that were limiting performance not only in The Witcher 3, but also Far Cry 4 and Project Cars. With the new GeForce Game Ready drivers installed, frame rates are increased in each title, improving and optimizing your experience.

We would like to thank the community for their feedback on this issue. These reports and benchmarks helped greatly in rapidly identifying and resolving the issue. If you have any further feedback following the installation of the new driver, please post it in this thread.
so though it didn't make it to the driver download page or into the release notes PDF, it DID make it (along with two other bugs i failed to mention - my bad) onto a more in-depth driver explanation page.

so there is no smoking gun. and to be clear: my comment about the glass being half empty pertained to the bugs being missed in the first place, nothing about leaving performance on the table - of which there is absolutely NO PROOF!

have a good day.
 
Quote:
Originally Posted by headd View Post

Witcher 3 Blood and Wine test. The 1070 looks like crap compared to the 1080. The 1080 has 17% higher minimums than the 1070's average FPS; it's just crazy how bad the 1070 is. I think the 1080 is the better card and worth the extra money.
http://www.purepc.pl/karty_graficzne/wiedzmin_3_krew_i_wino_test_wydajnosci_kart_graficznych?page=0,17
-Snip-
Quote:
Originally Posted by headd View Post

Still, the GTX 1080 is 30% faster and has a 63 FPS minimum vs a 53 FPS average on the 1070.
The 1070 should be the same tier as the 1080, but it's not. The gap is just too big.

Btw, the 980 Ti runs there at 1165 MHz. OC it to 1500/8000 and it should be pretty close to the 1080.

Do you remember the GTX 670? It was 20% faster than the GTX 580 and cost less than the 1070.
As zealord said, the GTX 1070 has a better price/performance ratio, and it's better than the specs would suggest.

This is the most cut-down x70 GPU ever, which is rather unfortunate and disappointing, but it's still able to deliver GTX 980 Ti/Titan X performance for a lot less money.

Pascal's overclocking headroom is trash compared to Maxwell at the moment, which is why the GTX 1070 pales in comparison.

People are trying to lift the limitations in the BIOS, so it will be interesting to see if any improvements can be made.

Pascal is a real disappointment in terms of overclocking; it's bizarre that most cards can't exceed 2 GHz, even the custom ones. Hopefully it's just an artificial limitation that can be removed...
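To put rough numbers on the price/performance point (a sketch; the prices are US launch MSRPs, and the performance index is a loose composite of the figures quoted in this thread, not a benchmark):

```python
# Rough price/performance sketch. Prices are US launch MSRPs; the perf
# index (980 Ti = 100) is a loose composite of numbers quoted in this
# thread, not a benchmark result.

cards = {
    "GTX 980 Ti": (649, 100),
    "GTX 1070":   (379, 105),
    "GTX 1080":   (599, 130),
}

for name, (price, perf) in cards.items():
    print(f"{name:<10} {perf / price * 100:5.1f} perf per $100")
# The 1070 lands around 27.7 perf/$100 vs ~21.7 for the 1080 and ~15.4 for
# the 980 Ti -- a clear price/performance win even if the 1080 is faster.
```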
 
Quote:
Originally Posted by ZealotKi11er View Post

For sure. The HD 7970 is still good 4.5 years later. I can play almost any game at 1440p with High settings at close to 60 fps. 1080p is no problem.
Strange, I find the 280X problematic at 1200p...
I guess everyone has different fps needs and different games/applications to feed the GPU. 90 fps is my sweet spot; with a 120 Hz monitor it would be 120+ for sure, but 75-90 works OK-ish on a 60 Hz panel for a decent refresh, less tearing, and faster input handling.
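To illustrate the 60 Hz point with simple frame-time arithmetic (a toy calculation; real perceived latency depends on the whole input/render pipeline):

```python
# Frame-time arithmetic for a 60 Hz panel. New frames arriving faster than
# the scanout lower the average age of whatever frame the monitor shows,
# which is why 75-90 fps can still feel better even at 60 Hz.

REFRESH_HZ = 60
scanout_ms = 1000 / REFRESH_HZ        # ~16.7 ms between refreshes

for fps in (60, 75, 90, 120):
    frame_ms = 1000 / fps
    print(f"{fps:3d} fps -> {frame_ms:5.1f} ms/frame, "
          f"avg frame age at scanout ~{frame_ms / 2:4.1f} ms "
          f"(scanout every {scanout_ms:.1f} ms)")
```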
Quote:
Originally Posted by iLeakStuff View Post

EVGA Flounder Edition is up

http://www.evga.com/Products/Product.aspx?pn=08G-P4-6170-KR
Flounder?
 
Quote:
Originally Posted by NABBO View Post

-Snip-
Or just get the 1080 Ti, which should be available sometime around Q1 2017, maybe Q2, and replace the 1070. Chances are the 1080 Ti will be close to 2x 1070s, like the 980 Ti is to 2x 970s. Spare yourself the issues of SLI.
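A toy model of that trade-off (the 1.9x figure for the 1080 Ti and the SLI scaling factors are assumptions, not benchmarks):

```python
# Toy comparison: 2x GTX 1070 in SLI vs one hypothetical 1080 Ti, with a
# single 1070 normalized to 1.0. The 1.9x and the scaling factors are
# assumptions for illustration, not measured numbers.

TI_VS_1070 = 1.9                      # assumed: 1080 Ti close to 2x a 1070

for scaling in (0.0, 0.5, 0.8, 0.9):
    sli = 1.0 + scaling               # second card adds `scaling` of a card
    winner = "SLI" if sli > TI_VS_1070 else "1080 Ti"
    print(f"SLI scaling {scaling:4.0%}: 2x1070 = {sli:.2f} "
          f"vs Ti = {TI_VS_1070:.2f} -> {winner}")
# Unless a game scales above ~90%, the single big card wins, with none of
# the SLI headaches mentioned above.
```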
 
Quote:
Originally Posted by JackCY View Post

Strange, I find the 280X problematic at 1200p...
I guess everyone has different fps needs and different games/applications to feed the GPU. 90 fps is my sweet spot; with a 120 Hz monitor it would be 120+ for sure, but 75-90 works OK-ish on a 60 Hz panel for a decent refresh, less tearing, and faster input handling.
Flounder?
I would agree. I am running at 1080p and my 7970 gets smoked by lots of games I have tried to play... They are all playable turned down quite a bit, but what's the fun in that?! Witcher 3, Dragon Age: Inquisition, Fallout 4, Just Cause 3, and the new Metal Gear all had to be turned down quite a bit to keep the frame rate out of the 30s. I had two 7970s, but guess how many of the games I just listed had CrossFire support around launch or didn't have micro-stutter issues? Hahaha, ZERO. Nearly all of them had to be played with CrossFire off. My current monitor does 75 Hz at 1080p. I want to keep my average frame rate above 75, and the 7970 x2 just hasn't been cutting it for a while.
 
Quote:
Originally Posted by JackCY View Post

-Snip-
Flounder?
Flounder (so it rhymes with Founders), but maybe a Whale Edition would be clearer.
 