[AdoredTV] Pascal vs Maxwell at same clockspeeds, same FLOPS (1080 vs 980 Ti) - Page 18 - Overclock.net

post #171 of 305 - 07-28-2016, 12:49 PM - EniGma1987 (Overclocker)
Quote:
Originally Posted by DarkIdeals View Post

Add in the fact that AMD will have to perfect a massive 650mm2 die to cram 6,144 shaders into it;

...you must mean 550(ish)mm2. Something you keep doing in these posts is ignoring how a process-node shrink reduces die size. Going back to some of your previous rants: a GTX 980 works out to roughly a 250mm2 die when you rough-estimate it at the new node's density. So all your "YOU MUST COMPARE THIS TO THIS" insistence that the 980 and 1080 are the same die size is wrong. Properly normalized, the 980 Ti is actually only slightly larger.

post #172 of 305 - 07-28-2016, 12:54 PM - Blameless
The biggest thing that bothers me about this thread is also the biggest thing that bothered me about the video...

The apparent presumption of many that shader performance is the be all and end all of graphics performance.

...rightful liberty is unobstructed action according to our will within limits drawn around us by the equal rights of others. I do not add 'within the limits of the law,' because law is often but the tyrant's will, and always so when it violates the right of an individual. -- Thomas Jefferson
post #173 of 305 - 07-28-2016, 12:54 PM - Silent Scone
Quote:
Originally Posted by Robenger View Post

That was a long and drawn out way to say no. If you watch the whole video he adjusts for CUDA cores and then adjusts for clock speeds. The clock vs clock is only the first part of the video.

If you don't understand how a CPU and a GPU differ, then I can see how you'd think it's beneficial.
post #174 of 305 - 07-28-2016, 01:02 PM - Exeed Orbit
Quote:
Originally Posted by ChevChelios View Post


I think you have a selective reading issue. He said 1080p. This is QHD. Would you like to try again?

You quoted him, and decided to leave out the part where he says 1080P, just so you could shove your nonsense into the conversation again.
post #175 of 305 - 07-28-2016, 01:03 PM - Blameless
Quote:
Originally Posted by Silent Scone View Post

If you don't understand how a CPU and a GPU differ, then I can see how you'd think it's beneficial.

So, what intrinsic differences between CPU and GPU do you think make architectural or IPC comparisons less relevant on the latter than on the former?

post #176 of 305 - 07-28-2016, 01:04 PM - mtcn77 (Amiga 500)
Quote:
Originally Posted by Dargonplay View Post

1-) You didn't sue Nvidia, and you're probably 15 years old.
2-) Voltage doesn't follow a curve, but that's irrelevant: if more wattage is needed, the card will downclock itself, which is why he did it. He didn't change the voltage, he changed the power target; this is also why stock FE 1080s downclock to 1.35GHz in FurMark at stock settings.
3-) You can't downclock a 1080 that far without lowering the power target, because of GPU Boost 3.0. Saying otherwise not only proves you're ignorant on the matter, it proves you're ready to talk about things you know nothing about.
4-) Turning OFF GPU Boost 3.0? I guess that turns you into GPU fool 3.0, because GPU Boost is implemented at the hardware level and can't be turned off under any circumstances.

You assumed he was biased based on your own ignorance, enough said.
Don't presume he isn't what he says he is without any evidence. Surely you can disable GPU Boost, but this fishing expedition couldn't end soon enough.

post #177 of 305 - 07-28-2016, 01:07 PM - Rei86
Anyone got some subtitles?

post #178 of 305 - 07-28-2016, 01:10 PM - EniGma1987 (Overclocker)
Quote:
Originally Posted by Rei86 View Post

Anyone got some subtitles?

GREEN TEAM!

RED TEAM!

NOT FAIR!!

YES IT IS!!

BUT NEW GREEN IS BETTER THAN OLD GREEN!



Pretty much the whole thread.

post #179 of 305 - 07-28-2016, 01:13 PM - tkenietz
Quote:
Originally Posted by ChevChelios View Post

[chart: perfrel_1920_1080.png - relative performance at 1920x1080]

Your link cites TPU with a 7% difference. TPU themselves say 2%.
post #180 of 305 - 07-28-2016, 01:14 PM - Dargonplay
Quote:
Originally Posted by mtcn77 View Post

Don't presume he isn't what he says to be without any evidence. Surely you can disable Gpuboost, but this fishing expedition couldn't end soon enough.

You can't disable GPU Boost 3.0. I have a 1070 and I've spent weeks researching ways to disable it; the only way I've found is to strip out the sensors themselves, but if you do that wrong you risk damaging the entire card.

If you or anyone else happens to know a way to disable GPU Boost 3.0, be my guest, prove me wrong. Can't do that? Yeah, just like I can't disable GPU Boost 3.0. It will always be working to some degree, and the only way to control it is by adjusting its performance caps: power, voltage, and temperature.

You just can't force Pascal to run at a fixed 2100MHz, or even 1900MHz, without variation, unless you've set the power target high enough for an 1800MHz overclock while temperatures aren't an issue. But that's not disabling GPU Boost 3.0, that's controlling it, just like AdoredTV did.
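That "controlling, not disabling" behavior can be sketched as a min-of-limiters model: the sustained clock is whatever the tightest active cap allows. This is a conceptual illustration only; the cap names and numbers below are invented, and the real GPU Boost 3.0 algorithm is proprietary firmware.

```python
# Conceptual model of boost behavior: the sustained clock is clipped by
# whichever limiter (power, thermal, voltage) binds first. Cap names and
# values are invented for illustration; the real algorithm is proprietary.

def effective_clock_mhz(requested_mhz: int, caps_mhz: dict) -> int:
    """Clip the requested clock to the most restrictive active limiter."""
    return min(requested_mhz, *caps_mhz.values())

caps = {
    "power_target": 1848,        # lowering the power target lowers this cap
    "thermal": 2050,
    "voltage_reliability": 2000,
}
print(effective_clock_mhz(2100, caps))  # 1848: the power target binds first
```

Raising the power target relaxes that one cap until another limiter (thermal or voltage) binds instead, which is why a fixed clock can only be approximated, never pinned.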