Overclock.net › Forums › Graphics Cards › NVIDIA › Nvidia gpu boost 3.0 and throttling

Nvidia gpu boost 3.0 and throttling - Page 4  

post #31 of 42
Quote:
Originally Posted by Rei86 View Post

Why do people on any computer hardware forum think you can ever buy hardware that's future proof? It's like asking time to stand still wink.gif
BANG

https://www.youtube.com/watch?v=xhuC8Tf9i3I

This is not future proof stuff?
post #32 of 42
Thread Starter 
Quote:
Originally Posted by slavovid View Post

It does not drop below the base clock, but it's advertised with its boost clock, so it runs below what it is advertised at.

After 20 minutes of play, in most of the games on the list the card settles to something around 1,680 MHz on average. That might be a little over the base clock, but it is far from the advertised boost clock.

And it doesn't matter if for the first 10-15 minutes it can run at 100 MHz more when it drops down afterwards... For a gamer those 15 minutes are often wasted in preparation before the game itself biggrin.gif

I don't like cheap tactics, and the boost clock is one. It is not a goddamn car that drives fast at the start and thereby adds to the average speed of the whole trip.

As I have shown (multiple times now), the boost clock speed is advertised as the maximum frequency. It allows the card to go faster than the advertised base frequency with certain workloads. 1,680 MHz is more than NVIDIA promised you, which is 1,607 MHz. Anything more than that is a bonus; not one MHz over 1,607 MHz is promised to you.

If they didn't advertise a limit and simply sold it as 1,607 MHz which dynamically auto-overclocks itself as far as possible while staying within the predefined thermal and power envelopes, in the background without any input from you, would you have a problem if it "only" did 1,680 MHz? You wouldn't, as that is 73 MHz more than you were promised. So why the outcry when NVIDIA tells you upfront what the limit for the dynamic auto-overclock is? To me that just shows you have an extremely limited understanding of its purpose and how it works.

Once again I ask: if it were a guaranteed sustainable speed, why would it exist at all? Just increase the base frequency to the boost frequency.
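The base-versus-boost contract being argued here can be sketched in a few lines. This is a hedged toy model, not NVIDIA's actual boost algorithm: the function name, the headroom scaling, and the temperature/power limits are illustrative assumptions; only the 1,607/1,733 MHz figures for the reference GTX 1080 come from the discussion.

```python
BASE_MHZ = 1607    # guaranteed minimum clock under load (reference GTX 1080)
BOOST_MHZ = 1733   # advertised "up to" ceiling

def effective_clock(temp_c, power_w, temp_limit=83, power_limit=180):
    """Toy model: pick a clock between base and boost from remaining headroom."""
    # Fraction of headroom left on the tighter of the two limits.
    headroom = min(max(1 - temp_c / temp_limit, 0.0),
                   max(1 - power_w / power_limit, 0.0))
    # Scale between base and boost: never below base, never above boost.
    return BASE_MHZ + (BOOST_MHZ - BASE_MHZ) * min(headroom * 4, 1.0)
```

In this sketch a cool, lightly loaded card sits at the ceiling, while a card pinned against either limit falls back to exactly the base clock, which is the only figure the model (like the advertisement) guarantees.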
post #33 of 42
Quote:
Originally Posted by Oj010 View Post

As I have shown (multiple times now), the boost clock speed is advertised as the maximum frequency. It allows the card to go faster than the advertised base frequency with certain workloads. 1,680 MHz is more than NVIDIA promised you, which is 1,607 MHz. Anything more than that is a bonus; not one MHz over 1,607 MHz is promised to you.

If they didn't advertise a limit and simply sold it as 1,607 MHz which dynamically auto-overclocks itself as far as possible while staying within the predefined thermal and power envelopes, in the background without any input from you, would you have a problem if it "only" did 1,680 MHz? You wouldn't, as that is 73 MHz more than you were promised. So why the outcry when NVIDIA tells you upfront what the limit for the dynamic auto-overclock is? To me that just shows you have an extremely limited understanding of its purpose and how it works.

Once again I ask: if it were a guaranteed sustainable speed, why would it exist at all? Just increase the base frequency to the boost frequency.

What slavovid is pointing out is that the clocks are "up to 1733 MHz"; this suggests that the GPU dynamically reaches that clock, but it also suggests that it may not sustain that clock.

After all, after 20 minutes the 1733 MHz is reached only in rare cases. If computerbase.de is to be believed, in 16 (sixteen) out of 22 (twenty-two) cases the sustained clock is under 1699 MHz - that's not a small percentage (roughly 73%). So it's more accurate to say that the boost clock is 1699 MHz and "up to 1733 MHz".

In the GTX 1080 white paper it's written as "With 2560 CUDA Cores running at speeds over 1600MHz in the GeForce GTX 1080" - which I believe is what slavovid wants to see stated by Nvidia.

Anyway, I'm not going [OT] anymore.
Edited by C2H5OH - 6/9/16 at 10:29am
post #34 of 42
Thread Starter 
Quote:
Originally Posted by T1A1 View Post

What slavovid is pointing out is that the clocks are "up to 1733 MHz"; this suggests that the GPU dynamically reaches that clock, but it also suggests that it may not sustain that clock.

After all, after 20 minutes the 1733 MHz is reached only in rare cases. If computerbase.de is to be believed, in 16 (sixteen) out of 22 (twenty-two) cases the sustained clock is under 1699 MHz - that's not a small percentage (roughly 73%). So it's more accurate to say that the boost clock is 1699 MHz and "up to 1733 MHz".

In the GTX 1080 white paper it's written as "With 2560 CUDA Cores running at speeds over 1600MHz in the GeForce GTX 1080" - which I believe is what slavovid wants to see stated by Nvidia.

Anyway, I'm not going [OT] anymore.

1. Boost is not up to 1,699 MHz, it is up to 1,733 MHz. If it were 1,699 MHz we would never see 1,700 MHz or above.
2. What you're asking for is EXACTLY what NVIDIA has stated. Visit the GPU Boost page; it is clearly outlined that the boost frequency is "up to."
3. This has been the case since Kepler (2012), so why the outcry now?
post #35 of 42
Thread Starter 
I'm going to remove some of the text to really bring your attention to the important bits:
Quote:
every application and game runs at a guaranteed, minimum Base Clock speed. If there’s extra power available, a Boost Clock is enabled increasing clock speeds until the graphics card hits its predetermined Power Target... This dynamic clock speed adjustment... maximizing performance in each and every application.

Honestly, if people don't have a fundamental understanding of GPU Boost they have no business basing their purchasing decisions on the boost frequency. Would you buy a car that has a boost of 47 per second? You don't need to understand what that means, just accept that it's 47/sec. Buying a GPU on clock speed alone is already a severely flawed decision; the type of person who would do that shouldn't just jump in and choose based on boost frequency.
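The "fast for the first minutes, then settles" pattern described earlier in the thread falls out of any thermal-headroom model of the quoted behavior: the clock starts near the ceiling on a cold card and steps down toward (but never below) the guaranteed base clock as the card heat-soaks. A toy sketch with made-up warm-up constants; only the base/boost clocks are from the thread:

```python
BASE_MHZ, BOOST_MHZ = 1607, 1733
AMBIENT_C, TEMP_LIMIT_C = 30, 83

def session_clocks(minutes=20):
    """Toy heat-soak model: per-minute clock as the card warms toward its limit."""
    clocks, temp = [], AMBIENT_C
    for _ in range(minutes):
        # Simple exponential approach toward the thermal limit.
        temp += (TEMP_LIMIT_C - temp) * 0.3
        # Shrinking thermal headroom steps the boost bin down toward base.
        headroom = (TEMP_LIMIT_C - temp) / (TEMP_LIMIT_C - AMBIENT_C)
        clocks.append(round(BASE_MHZ + (BOOST_MHZ - BASE_MHZ) * headroom))
    return clocks
```

Every value stays between the base and boost clocks, so the card remains inside its advertised window for the whole session even though the first-minute clock is never sustained.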
post #36 of 42
Quote:
Originally Posted by Oj010 View Post

Then I'd say there's an obvious NVIDIA bias, wouldn't you?

That's why all the outcry thumb.gif
post #37 of 42
Thread Starter 
Quote:
Originally Posted by T1A1 View Post

That's why all the outcry thumb.gif

Except the outcry right now is aimed at NVIDIA, not the biased sites in question. NVIDIA has done nothing wrong, they're offering exactly what they advertise. I'll say the same for the R9 290X - even though they didn't advertise its base frequency they DID say it was "up to."
post #38 of 42
Quote:
Originally Posted by Oj010 View Post

Except the outcry right now is aimed at NVIDIA, not the biased sites in question. NVIDIA has done nothing wrong, they're offering exactly what they advertise. I'll say the same for the R9 290X - even though they didn't advertise its base frequency they DID say it was "up to."

[OT]
Similarly, if you sign a contract with your Internet provider for a 100 Mbit connection, based on suggestions from reviews, but you receive the full 100 Mbit only for the first 20 minutes of every day, who are you going to blame first? The reviews (sites, people, etc.) or the company supplying your Internet, for false advertising?
post #39 of 42
Quote:
Originally Posted by Oj010 View Post

Except the outcry right now is aimed at NVIDIA, not the biased sites in question. NVIDIA has done nothing wrong, they're offering exactly what they advertise. I'll say the same for the R9 290X - even though they didn't advertise its base frequency they DID say it was "up to."

Not that I really care, but they did make it seem like the Founders Edition would be the "good" version, when in reality it was the only version and just a plain old reference card with a cooler that is clearly inadequate for the tech. They did so knowing they barely had any stock, and were able to charge $100 more (though they don't force anyone to buy) knowing they would sell out instantly. It's not that hard to get.

Everyone knew the 290s were garbage with the AMD reference cooler as soon as you read a review.
Computer (9 items)
CPU: AMD Ryzen 5 1400 3.8GHz | Motherboard: Gigabyte AB350M-HD3 | Graphics: MSI GeForce GTX 1060 3GB Gaming X | RAM: G.Skill 16GB (2x8GB) 3200MHz DDR4
Hard Drive: SSDs and HDDs | OS: Windows 10 x64 Pro | Monitor: Acer G7 G227HQLbi @75Hz | Power: EVGA 430w
Case: Deepcool Frame
post #40 of 42
Quote:
Originally Posted by prznar1 View Post

BANG

https://www.youtube.com/watch?v=xhuC8Tf9i3I

This is not future proof stuff?

No and yes. We see performance gains with each CPU release (questionable with the 6950X). Each year we've also seen gains on the GPU side.

The main issue at the moment is that the majority of the normal programs we use on a daily basis (OS, office, web, etc.) no longer require high- or mid-end hardware to run effectively. We're no longer on a mess like Windows Vista or RAM-hungry XP.
average (11 items)
CPU: i7-4790K | Motherboard: Asus Z97 TUF Gryphon | Graphics: Sapphire Radeon RX Vega 64 | RAM: G.Skill TridentX
Hard Drive: Samsung 850 EVO | Hard Drive: Samsung 850 EVO | Cooling: NZXT Kraken X61 | OS: Windows 10
Monitor: LG 24GM77 | Power: Corsair AX1200i | Case: Phanteks Enthoo EVOLV ATX
This thread is locked  