
[TPU] NVIDIA GeForce GTX TITAN X To Feature Tweakable Idle Fan-off Mode

#1 ·
Quote:
Taking advantage of the low TDP of the GeForce GTX 970 and GTX 980, several NVIDIA add-in card (AIC) partners, such as ASUS, MSI, and Palit, have innovated their VGA cooling solutions to feature idle fan-off. Such a feature lets the card turn its fans completely off when the GPU is idling or below a temperature threshold, making the card completely silent when not gaming. NVIDIA plans to standardize this with its next-generation GeForce GTX TITAN X graphics card.

Given that its TITAN family of super high-end graphics cards never gets custom designs from AICs, NVIDIA has decided to standardize an idle fan-off feature of its own. Unlike AICs, which have used specialized fan-controller chips that take auxiliary temperature input to decide when to turn the fan off, NVIDIA's approach will be driver-based. Future drivers accompanying the GTX TITAN X will offer a new feature which, when enabled, lets you choose between a non-linear fan curve that keeps the fan off and one that runs it at low speeds.
Source
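For illustration, here is a minimal sketch of how the two driver-side fan curves described above might differ. This is hypothetical logic, not NVIDIA's actual driver code; the thresholds and ramp values are made-up assumptions. The separate spin-up and spin-down temperatures (hysteresis) keep the fan from rapidly toggling when the GPU hovers near the cutoff.
Code:
# Hypothetical fan-curve sketch -- not NVIDIA's driver logic.
# Temperatures in degrees C; fan speed as percent of max RPM.
# All thresholds below are made-up assumptions for illustration.

FAN_ON_TEMP = 60    # assumed: a stopped fan spins up above this
FAN_OFF_TEMP = 55   # assumed: a running fan stops below this

def ramp(temp_c):
    """Shared linear ramp: 30% at 60 C up to 100% at 85 C, 20% floor."""
    return max(20, min(100, 30 + (temp_c - 60) * 70 / 25))

def fan_off_curve(temp_c, fan_running):
    """Mode 1: non-linear curve that keeps the fan off at idle.

    Separate on/off thresholds (hysteresis) prevent the fan from
    toggling rapidly when the temperature sits near the cutoff.
    """
    threshold = FAN_OFF_TEMP if fan_running else FAN_ON_TEMP
    if temp_c < threshold:
        return 0            # idle and cool enough: fan fully off
    return ramp(temp_c)

def low_speed_curve(temp_c):
    """Mode 2: conventional curve that never stops the fan."""
    return ramp(temp_c)     # the 20% floor keeps it always spinning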
 
#31 ·
Quote:
Originally Posted by Gilles3000

Well yes, obviously. But that's exactly the problem: it's pointless to run fanless at idle if you can't sustain low-ish temperatures.
Exaggeration does not translate well on the internet. The issue with MSI's idle fan-off feature is that once the card reaches a certain threshold, one fan will stay at 100% regardless of the temperature.
https://forum-en.msi.com/index.php?topic=183618.0
 
#33 ·
''Innovation''. I would rather have it run at a very low speed just to keep all components as cool as possible. Since the fan consumes power when running, I guess it's a nice way to show extremely low idle power consumption in review tests.
 
#34 ·
If this Titan thing is going to continue to be a thing, then they need to come up with a different naming convention. The OG Titan made sense. It was a card completely in its own class, and it was named after the Titan supercomputer, which was powered by some 18,000 GK110s.

But now that there have been several Titan-branded cards, with more to come, it has just gotten stupid. Titan Black, Titan Z, Titan X? What's next? Titan XL? Come up with another branding scheme separate from the numbered GTX line. Make it follow some sort of logical progression. Just continuing to tack random marketing buzzwords and letters onto the Titan name seems silly to me.

As far as the fan-off feature, I suppose there's no harm in it, although I doubt I would run it that way.
 
#35 ·
Anyone now wondering if GM200 gets released on 28nm as well? ...eventually. Or was that always the prevailing inclination? /checks crystal ball.

Things don't look so good for 20nm or 16nm in 2015. We already know Maxwell does great on 28nm, and if GM200 provides a suitable upgrade percentage, like we've been hearing, would we really care whether, at the atomic level, the lithography features are 16nm or 28nm apart?
 
#36 ·
Quote:
Originally Posted by Ghoxt

Anyone now wondering if GM200 gets released on 28nm as well? ...eventually. Or was that always the prevailing inclination? /checks crystal ball.

Things don't look so good for 20nm or 16nm in 2015. We already know Maxwell does great on 28nm, and if GM200 provides a suitable upgrade percentage, like we've been hearing, would we really care whether, at the atomic level, the lithography features are 16nm or 28nm apart?
I guess they could release the Titan on 20nm or 16nm, seeing as they know they won't make that many and will get a lot of $$ for them, then wait till Q4 to release the gaming cards, etc. That would make sense... I hope
 
#37 ·
Quote:
Originally Posted by Ghoxt

Anyone now wondering if GM200 gets released on 28nm as well? ...eventually. Or was that always the prevailing inclination? /checks crystal ball.

Things don't look so good for 20nm or 16nm in 2015. We already know Maxwell does great on 28nm, and if GM200 provides a suitable upgrade percentage, like we've been hearing, would we really care whether, at the atomic level, the lithography features are 16nm or 28nm apart?
I expect 28 nm, but I have zero information other than the rumors posted here, so that means nothing.

I think GM200 will be a nice upgrade over GK110, but it won't bring the generation-to-generation improvements we've seen in the past, simply because it won't have a die shrink over the previous generation. It's been pretty standard to see each generation double its predecessor (GT200 -> GF110 -> GK110), but all of those generational leaps had the advantage of a die shrink.

So on one hand, it's disappointing to have no die shrink, because GM200 likely won't be to GK110 what GK110 was to GF110, or what GF110 was to GT200. On the other hand, it's impressive because it should still be 60-70% faster than GK110 while being on the same process. All in all, it's pretty crazy to think what Maxwell would have been with a die shrink.
 
#38 ·
Please be good, please be good....
 
#39 ·
I'd be shocked that people seem to be under the magical delusion that idle power consumption and heat output change based on whether the fans are running or not, and that running them is somehow "good" for the card, to the point where they'd rather have extra noise for quite literally zero, nil, nada, zilch, no benefit whatsoever... but this is the internet.
 
#41 ·
Quote:
Originally Posted by i7monkey

Nobody wants you, Titan. You cost too much and perform the same as your gamer version.
I find the VRAM helps a lot. I had the 780 Ti and had stuttering issues because my VRAM usage at 1440p/4K was right at or just above 3 GB. I bought the Titan Black; the stutters are gone and I get a better, stable OC. It definitely wasn't worth the $1000, but that extra VRAM has helped immensely, especially with modded Skyrim taking up 4.1 GB of VRAM after compressing the game files (it was 5.5 GB before that). Even now the 980 would stutter under my Skyrim's VRAM load even though it's more powerful, so from my perspective the 6 GB was worth it, as there isn't anything on the market from the green team with 6 GB of VRAM (I won't go red because I don't like unusually high temps; I'd rather pay the premium and get stability).