[Official] NVIDIA Titan X Pascal Owners Thread - Page 50 - Overclock.net

post #491 of 7609 Old 08-03-2016, 10:06 PM
PC Gamer
 
 
Join Date: Mar 2012
Location: Canada
Posts: 1,169
Liked: 96
So I was investigating some FurMark fun. Here's what's happening:

- You can hit 120% TDP in FurMark even at only 50-60% GPU load
- As the card heats up, it needs more power for the same performance level
- So the card throttles to stay below TDP
- This cycle continues for quite a while: the card keeps throttling down to stay within TDP, while the gradual heat build-up demands more and more power, which in turn leads to even more throttling.

I was running a game and at 70% usage everything was fine, clocked between 2050-2080MHz. The instant I changed a setting that pushed the GPU to 100% usage, it dropped to 1900-1980MHz.

Even on air, we're going to need a modified BIOS to take full advantage of these cards.
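The feedback loop described above can be sketched as a toy simulation. Every constant below (leakage slope, cooling rate, boost bin size) is made up for illustration; real Boost 3.0 behavior depends on the BIOS power and thermal tables:

```python
# Toy model of the TDP throttle spiral described above.
# All constants are illustrative, not measured values.

def simulate(steps=50):
    tdp_w = 250.0          # rated board power limit
    clock = 2050.0         # starting boost clock, MHz
    temp = 50.0            # starting GPU temperature, C
    ambient = 30.0
    history = []
    for _ in range(steps):
        # Power scales roughly with clock; leakage grows with temperature,
        # so the same clock costs more power as the card heats up.
        leakage = 1.0 + 0.004 * (temp - 50.0)
        power = (clock / 2050.0) * 240.0 * leakage
        # Temperature rises toward a level set by dissipated power,
        # and relaxes toward ambient.
        temp += 0.05 * (ambient + power * 0.25 - temp)
        # Boost drops one bin at a time while over the power limit.
        if power > tdp_w:
            clock -= 13.0   # one ~13 MHz boost bin
        history.append((clock, temp, power))
    return history

hist = simulate()
print(f"start: {hist[0]}")
print(f"end:   {hist[-1]}")
```

Running it shows exactly the pattern in the post: power creeps over the limit as the card warms up, and the clock ratchets down bin by bin even though the workload never changed.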

Quote:

All links to youtube videos removed. This site feels like a concentration camp. -_-
View the links to my Yamakasi Catleap OC guides for Nvidia and AMD cards here
HyperMatrix is offline  
post #492 of 7609 Old 08-03-2016, 10:06 PM
AKA Murclocke
 
 
Join Date: Dec 2004
Location: Cedar Falls, IA
Posts: 16,659
Liked: 911
Quote:
Originally Posted by ChrisxIxCross

Good lord, this thing is at like 50°C at idle... My 980 Ti Hybrid, which I had before, idled at around 26°C and never went above 60°C at full load.

Mine is idling at 32C right now with 23% fan.

Most likely you have Windows set to High Performance instead of Balanced in the power saving options. If I do that, Chrome tends to throw the card into 3D mode. Check GPU usage and Power % in Afterburner; they should be around 7% and 0-1%.
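The Afterburner check above can also be done from a script: `nvidia-smi` exposes the same utilization and power numbers. The query flags are standard `nvidia-smi` options, but the idle thresholds are just the ones quoted in this post, and `parse_gpu_line`/`check_idle` are my own helper names, not anything official:

```python
# Quick idle check without Afterburner: poll nvidia-smi and flag a card
# that is stuck in 3D clocks. Treat this as a sketch, not a supported tool.
import subprocess

def parse_gpu_line(line):
    """Parse one CSV line of `utilization.gpu [%], power.draw [W]`,
    e.g. '7 %, 13.02 W' -> (7, 13.02)."""
    util_s, power_s = [f.strip() for f in line.split(",")]
    return int(util_s.rstrip(" %")), float(power_s.rstrip(" W"))

def check_idle(util, power_w, tdp_w=250.0):
    """True if the card looks properly idle: utilization at or below the
    ~7% quoted above, and power draw around 0-1% of the power limit
    (1.5% tolerance)."""
    return util <= 7 and (power_w / tdp_w) * 100 <= 1.5

def poll():
    """Query the driver directly. Requires the NVIDIA driver installed."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu,power.draw",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True).stdout
    for line in out.strip().splitlines():
        util, power = parse_gpu_line(line)
        print(f"util={util}% power={power}W idle={check_idle(util, power)}")
```

Call `poll()` on the machine in question; if `idle` comes back `False` with nothing running, the High Performance power plan (or a browser holding the card in 3D clocks) is the usual suspect.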
Quote:
Originally Posted by HyperMatrix

So I was investigating some FurMark fun. Here's what's happening:

- You can hit 120% TDP in FurMark even at only 50-60% GPU load
- As the card heats up, it needs more power for the same performance level
- So the card throttles to stay below TDP
- This cycle continues for quite a while: the card keeps throttling down to stay within TDP, while the gradual heat build-up demands more and more power, which in turn leads to even more throttling.

I was running a game and at 70% usage everything was fine, clocked between 2050-2080MHz. The instant I changed a setting that pushed the GPU to 100% usage, it dropped to 1900-1980MHz.

Even on air, we're going to need a modified BIOS to take full advantage of these cards.

Honestly, at this point all that program is good for is breaking cards. It's not a realistic scenario, and it doesn't even bring out unstable overclocks as well as some games do. :P
ChrisxIxCross likes this.

Murlocke is offline  
post #493 of 7609 Old 08-03-2016, 10:09 PM
New to Overclock.net
 
 
Join Date: Jun 2015
Location: New York City
Posts: 346
Liked: 11
Quote:
Originally Posted by DarkIdeals

I'm just REALLY debating between the X34 34-inch ultra-wide 100Hz and the Acer XB321HK 32-inch 4K 60Hz. They're nearly identical in most ways: both have G-Sync, 32 and 34 inches are very close in size, both are IPS panels, both have 10-bit color (8-bit + FRC), both have 4ms response times, etc. It really comes down to 60Hz 4K vs. 100Hz 3440x1440 21:9.

I'm really thinking the 21:9 will be great, but I kind of wonder if two TITAN X Pascals aren't a tiny bit overkill for even 3440x1440 100Hz. I just did some more testing in Far Cry 4 on my ROG Swift, and I can definitely notice a less laggy, less blurry experience when sprinting and panning the camera side to side at ~120Hz vs. 60Hz. I also noticed that aiming and shooting is a fair bit more precise at 120Hz than at 60Hz, especially once I turned down some settings on the GTX 1080 I had so I could legitimately push a full 100fps+ at 1440p.

That's the ONE thing I think is keeping me from the XB321HK: the fact that it's only 60Hz and can't be overclocked. I'm really wondering how the X34 looks with Nvidia DSR on. 3440x1440 native with DSR would give me a render resolution of 5120 x 2160 (essentially "4K ultra-wide") downsampled to the original 3440x1440, and I'm wondering if that wouldn't actually look relatively close to the sharpness of 4K (especially with a bit of good-quality AA on top) while still giving me the benefit of ultra-wide... decisions, decisions, lol.

Depends what games you play. I have the 4K one as well, but my daily driver is the X34. I will see if it's overkill when I get my cards. I had two 1080s in SLI and I wasn't getting 100fps in ESO or GTA V with mods. With BF1 coming out soon, I would prefer a constant 100fps. And with the new Asus 4K 144Hz coming out next year, it won't be overkill.

With DSR at max, my 1080s only did 25fps in ESO.
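The DSR numbers being tossed around can be sanity-checked quickly. DSR factors multiply the pixel count, so each axis scales by the square root of the factor; this little sketch (mine, not anything from the thread) shows that the 2.25x factor on 3440x1440 lands just past the 5120x2160 figure quoted above, and that "DSR at max" (4.00x) is a much heavier 6880x2880:

```python
import math

def dsr_resolution(width, height, factor):
    """Render resolution for an NVIDIA DSR factor. The factor scales
    total pixel count, so each axis scales by sqrt(factor)."""
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

# "4K ultra-wide" on an X34 with the 2.25x factor:
print(dsr_resolution(3440, 1440, 2.25))  # (5160, 2160)

# DSR at max (4.00x), roughly what was driving ESO down to 25fps:
print(dsr_resolution(3440, 1440, 4.00))  # (6880, 2880)
```

At 4.00x the cards are pushing almost 20 million pixels per frame, more than twice 4K, which goes a long way toward explaining the 25fps figure.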

Asus Rampage V Edition 10
Intel 6800K CPU@4.5GHZ - Silicon Lottery
Samsung SM961 1TB
GSKILL 32GB Ripjaws V 3200Mhz 14-14-14-34
2x Titan X PASCAL SLI
CaseLabs SM8 Magnum
Corsair 1500AXI PSU
Acer Predator X34
CPU Cooling XSPC Bay Rez/HeatKiller 3.0/Black Ice 360GTS RAD/ Darksiders Gentle Typhoon
GPU Cooling XSPC Photon 180/EK Waterblocks/ Black Ice 360 GTS RAD/ Darksiders Gentle Typhoon

Gary2015 is offline  
post #494 of 7609 Old 08-03-2016, 10:12 PM
AKA Murclocke
 
 
Join Date: Dec 2004
Location: Cedar Falls, IA
Posts: 16,659
Liked: 911
I would choose 4K over 3440x1440, and I've owned both. 3440x1440 doesn't work in some games, and can be a headache. 4K simply works. It also looks significantly better.

I've never felt higher than 60FPS is worth it; in most cases I prefer to just increase AA or use SSAA. My opinion is definitely in the minority though. I've gamed on my friend's 1080p 144Hz and the entire time I was like "meh" compared to 4K at home.

Murlocke is offline  
post #495 of 7609 Old 08-03-2016, 10:13 PM
PC Gamer
 
 
Join Date: Mar 2012
Location: Canada
Posts: 1,169
Liked: 96
Quote:
Originally Posted by Murlocke

Mine is idling at 32°C right now with 23% fan.

Most likely you have Windows set to High Performance instead of Balanced in the power saving options. If I do that, Chrome tends to throw the card into 3D mode. Check GPU usage and Power % in Afterburner; they should be around 7% and 0-1%.

Honestly, at this point all that program is good for is breaking cards. It's not a realistic scenario, and it doesn't even bring out unstable overclocks as well as some games do. :P

I started investigating after World of Warcraft started throttling the card heavily under 4x SSAA. :P

Quote:

All links to youtube videos removed. This site feels like a concentration camp. -_-
View the links to my Yamakasi Catleap OC guides for Nvidia and AMD cards here
HyperMatrix is offline  
post #496 of 7609 Old 08-03-2016, 10:16 PM
 
 
Join Date: Apr 2014
Posts: 36
Liked: 0
Quote:
Originally Posted by CallsignVega

Crysis 3 maxed out I'm getting 100-130 FPS, and Star Wars Battlefront maxed out 130-160 FPS at 4K in SLI. Looks like the time has come for 4K 120 Hz monitors...

I don't know about maxed out. I tried Crysis 3 and some settings can bring it to its knees. For example, 4K plus even medium TXAA will drop it below 60fps.

Also, the game finally looks dated. And one more thing: nothing on earth gets rid of the horrendous aliasing on every stair in that game. Not 4K + high TXAA. Nothing. Those stairs will be aliased forever, it seems.

6700K @ 4.8Ghz on Corsair H110. TXP.
fernlander is offline  
post #497 of 7609 Old 08-03-2016, 10:17 PM
New to Overclock.net
 
 
Join Date: Jun 2015
Location: New York City
Posts: 346
Liked: 11
Quote:
Originally Posted by HyperMatrix

I started investigating after World of Warcraft started throttling the card heavily under 4x SSAA. :P

Would like to see how these cards perform under water.

Asus Rampage V Edition 10
Intel 6800K CPU@4.5GHZ - Silicon Lottery
Samsung SM961 1TB
GSKILL 32GB Ripjaws V 3200Mhz 14-14-14-34
2x Titan X PASCAL SLI
CaseLabs SM8 Magnum
Corsair 1500AXI PSU
Acer Predator X34
CPU Cooling XSPC Bay Rez/HeatKiller 3.0/Black Ice 360GTS RAD/ Darksiders Gentle Typhoon
GPU Cooling XSPC Photon 180/EK Waterblocks/ Black Ice 360 GTS RAD/ Darksiders Gentle Typhoon
Gary2015 is offline  
post #498 of 7609 Old 08-03-2016, 10:18 PM
AKA Murclocke
 
 
Join Date: Dec 2004
Location: Cedar Falls, IA
Posts: 16,659
Liked: 911
Quote:
Originally Posted by HyperMatrix

I started investigating after World of Warcraft started throttling the card heavily under 4x SSAA. :P

You know what's sad? The main reason I bought this card is World of Warcraft. The new Legion graphics are super demanding at 4K. I went from ~45FPS to ~100FPS in Hyjal coming from the previous Titan X. It's sad that you need such a powerful computer to chug through that 15-year-old engine, but at least the game looks good with the new graphics.
Quote:
Originally Posted by fernlander

I don't know about maxed out. I tried Crysis 3 and some settings can bring it to its knees. For example, 4K plus even medium TXAA will drop it below 60fps.

Also, the game finally looks dated. And one more thing: nothing on earth gets rid of the horrendous aliasing on every stair in that game. Not 4K + high TXAA. Nothing. Those stairs will be aliased forever, it seems.

Honestly any form of AA that isn't FXAA or SMAA is likely to kill FPS at 4K. Luckily, in most titles, nothing beyond those is a big improvement.

Murlocke is offline  
post #499 of 7609 Old 08-03-2016, 10:19 PM
New to Overclock.net
 
 
Join Date: Jun 2015
Location: New York City
Posts: 346
Liked: 11
Quote:
Originally Posted by Murlocke

I would choose 4K over 3440x1440, and I've owned both. 3440x1440 doesn't work in some games, and can be a headache. 4K simply works. It also looks significantly better.

I've never felt higher than 60FPS is worth it; in most cases I prefer to just increase AA or use SSAA. My opinion is definitely in the minority though. I've gamed on my friend's 1080p 144Hz and the entire time I was like "meh" compared to 4K at home.

That's true, 21:9 isn't compatible with all games, but most of them work. I just like the curved screen. I can't go back to 16:9.

Asus Rampage V Edition 10
Intel 6800K CPU@4.5GHZ - Silicon Lottery
Samsung SM961 1TB
GSKILL 32GB Ripjaws V 3200Mhz 14-14-14-34
2x Titan X PASCAL SLI
CaseLabs SM8 Magnum
Corsair 1500AXI PSU
Acer Predator X34
CPU Cooling XSPC Bay Rez/HeatKiller 3.0/Black Ice 360GTS RAD/ Darksiders Gentle Typhoon
GPU Cooling XSPC Photon 180/EK Waterblocks/ Black Ice 360 GTS RAD/ Darksiders Gentle Typhoon
Gary2015 is offline  
post #500 of 7609 Old 08-03-2016, 10:20 PM
 
 
Join Date: Nov 2014
Posts: 1,255
Liked: 60
Quote:
Originally Posted by Gary2015

Depends what games you play. I have the 4K one as well, but my daily driver is the X34. I will see if it's overkill when I get my cards. I had two 1080s in SLI and I wasn't getting 100fps in ESO or GTA V with mods. With BF1 coming out soon, I would prefer a constant 100fps. And with the new Asus 4K 144Hz coming out next year, it won't be overkill.

With DSR at max, my 1080s only did 25fps in ESO.

Hmm... yeah, I was thinking that perhaps I'd be better off with the X34 due to demanding games. I thought to myself, "yes, it is GENERALLY overkill, but in The Witcher 3 at max settings with Hairworks, or 4K Fallout 4 with max godrays and 100+ mods, I would probably end up at ~80-85fps, so perhaps it'd be worth getting the X34 anyway."

But then there's the X34P coming out SOMETIME this year (I hate Acer's vague estimates... "we think it might possibly sort of be kinda ready around Q4... I guess?" lmao). The X34P tempts me to hold off on a monitor: it will do 100Hz out of the box with overclocking still enabled, an Acer rep said a minimum of 120Hz would be possible, and since it uses DisplayPort 1.4, assuming the limitation is bandwidth and not the panel, I'm betting ~130-140Hz is more likely. Plus it has the joystick controls like the ASUS ROG monitors, a matte back finish, and a swivel stand; and it uses one of the higher-quality LG "S-IPS" panels instead of the "AH-IPS" panels that have all the backlight bleed issues.

On a side note... is ESO REALLY that demanding? Sure, it's online, so there are latency and CPU issues that affect ALL online games, but I didn't think ESO's graphics would make you dip to 25fps! And yeah, BF1 is one of the main things making me want to try the X34. I'm not a big FPS player; I enjoy the occasional playthrough of Metro 2033/LL, Far Cry 3/4, the Fallout games, etc., but I'm never a competitive CoD or BF type of player. That's why I thought I might get away with 60Hz, especially in exchange for 4K, since I mostly play RPG/RTS-type games like Dark Souls, Dragon Age, Skyrim, The Witcher, and Total War... but idk...

Quote:
Originally Posted by HyperMatrix

So I was investigating some FurMark fun. Here's what's happening:

- You can hit 120% TDP in FurMark even at only 50-60% GPU load
- As the card heats up, it needs more power for the same performance level
- So the card throttles to stay below TDP
- This cycle continues for quite a while: the card keeps throttling down to stay within TDP, while the gradual heat build-up demands more and more power, which in turn leads to even more throttling.

I was running a game and at 70% usage everything was fine, clocked between 2050-2080MHz. The instant I changed a setting that pushed the GPU to 100% usage, it dropped to 1900-1980MHz.

Even on air, we're going to need a modified BIOS to take full advantage of these cards.


Yeah, even in The Witcher 3 I was getting 89°C after only 2-3 minutes, even with 80% fan speed; I have to crank it to 90% or higher to keep it at 86°C. I'm guessing this is because The Witcher 3 at Ultra settings + Hairworks was running 85-99% usage. I have NO idea how people like JayzTwoCents are hitting only 83-84°C max, unless it really IS just a bad factory TIM application on certain cards.

"Meanwhile, in AMorDor, Lisa Su-ron's all-seeing eye, in secret... crafts an ultimate, inefficient, cache-sharing processor of evil... yes... one 32nm ancient architecture to rule them all!! Using an army of 5.5GHz (6GHz turbo) FX 9999 569W TDP 32-modul...er...core... CPUs to sweep down upon the kingdoms of Mentel and the Elven fortress of Nvideandell!" -Excerpt, The Lord of the Cores by C.P.U. Trollin
DarkIdeals is offline  