Overclock.net banner
7,761 - 7,780 of 18,371 Posts
I have another question for the OCN community about the Titan X.

I've been reading through other threads on Google about the Titan X needing a minimum of 24GB of system memory to operate properly and remain stable. What do you think about the matter? I am supposed to have 16GB of system memory, but I had to split the four sticks into 2x4GB (8GB total) between my personal system and another, so I am currently running on 8GB of system memory. Do you think it is really necessary to upgrade my system memory to 4 x 8GB = 32GB total?

Thank you in advance
 
Quote:
Originally Posted by Boyd View Post

I have another question for the OCN community about the Titan X.

I've been reading through other threads on Google about the Titan X needing a minimum of 24GB of system memory to operate properly and remain stable. What do you think about the matter? I am supposed to have 16GB of system memory, but I had to split the four sticks into 2x4GB (8GB total) between my personal system and another, so I am currently running on 8GB of system memory. Do you think it is really necessary to upgrade my system memory to 4 x 8GB = 32GB total?

Thank you in advance
Anything above 16GB is overkill for gaming. 16GB is the sweet spot. Titan X does NOT need more than 16GB to operate properly and I'm sure you could get away with 8GB as well.
 
Quote:
Originally Posted by Boyd View Post

I have another question for the OCN community about the Titan X.

I've been reading through other threads on Google about the Titan X needing a minimum of 24GB of system memory to operate properly and remain stable. What do you think about the matter? I am supposed to have 16GB of system memory, but I had to split the four sticks into 2x4GB (8GB total) between my personal system and another, so I am currently running on 8GB of system memory. Do you think it is really necessary to upgrade my system memory to 4 x 8GB = 32GB total?

Thank you in advance
8GB of system RAM minimum and 16GB recommended:

http://www.nvidia.com/content/geforce-gtx/GTX_TITAN_X_User_Guide.pdf
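If you want to sanity-check your own install against that spec, here's a minimal Python sketch (Linux-only, via POSIX sysconf; the 16GB threshold is just NVIDIA's recommended figure from the guide above):

```python
import os

def total_ram_gib():
    """Total physical memory in GiB, via POSIX sysconf (Linux)."""
    pages = os.sysconf("SC_PHYS_PAGES")
    page_size = os.sysconf("SC_PAGE_SIZE")
    return pages * page_size / (1024 ** 3)

RECOMMENDED_GIB = 16  # NVIDIA's recommended spec from the user guide above

ram = total_ram_gib()
verdict = "meets" if ram >= RECOMMENDED_GIB else "is below"
print(f"Installed RAM: {ram:.1f} GiB, which {verdict} the {RECOMMENDED_GIB} GiB recommendation")
```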
 
Quote:
Originally Posted by atg284 View Post

Hello!

Just got The Witcher 3 through GOG. How do you get the day 1 patch/download through GOG? Looks good, but I'm not sure if it is the complete texture package for PC.

The TX is performing well on my G-Sync 4K though... Thanks a ton for any help!
Yeah, I'm thinking the same. 4K maxed, it barely breaks 3.5GB of VRAM.
 
Quote:
Originally Posted by DADDYDC650 View Post

Anything above 16GB is overkill for gaming. 16GB is the sweet spot. Titan X does NOT need more than 16GB to operate properly and I'm sure you could get away with 8GB as well.
Quote:
Originally Posted by MrTOOSHORT View Post

8GB of system RAM minimum and 16GB recommended:

http://www.nvidia.com/content/geforce-gtx/GTX_TITAN_X_User_Guide.pdf
Thank you for your responses, guys. +1 for each since I couldn't find this info when I looked. The OCN community being helpful as usual.
 
Quote:
Originally Posted by deadwidesmile View Post

I personally run 32GB, but I see almost no noticeable difference between 16 and 32.
Nope, and you shouldn't. There are a lot of misconstrued views on what happens here and why this is 'recommended'. Unless you've actually exceeded your frame buffer, there is no need for DirectX to create a system memory copy. There are various temporary buffers within system memory already, but that is a different thing and completely normal for DirectX.

I wouldn't mind upping to 32GB just because 'I can', but I don't really fancy taking a hit in memory frequency just for e-peen value.
 
Hello, I have two issues and hope someone can help me. I have an EVGA Titan X SC with an Arctic Accelero IV.

The first problem: I flashed the BIOS to GM200SC-425.ROM to get a higher TDP. After I set it to the maximum (121%), my PSU does a hard reset after a few minutes of the FurMark burn-in test. My PSU is a be quiet! Dark Power Pro 10 with 1200W. I tested in both multi- and single-rail mode. The GPU temperatures are always below 80°C, so I think it's the PSU. Or what do you think? With the power target set to 110% it works fine.

The second problem: I can't increase the GPU core voltage, so I decided to test the GM200SC-MAXAIR.ROM BIOS. After I flashed it, I had a constant 1.25V, but when I benchmark, the boost clock is between 800 and 1000 MHz and it's always at the power limit. Is this a bug in the firmware? What can I do?

Greetings from Dortmund
 
How's SLI with the Titan X at 1440p with G-Sync? Is Nvidia up to par on SLI support? I'm trying to get 144fps on my Acer Predator display. I have a 3770K OC'd to 4.5GHz. Is a second Titan X worth the cost with a G-Sync monitor?
 
Gotta love those hair physics... I can run The Witcher 3 at Ultra at 2K with one Titan X, but turning on HairWorks causes the frame rate to drop below a smooth 60, even with a 780 acting as a PhysX card.

Time for a G-Sync monitor, methinks...
 
Quote:
Originally Posted by dawn1980 View Post

How's SLI with the Titan X at 1440p with G-Sync? Is Nvidia up to par on SLI support? I'm trying to get 144fps on my Acer Predator display. I have a 3770K OC'd to 4.5GHz. Is a second Titan X worth the cost with a G-Sync monitor?
Not possible. See the vid above and watch the fps.
I reckon you'd need quad-SLI to achieve 144Hz @ 1440p.

@Silent Scone
How is it on tri-SLI?
 
Quote:
Originally Posted by CL600 View Post

Hello, I have two issues and hope someone can help me. I have an EVGA Titan X SC with an Arctic Accelero IV.

The first problem: I flashed the BIOS to GM200SC-425.ROM to get a higher TDP. After I set it to the maximum (121%), my PSU does a hard reset after a few minutes of the FurMark burn-in test. My PSU is a be quiet! Dark Power Pro 10 with 1200W. I tested in both multi- and single-rail mode. The GPU temperatures are always below 80°C, so I think it's the PSU. Or what do you think? With the power target set to 110% it works fine.

The second problem: I can't increase the GPU core voltage, so I decided to test the GM200SC-MAXAIR.ROM BIOS. After I flashed it, I had a constant 1.25V, but when I benchmark, the boost clock is between 800 and 1000 MHz and it's always at the power limit. Is this a bug in the firmware? What can I do?

Greetings from Dortmund
First: do not subject your air-cooled Titan X to FurMark. It's nothing but a power virus and will cook the VRMs on this PCB. Most of the modified BIOSes you'll get here control the GPU voltage (along with the NV driver); the slider in PX (and in AB, even if you make the right mod to the ven files) does nothing. The max wattage our voltage-locked cards (@ 1.274V) can pull is ~425W under the best conditions, so unless the PSU is faulty, you should not trip OCP.

Try uninstalling PX if you are using it. If not, reinstall the drivers after doing a clean sweep with DDU15.
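If you want hard numbers on whether the card is power-limit throttling rather than tripping the PSU, log it while benching. A rough Python sketch; the nvidia-smi query fields are standard options, but the sample line below stands in for live output since I obviously can't run your card:

```python
import subprocess

QUERY = "power.draw,clocks.gr,temperature.gpu"
CMD = ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"]

def parse_sample(line):
    """Parse one CSV line of nvidia-smi output into a dict of floats."""
    watts, mhz, temp = (float(x) for x in line.split(","))
    return {"power_w": watts, "clock_mhz": mhz, "temp_c": temp}

def read_gpu():
    """Poll the GPU once (requires an NVIDIA driver install)."""
    out = subprocess.check_output(CMD, text=True)
    return parse_sample(out.strip())

# Offline example, using a made-up sample line in place of live output:
sample = "247.3, 1392, 74"
stats = parse_sample(sample)
print(stats)  # {'power_w': 247.3, 'clock_mhz': 1392.0, 'temp_c': 74.0}
```

If the clock sags while power.draw sits pinned near the BIOS limit, it's the power limit doing it, not the PSU.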
 
Quote:
Originally Posted by cstkl1 View Post

Not possible. See the vid above and watch the fps.
I reckon you'd need quad-SLI to achieve 144Hz @ 1440p.

@Silent Scone
How is it on tri-SLI?
Only played the opening, but performance seemed reasonable; I didn't check the framerate or scaling. I have G-Sync disabled at the moment and a frame cap of 140. Will check later, but I doubt it was near that, albeit above 60 for sure.
 
Having a good experience playing The Witcher 3 completely maxed out (including HairWorks etc.) at 4K on my two cards... average fps is around 45, but it feels pretty smooth to me. The game is great so far... easily the best-looking open-world game to date.
 
Quote:
Originally Posted by Jpmboy View Post

First: do not subject your air-cooled Titan X to FurMark. It's nothing but a power virus and will cook the VRMs on this PCB. Most of the modified BIOSes you'll get here control the GPU voltage (along with the NV driver); the slider in PX (and in AB, even if you make the right mod to the ven files) does nothing. The max wattage our voltage-locked cards (@ 1.274V) can pull is ~425W under the best conditions, so unless the PSU is faulty, you should not trip OCP.

Try uninstalling PX if you are using it. If not, reinstall the drivers after doing a clean sweep with DDU15.
Hello,

I don't use Precision X. I disabled Afterburner completely, but nothing changed. With GM200SC-425.ROM I got a ~1,400 MHz boost clock while benching. With GM200SC-MAXAIR.ROM, my boost clock (900MHz) dropped below the standard core clock while benching. I removed the drivers and installed them again, but nothing changed.
 
Could someone please help me with the following issues?

1. Selecting 'Auto' in PhysX settings in 'Configure SLI' in NVCP selects the second GPU for PhysX. Does this mean the second GPU will handle only PhysX and nothing but PhysX?

2. Does the automatic driver update feature of GeForce Experience get rid of the junk that the older driver leaves behind? Or does using DDU and installing drivers manually result in a cleaner driver install?

Thank you.
 