Overclock.net › Forums › Intel › Intel Motherboards › Asus Z9PE-D8 Owner's thread

Asus Z9PE-D8 Owner's thread - Page 242

post #2411 of 2425
Quote:
Originally Posted by SRPlus View Post

Yeah, I know the history... I had a pair of GTX 580 3GB cards before switching to the Titans. I've followed all the Titan launches since, but didn't think the investment was worth it until now. E.g. the Titan X Maxwell was around 46% faster than my cards for GPU rendering with Iray, but it would have cost 4K to replace my existing GPUs. Now one Titan Xp is twice as fast as one of my old Titans for GPGPU, so it's tempting to upgrade.

Titan launches have always been around the £800 mark from the right reseller in the UK, or a little less, until Pascal hit the 1.2K mark (currently £1,159 here). So Nvidia launching the Ti with just 1GB less soon after, then a full-fat version just 6 months later, would really get my back up. So glad I didn't buy the first TXP.

As you say, only if AMD pull out a Titan killer (which I seriously doubt) will Nvidia respond with another, faster GTX/Titan Pascal this year, or accelerate the Volta launch.

As you know, the Z9PE-D8 has two CPUs with 40 PCI-E lanes each, CPU 1 controlling slots 1-4 and CPU 2 controlling slots 5-7.

I already have 4x GTX Titans in 4-way SLI and a dual-PSU setup, 2x Antec HCP 1300 using the OC Link to communicate load balance, and, touch laminated wood, the system has performed flawlessly for years, both Iray rendering using 4x GPUs + CPUs and gaming. Each card runs at x16 PCI-E 3.0, so x16/x16/x16/x16, unlike what the motherboard manual would suggest.
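A quick sanity check of that lane budget (a sketch using only the figures from this post, nothing measured):

```python
# Sanity check of the PCIe lane budget described above: two LGA2011 Xeons
# expose 40 Gen3 lanes each, so four x16 GPUs fit with lanes to spare.
LANES_PER_CPU = 40
NUM_CPUS = 2
GPU_LINK_WIDTH = 16
NUM_GPUS = 4

total_lanes = LANES_PER_CPU * NUM_CPUS      # 80 lanes across both sockets
gpu_lanes = GPU_LINK_WIDTH * NUM_GPUS       # 64 lanes consumed by the GPUs
spare_lanes = total_lanes - gpu_lanes       # left over for storage, NICs, etc.
print(total_lanes, gpu_lanes, spare_lanes)  # 80 64 16
```

Which is why all four slots can run at full width, unlike a single-socket board where 40 lanes get split.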

I know people have had a lot of problems with this MoBo being temperamental about which cards will run and in which slots, but luckily I've had none of that.

So I am hoping I can just swap my old Titans for 4x Titan Xps and use the 4-way HPC EVGA bridge I already have without issue. But it would be nice to take advantage of SLI in games, not just benchmarks. So I was wondering if my option 1 would work...

I know of a guy running 4x Titan Pascal (old) on a Z10PE-D8 WS in 4-way SLI, again x16/x16/x16/x16 Gen3, who actually got 4-way to work in some games using an Nvidia Inspector hack.

I could test whether I can use a 2-way bridge on GPUs 1 & 2 in SLI, with GPUs 3 & 4 unused, on my current setup, but this system is my bread and butter, how I pay the bills each month, so I'm very cautious about trying things that might brick the system.

Absolutely you can. No different from my quad Titan XMs, no different from your quad Titans: 4-way SLI works on these Xps with the included 4-way bridge, and I am sure it works better with the HPC EVGA bridge. Exactly, the only issue is getting games to play nice past 2-way SLI. Since I've tested 1080 Ti SLI: if you have the means, I wholeheartedly suggest going 7700K/Z270/1080 Ti SLI for pure silky-fast gaming bliss. The 1080 Ti is castrated in every way possible, but it dances around the Titan Xp like a finely tuned car taking corners as if they aren't there, while the Titan Xp is a constant voltage/clocks/loads roller coaster that has to be at least hybrid-cooled and definitely needs the HB bridge, whereas the 1080 Ti needs no hybrid cooling and no HB bridge out of the box.

Something to be aware of: Pascal is basically a Kepler 2.0 architecture. With Maxwell (GM200), the 2 indicates how advanced the architecture is; it is the full realization of Kepler (3072 CC), the crown jewel, what Kepler couldn't be. It remains to be seen whether Volta will be to Pascal what Maxwell was to Kepler. If so, we could see a "GV200" Volta at 4096 CC / 4096-bit HBM2 on 16nm again; they could release it if they wanted to, or if competition forced them to. Just in case you want to wait and see what's coming this summer/November/next March.

Those dual Antec HCP 1300s, nice! I've got dual Silverstone ZM1350s.
post #2412 of 2425
Quote:
Originally Posted by NapalmV5 View Post

Absolutely you can. No different from my quad Titan XMs, no different from your quad Titans: 4-way SLI works on these Xps with the included 4-way bridge, and I am sure it works better with the HPC EVGA bridge. Exactly, the only issue is getting games to play nice past 2-way SLI. Since I've tested 1080 Ti SLI: if you have the means, I wholeheartedly suggest going 7700K/Z270/1080 Ti SLI for pure silky-fast gaming bliss. The 1080 Ti is castrated in every way possible, but it dances around the Titan Xp like a finely tuned car taking corners as if they aren't there, while the Titan Xp is a constant voltage/clocks/loads roller coaster that has to be at least hybrid-cooled and definitely needs the HB bridge, whereas the 1080 Ti needs no hybrid cooling and no HB bridge out of the box.

Something to be aware of: Pascal is basically a Kepler 2.0 architecture. With Maxwell (GM200), the 2 indicates how advanced the architecture is; it is the full realization of Kepler (3072 CC), the crown jewel, what Kepler couldn't be. It remains to be seen whether Volta will be to Pascal what Maxwell was to Kepler. If so, we could see a "GV200" Volta at 4096 CC / 4096-bit HBM2 on 16nm again; they could release it if they wanted to, or if competition forced them to. Just in case you want to wait and see what's coming this summer/November/next March.

Those dual Antec HCP 1300s, nice! I've got dual Silverstone ZM1350s.

The reason I will stick with, and most likely stay with, a dual-Xeon setup in the future is that the extra CPU cores make a huge difference rendering with V-Ray/Mental Ray. A 4-day animation render on an i7 is cut down to 2 days on my aging 2687W Xeons, and the newer high-end V4 Xeons are twice as fast as my old setup. So if a client wants the renders done in 2 days for a presentation, no problem, but a single-socket i7 could mean delays.
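Those timings are just throughput ratios; as a rough sketch (the 2x figures are the ballpark numbers from this post, not benchmarks):

```python
# Rough render-time estimate from relative CPU throughput. The multipliers
# (2x for the dual 2687Ws, 2x again for high-end V4s) are the ballpark
# figures quoted above, not measured benchmarks.
def render_days(baseline_days, speedup):
    """Estimated wall-clock days given a throughput multiplier vs the baseline."""
    return baseline_days / speedup

i7_days = 4.0
dual_2687w = render_days(i7_days, 2.0)  # aging dual Xeons: ~2 days
dual_v4 = render_days(i7_days, 4.0)     # V4s, if twice the 2687Ws: ~1 day
print(dual_2687w, dual_v4)              # 2.0 1.0
```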

If I upgrade to TXPs I will definitely water-cool them. Unlike gaming, where the load is distributed equally in SLI, giving fairly low temps (50-60°C), when rendering in Iray the GPUs are at a constant 100% load until the render completes. With 4x Titans that could be anything from 8 to 30 minutes for a noise-free render, depending on the complexity/materials in the scene. So they get hot fast, and it's already been shown that the TXP stock cooler is not up to the job of keeping temps below the 70°C mark at a constant 100% load, for longer life. Titans are known to fail in render farms unless their temps are managed well and kept low. I have had to ramp the GPU fans up to 85% before a render, and sometimes open windows in the summer to keep the temps down, even though I have a huge case with industrial fans. So I'm definitely going all-liquid in the next build for a quieter life. If it were just for gaming I would not bother, but this system's priority is production rendering, with gaming if I get the time.
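That kind of temperature check can be scripted before kicking off a render; a minimal sketch, assuming the stock `nvidia-smi` CLI is on PATH (the query flags are standard, but treat the script itself as illustrative, not a tested monitoring setup):

```python
# Minimal temperature check for long Iray renders. The 70 C limit mirrors the
# figure in the post above; read_gpu_temps() assumes nvidia-smi is on PATH.
import subprocess

QUERY = ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"]

def read_gpu_temps():
    """Run nvidia-smi and return one temperature (deg C) per GPU."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True).stdout
    return parse_gpu_temps(out)

def parse_gpu_temps(csv_text):
    """Parse the one-number-per-line output of the query above."""
    return [int(tok) for tok in csv_text.split() if tok]

def hot_gpus(temps, limit_c=70):
    """Indices of GPUs at or above the temperature limit."""
    return [i for i, t in enumerate(temps) if t >= limit_c]

# Example with canned output rather than a live call:
sample = "63\n71\n68\n70\n"
print(hot_gpus(parse_gpu_temps(sample)))  # [1, 3]
```

In practice you would call `read_gpu_temps()` in a loop during the render and ramp fans or pause the job when `hot_gpus()` is non-empty.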

I'm fairly confident now that I can just drop in 4x TXPs without issue; it just would have been nice to take advantage of the HB bridge for gaming and leave the remaining 2 cards out of SLI for GPGPU work.

I bought the Antec HCP PSUs because they communicate load balance between the two units. I have read about, and experienced myself, PSU failure when just doing the wiring hack to make 2 PSUs turn on at the same time; without the load-balance communication, running 4 high-end GPUs eventually kills the PSUs.
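As a rough power-budget sanity check for that dual-PSU setup (the wattages here are TDP assumptions for illustration, not measured draws):

```python
# Hedged power-budget sketch for the dual-PSU rig above. The per-component
# wattages are TDP-style assumptions, not measurements from the system.
GPU_TDP_W = 250            # Titan-class board power (assumed)
NUM_GPUS = 4
CPU_TDP_W = 150            # per E5-2687W socket (assumed)
NUM_CPUS = 2
REST_OF_SYSTEM_W = 200     # drives, fans, board (rough allowance)
PSU_CAPACITY_W = 1300 * 2  # two Antec HCP 1300 units

load = GPU_TDP_W * NUM_GPUS + CPU_TDP_W * NUM_CPUS + REST_OF_SYSTEM_W
headroom = PSU_CAPACITY_W - load
print(load, headroom)  # 1500 1100
```

Plenty of nominal headroom, which is the point: the failures described above come from unbalanced loading between the two units, not from total capacity.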

Did the HB bridge make a difference to performance?
post #2413 of 2425
Quote:
Originally Posted by SRPlus View Post

The reason I will stick with, and most likely stay with, a dual-Xeon setup in the future is that the extra CPU cores make a huge difference rendering with V-Ray/Mental Ray. A 4-day animation render on an i7 is cut down to 2 days on my aging 2687W Xeons, and the newer high-end V4 Xeons are twice as fast as my old setup. So if a client wants the renders done in 2 days for a presentation, no problem, but a single-socket i7 could mean delays.

If I upgrade to TXPs I will definitely water-cool them. Unlike gaming, where the load is distributed equally in SLI, giving fairly low temps (50-60°C), when rendering in Iray the GPUs are at a constant 100% load until the render completes. With 4x Titans that could be anything from 8 to 30 minutes for a noise-free render, depending on the complexity/materials in the scene. So they get hot fast, and it's already been shown that the TXP stock cooler is not up to the job of keeping temps below the 70°C mark at a constant 100% load, for longer life. Titans are known to fail in render farms unless their temps are managed well and kept low. I have had to ramp the GPU fans up to 85% before a render, and sometimes open windows in the summer to keep the temps down, even though I have a huge case with industrial fans. So I'm definitely going all-liquid in the next build for a quieter life. If it were just for gaming I would not bother, but this system's priority is production rendering, with gaming if I get the time.

I'm fairly confident now that I can just drop in 4x TXPs without issue; it just would have been nice to take advantage of the HB bridge for gaming and leave the remaining 2 cards out of SLI for GPGPU work.

I bought the Antec HCP PSUs because they communicate load balance between the two units. I have read about, and experienced myself, PSU failure when just doing the wiring hack to make 2 PSUs turn on at the same time; without the load-balance communication, running 4 high-end GPUs eventually kills the PSUs.

Did the HB bridge make a difference to performance?

Sorry for the late reply, I got overwhelmed. Sure do, get the 2-slot. Yes, it does make a difference, depending on the game/settings/etc. It's more than just higher bandwidth: it's wired differently than 2x flexible legacy bridges, and it forces the GPUs to run at the same voltages. At the moment I am enjoying Witcher 3: Blood and Wine at 4K/HDR/max settings/high-quality driver settings, plus a bunch of mods. Easy breezy for Titan Xp SLI at 90-130fps.
post #2414 of 2425
Hi!
I've had this nice board for a while... it works perfectly except for the USB ports.
It's a strange problem: sometimes the USB ports won't recognize any external disk or USB memory stick.
Thanks for the help,
Arian
Rig 1:
CPU: Xeon E5-2695 V3
Motherboard: Asus X99-S
Graphics: Quadro M4000, GTX 980 Ti
RAM: G.Skill DDR4 4x8
Hard drives: Samsung 850 Pro x2, Samsung 840 Pro, OCZ Vertex 3, WD Blue x2
Optical drive: LG BD/DVD rewriter
Cooling: Noctua 14
OS: Windows 7 Pro
Monitors: Asus PA279 x2
Keyboard: Logitech K120 (US layout)
Power: Seasonic Platinum 860W
Case: Fractal Design R5
Mouse: Logitech M500

Rig 2:
CPUs: Xeon E5-2670 x2
Motherboard: Asus Z9PE-D8 WS
Graphics: GTX 1050 2GB
RAM: 64GB DDR3 (8x8GB)
Hard drives: Samsung 850 EVO 256GB, Seagate 320GB
Optical drive: ASUS DVD rewriter
Cooling: Noctua NH-U12S x2
OS: Windows 7 Pro
Power: Seasonic Platinum 750W
Case: Enthoo Pro
post #2415 of 2425
hi everyone!

my Z9PE won't POST since yesterday... I changed a hard drive in my build, and when I wanted to restart, it wouldn't POST. No image, nothing; I get the bc error... I cleared the CMOS, removed all the GPUs, only kept 2 sticks of RAM... nothing!

please help! should I buy a new bios chip? or is this something else?

everything was running so smoothly when this happened... I don't get it
post #2417 of 2425
I had that exact same problem last week, and it was a bad RAM stick on mine.

I dropped down to 1 stick of RAM in A1 (even though I have 2 CPUs, it worked with just 1 stick).

I then used the memory chart in the manual for adding the RAM back 1 stick at a time, until the boot got stuck at 6c again.

I was then able to add all 15 remaining sticks of RAM, and it booted fine every time as I added them. I just got the replacement stick in today, and it is now back to a full 256GB.

If you still fail with 1 stick, try another stick; you might have the bad one in.


That is what worked for me with that error. Mine also started after a power-off and would not come up again until I pulled the RAM and found the bad one.
post #2418 of 2425
Quote:
Originally Posted by keun View Post

hi everyone!

my Z9PE won't POST since yesterday... I changed a hard drive in my build, and when I wanted to restart, it wouldn't POST. No image, nothing; I get the bc error... I cleared the CMOS, removed all the GPUs, only kept 2 sticks of RAM... nothing!

please help! should I buy a new bios chip? or is this something else?

everything was running so smoothly when this happened... I don't get it
Yeah, the motherboard can be a bit temperamental. If I were you, I would strip it down, also removing both CPUs, then rebuild. I know it's a pain, but this is what I had to do when I had a non-start issue due to overclocking.
post #2419 of 2425
SATA HDDs on the SCU not mounting
I can't seem to figure out why SATA drives that show up in the SCU section of the BIOS are not mounting in Windows.
They work in the other SATA ports, so I guess it is a driver issue. They are not SAS.
Any ideas?
I'm not sure how to update the drivers, as the downloads contain lots of files and nothing happens when you try to run them.
Tris
post #2420 of 2425
Quote:
Originally Posted by sand74 View Post

All BIOS files are on the Asus support page!!

Go to System - DOS.

There are 17 BIOS files.

Sigh, yes, I was hoping for a 590* update after reading your message.
Workstation 01 (18 items):
CPUs: Intel Xeon E5 2650 v2 x2
Motherboard: Z9PE-D8 WS
Graphics: NVIDIA GeForce GTX 1080 Ti, NVIDIA GeForce GTX 980
RAM: Kingston ECC 196 Gb
Hard drives: Samsung EVO SSD RAID 10 (4x 1Tb Samsung EVOs), Seagate 3Tb 7200rpm x3
Optical drive: LG Blu-ray RW
OS: Windows 10 Pro 64-bit
Monitors: Samsung 4K UHD 50", ASUS 4K UHD 28"
Keyboard: SteelSeries mechanical keyboard
Power: CoolerMaster Silent Pro M2 1500W
Case: Corsair Carbide Series full tower
Audio: SoundBlaster X-Fi Extreme Audio PCIe