
Broadwell-E thread - Page 315

post #3141 of 7219
Quote:
Originally Posted by DarkIdeals View Post

Agreed, definitely a nice setup Cookies! thumb.gif


You gotta admit that the 6950X keeps up pretty darn well though! The dual E5 2683 V3 setup, even with 28 cores / 56 threads at 3GHz, is only 35% faster in the results than the 4.4GHz 6950X (12.8fps vs 9.44fps = ~35.5% difference).

Honestly the main thing that tempted me to go dual socket is the extra PCIe lanes; you could have a full 4-way GPU setup with x16 bandwidth on all four cards and still have enough 3.0 lanes left for an NVMe-based RAID array, a 10Gb NIC, etc. Boards like the Z10PE-D8 are great for that kind of thing: it has M.2 support, so you could pop in one of the new 1TB/2TB M.2 NVMe drives, run a RAIDed setup with two 1.2TB 750 Series in the slots, and still have enough lanes for another x4 3.0 card.

In the end though I couldn't bring myself to go dual socket BW-EP when Skylake-X and Skylake-EP are only a year-ish out; although I'm a little miffed at the latest rumors saying the 48 PCIe lane figure is false and we're limited to 44 now; I was hoping for full 3-way x16 capability on a single socket.
It really, really depends on your usage.

I have big CPU/memory apps that I crunch through many times a day where, even limited to 10 threads, my 2x 2690 setup topping out at 3.2GHz all-core matches or exceeds the performance of my 6950X at 4.4GHz. If I open up the whole can and give it all cores, then it slaps the 6950X around.

One big difference is the ~120GB/s+ memory bandwidth of two Xeon chips vs. the 60/70/80GB/s of a BW-E chip with an awesome memory tune.

That said, if I want to plow through a database with one thread, then 4.4GHz on a BW-E is the way to go.

As always the answer to "which one?" is "yes!" thumb.gif
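For anyone wondering where numbers like that come from, a quick back-of-the-envelope sketch: theoretical peak bandwidth is channels x transfer rate x 8 bytes, and measured AIDA64/STREAM results land well below that. The DDR4 speeds in the snippet are just assumptions for illustration, not anyone's actual config.

    # Back-of-the-envelope peak memory bandwidth: channels * MT/s * 8 bytes per transfer.
    # Measured (AIDA64 / STREAM) numbers usually come in well under the theoretical peak.
    def peak_gbs(channels, mts):
        return channels * mts * 8 / 1000   # GB/s

    dual_xeon = 2 * peak_gbs(4, 2400)   # 2 sockets x quad-channel DDR4-2400 (assumed)
    bwe_tuned = peak_gbs(4, 3200)       # single BW-E, quad-channel DDR4-3200 tune (assumed)

    print(f"2x Xeon peak : {dual_xeon:.1f} GB/s")   # ~153.6 GB/s theoretical -> ~120 GB/s measured
    print(f"BW-E peak    : {bwe_tuned:.1f} GB/s")   # ~102.4 GB/s theoretical -> ~70-80 GB/s measured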
post #3142 of 7219
CPU-Z reports the same core voltage - 1.367V.
post #3143 of 7219
Quote:
Originally Posted by DarkIdeals View Post


Honestly, in most cases I think the 6950X will slap these dual CPU nodes around. Two of them were supposed to go towards remote Adobe rendering, but I think they will just get re-purposed into VM hosts; not having GPUs really sucks. That said, as VM hosts these are amazing. Pulling only roughly 350W under full load, these Supermicro FatTwin nodes are extremely efficient. Only 12V comes from the power backplane, and the 5V/3.3V conversion is handled on each node individually.

The downside of a dual CPU system is NUMA awareness. It's really important to place your lanes and devices on the right CPU depending on your workload. While the CPU-to-CPU bandwidth over QPI (and DMI 2.0 to the chipset) isn't that bad, the latency hit for certain applications is still noticeable.

For example, the x265 benchmark doesn't seem to be fully NUMA-aware; the 2nd CPU gets much less load than the first in certain cases. I really need to dig into this and see if I can squeeze a few more MHz out of it; the memory latency hit is there as well. I'm not on a Z10 or Supermicro Hyper-drive board.
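For anyone who wants to check how much the second socket is actually being used, here's a rough sketch of the kind of thing you can poke at with Python and psutil: launch the encode pinned to socket 1 and watch per-node load. The core ranges and the x265 command line are assumptions (a 2x 14c/28t box with node 0 = logical CPUs 0-27 and node 1 = 28-55), so adjust them to your own topology; x265's --pools option for per-node thread pools is also worth experimenting with.

    # Rough sketch: pin an x265 run to the second socket and watch per-node CPU load.
    # Core ranges and the command line below are assumptions -- adjust to your topology.
    import subprocess
    import psutil

    NODE0 = list(range(0, 28))    # assumed: socket 0 = logical CPUs 0-27
    NODE1 = list(range(28, 56))   # assumed: socket 1 = logical CPUs 28-55

    proc = subprocess.Popen(["x265", "--input", "input.y4m", "--output", "out.hevc"])
    psutil.Process(proc.pid).cpu_affinity(NODE1)   # pin the whole process to socket 1

    while proc.poll() is None:
        load = psutil.cpu_percent(interval=2.0, percpu=True)
        n0 = sum(load[i] for i in NODE0) / len(NODE0)
        n1 = sum(load[i] for i in NODE1) / len(NODE1)
        print(f"node0 {n0:5.1f}%   node1 {n1:5.1f}%")

If the run pinned to one socket scores about the same as an unpinned run, the benchmark probably isn't spreading work across both sockets to begin with.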
post #3144 of 7219
Quote:
Originally Posted by Martin778 View Post

CPU-Z reports the same core voltage - 1.367V.
On X99, CPU-Z reports VID, not Vcore. Use AIDA64.
Edited by Jpmboy - 9/26/16 at 5:53am
post #3145 of 7219
Quote:
Originally Posted by cookiesowns View Post


Yeah, all true. NUMA problems, along with being (seemingly purposely) stuck with a gimped DMI 2.0 while Z170 runs wild with 3.0, are the downfall of dual socket setups (well, that and possibly being forced onto Server 2012 instead of Linux or Win 7/8.1/10 on some of the higher core count setups).

I've actually seen some interesting things come out of dual socket setups though; for example, through careful management of which devices are on which CPU, like you said, you can in some cases trick the drivers and OS into letting restrictions down. Four-way SLI on Pascal GPUs is "impossible" except for benchmarking, right? Wrong. With a dual socket setup you can have four GPUs in SLI with a four-way bridge (it has to be one of the hard LED ones with shielded SLI connectors), but with the first two cards in slots controlled by CPU 0 and the 3rd and 4th GPUs in slots controlled by CPU 1. This makes it especially easy to bypass the Nvidia driver warning in the NV control panel on Pascal cards that says "a higher performing bridge could give better performance". That's why you use a hard LED bridge: in combination with these tweaks it tricks the Nvidia drivers into fully "unlocking" SLI functionality by making the system think your 3- or 4-way normal bridge is just TWO "high bandwidth" bridges connecting two SETS of cards. In other words it thinks you have two-way SLI of cards 1 and 2 on CPU 0 and two-way SLI of cards 3 and 4 on CPU 1, and thus unlocks all limitations; but since CPU 0 and CPU 1 can still talk to each other, you get true 4-way SLI on Pascal, despite Nvidia claiming that 4-way SLI was impossible. It isn't; they just locked it away with drivers for some odd reason.

The same can seemingly be done on a single socket, as Baasha managed to do it, but he's being a bit uptight and refusing to share how he did it lol. All I know is that you have to do a lot of tinkering with custom SLI profiles from scratch, editing drivers, etc. Overall though, a dual socket setup is best, not only for the ease of getting it to work but because four-way TITAN XP setups (or even 4-way 1080 to some degree) really REQUIRE x16 lanes per card to reach their full potential with all the power being put out. I've seen setups like that with 4-way TITAN XP giving games like BF4 NEARLY THREE HUNDRED FPS at 4K resolution on a dual E5 2699 V4 rig despite it only running at ~3GHz speeds; whereas on a single socket i7 there was a fairly significant drop in GPU usage on the 3rd and 4th card and in turn significantly lower fps (closer to 200). So again, those 80 PCIe lanes can definitely come in handy at times!
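Just to put rough numbers on the lane math, here's a toy budget; the device mix is purely illustrative, not anyone's actual build, but it shows why 40 or 44 CPU lanes gets tight fast with four cards plus storage and a NIC.

    # Toy PCIe 3.0 lane budget -- illustrative devices only, not a specific build.
    devices = {
        "GPU 1": 16, "GPU 2": 16, "GPU 3": 16, "GPU 4": 16,
        "NVMe RAID (2x x4)": 8,
        "10GbE NIC": 8,
    }
    needed = sum(devices.values())

    # 40 = HSW-E/BW-E, 44 = rumored Skylake-X, 80 = dual socket
    for cpu_lanes in (40, 44, 80):
        short = max(0, needed - cpu_lanes)
        print(f"{cpu_lanes} CPU lanes: need {needed}, short by {short}")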

There are also interesting tweaks to specifically allocate (more like force, I suppose) more than the typical max of 4-6 cores to games, which is also easier to achieve on dual socket boards.


Quote:
Originally Posted by cekim View Post



Well yeah, if you have a legitimate need for massive memory bandwidth then a dual chip setup will STOMP anything a single socket can provide, with ease. Hell, I've seen a few specific cases where one of the recently popular budget dual E5 2670 V1 setups on an ASUS Z9PE-D8 (that's two 8-core/16-thread Sandy Bridge-EP chips, used, for only ~$70 a piece; throw in the C600 chipset board, ~128GB of ECC DDR3 and a compatible case and you're at only ~$500 TOTAL cost) actually ties or BEATS a 6950X. Those are mostly specific transcoding/encoding loads that either A) take advantage of the ~90GB/s of bandwidth a dual socket setup with quad channel DDR3 provides, and/or B) have proper NUMA support so you can efficiently use pretty much all 16 cores, with the 2nd socket performing at or near the level of the first.

In many cases it was either ~10-15% ahead of the 6950X, roughly tied with it, or ~10-15% behind (although I think the 6950X in those comparisons was at maybe 3.8-4GHz at most, if that; still impressive though).


The workloads I'm working with are mostly things like training deep neural nets, general video editing and encoding, plus moderate/heavy gaming and web browsing in my spare time. Which is why I've ultimately leaned towards just using the 6950X overclocked as high as I can get it, on the Rampage V Edition 10 with 64GB of high speed DDR4, instead of going for a dual socket C612 setup with two 10-12 core Xeons and 128GB of slower ~2133-2400MHz ECC Reg. memory.
Edited by DarkIdeals - 9/25/16 at 11:15pm
 
post #3146 of 7219
Quote:
Originally Posted by DarkIdeals View Post


Figure out what's wrong with your system yet?
Edited by Silent Scone - 9/26/16 at 12:06am
post #3147 of 7219
Quote:
Originally Posted by Silent Scone View Post

Figure out what's wrong with your system yet?


Nope thumbsdownsmileyanim.gif

I wanted to avoid buying a new block since it would go to waste when I eventually buy the monoblock; but since I have no damn clue when Amazon will decide to finally refund my money, and I won't get paid for another couple of weeks, I may have to buy one just to be able to use the system properly. Unfortunately there's no guarantee that a new block would fix anything either; it could very well be something else, like the pump.

I managed to get my temps a LITTLE lower, with the max core temp during Cinebench sitting around ~83-84C instead of ~87-88C like it was before; and sometimes after a cold boot the max will stay down around ~79C, but overall it's still a problem.

I took the block apart again and noticed that I had forgotten to put the insert back in, so that likely explains the ~4-5C drop in max temp; plus I scrubbed it again to try to clean out any more gunk I could find. I managed to clean off all but some oxidation on the fins, and I'm not sure whether that square inch or so of oxidation could actually be causing that big of a temperature problem. I'm thinking of trying a vinegar flush of the loop, but with a nickel block you're apparently not supposed to do that, so I'm at a loss as to what to do really. I can't backflush the loop since I only have one pump, either! Ugh...
 
post #3148 of 7219
Finally got my new RAM kit today: 4x8GB TridentZ 3200MHz CL14. Currently running at 1.38V and CR1. Seems like RAM doesn't do much for x265.
My score has now risen to 10.14 FPS.
post #3149 of 7219
Quote:
Originally Posted by Jpmboy View Post

ahhh - those 48 thread chips are just sick! drool.gif

How about 64? biggrin.gif


Quote:
Originally Posted by Martin778 View Post

Finally got my new RAM kit today: 4x8GB TridentZ 3200MHz CL14. Currently running at 1.38V and CR1. Seems like RAM doesn't do much for x265.
My score has now risen to 10.14 FPS.

Can you run memory higher than 3200? I've got Corsair Dominators rated at 3466, but my 6850K can't run them higher than 3200.
Edited by rioja - 9/26/16 at 2:54pm
post #3150 of 7219
3400MHz @ 14-14-14-14 @ 1.39V crashed within 2 minutes of RB. No problems with the XMP settings (3200 C14) though; it passed 1h of RB with no issues.

They don't seem to like CR1 though.