
1 - 20 of 22 Posts

Registered · 12 Posts · Discussion Starter · #1
Pretty much as the title says. My old tower served me well for nearly 6 years, and with a bit of love and care it kept up with modern games too. But puff, a bit of blue smoke, and now it's gone.
And here's my problem: I built the old one surrounded by college friends (Computer Science), but I dropped out like the dumbass I am and didn't keep up with hardware.
So now, from my laptop, I humbly ask for help finding and putting together a new tower for gaming and media. Some specs follow:

I use two monitors, and I really mean use them. I could be playing the latest game on highest settings on one while watching an HD movie on the other. I don't want either one to lag, really =3
For the GPU, I'd prefer to look at AMD. While Nvidia served me well in my last rig, all 3 'big' consoles at the moment are rocking AMD GPUs. So in a few years, games released for PC plus at least 1 other console will probably be better optimized for AMD.
Hard drives: at least 1 SSD (Windows is still a must, but I'd like to experiment with SteamOS, so either one SSD partitioned or 2 separate ones) and 2 other drives, for games and media respectively. My old hard drives survived, but they were bad in the first place; I bought them for cheap capacity at the time, so I might recycle them, and whatever else I can salvage from the old tower, into a home server later.

Last point: money. In about 3 weeks I will have £1500 to spend on this. That's just the tower though - I have monitors and other peripherals that were upgraded and that I'd like to keep. For the EU, that's around €1800; for the USA that would be around $2500, but considering hardware prices here are more expensive, treat it more like $2100.

And now I'd like to thank you for reading. Looking forward to reading your posts =3

Also, feel free to ask any questions. Cheers!
 

Registered · 3,597 Posts
Well, I understand your reason for wanting AMD, but for gaming all the AMD GPUs are being sold out or marked up because of miners. For example, the R9 290's manufacturer retail price is $399.99, but it might be hard to find one for less than $550. I would go with Nvidia for now.

SSD = Samsung

CPU: an i5 is probably enough, but with your budget you might as well do an i7.
 

Registered · 12 Posts · Discussion Starter · #3
Price inflation might be different here compared to the US; a quick search gets me an Asus R9 290 4GB for £300, which would be comparable to $400 considering, as I said, all components are pricier here.
And I would love to go for an i7, even if that puts me over budget. I don't mind borrowing, as long as it's not too much; I'll be due a decent tax rebate just after May anyway.
 

Registered · 1,166 Posts
I don't see the point of spending more money on an i7 over an i5 if you're not going to use the extra features. Hyperthreading is nice, the little extra cache is nice, but that's all you're going to get for a $100 premium. I don't like paying more just for a name but that's just me. Hyperthreading is good for well threaded applications like video encoding and stuff like that. For gaming? Almost irrelevant except for a few titles.

Extra money is always better spent on extra GPU power (780 Ti over an R9 290), better PSUs (modular over non-modular), better cases (windows, extra features, etc). A good monitor is also a good place to invest money in (120 Hz over 60 Hz, etc), quieter/better case fans; even a custom water loop might be better.

If you're in the UK, an R9 290 is an excellent buy, if you're in the USA you're really better off buying an Nvidia GPU. I recommend the Tri-X R9 290 because it's pretty much the best cooler on air that can be put on a 290. However it's more expensive than other 290s, up to you if the price is a problem or not. I would rather spend extra money on getting a better and quieter cooler for my 290 than getting an i7 over an i5, for example.

You should also say what resolution your two monitors are, their refresh rate, etc. If they're two cheap VGA 1080p monitors, it might be better to spend money on 120 Hz 1080p monitors, or maybe even a single huge 1440p.

This is where I would start at: http://uk.pcpartpicker.com/p/33Skc

It doesn't even break £1000.

Different possibilities include getting a beefier, modular PSU to be able to crossfire 290s down the road or something.
 

Registered · 12 Posts · Discussion Starter · #5
At the moment it is indeed 2x 1080p 60 Hz monitors, the last peripherals to upgrade (will be upgraded May or June), so for now just 1 GPU will do (will go for the R9 290 Tri-X most likely).
Also, if hyperthreading makes little to no difference, will I be better off just going with the AMD FX 9590 for the CPU?
 

Registered · 3,597 Posts
Quote:
Originally Posted by DrMetal View Post

At the moment it is indeed 2x 1080p 60 Hz monitors, the last peripherals to upgrade (will be upgraded May or June), so for now just 1 GPU will do (will go for the R9 290 Tri-X most likely).
Also, if hyperthreading makes little to no difference, will I be better off just going with the AMD FX 9590 for the CPU?
People throw $1000 at GPUs and then try to save $100 on a CPU; that doesn't make sense to me. When I build a system that will be used for some years, I get the best motherboard and CPU I can and change GPUs whenever I want.

I have a 3930K. Yes, an expensive CPU, but it was cheaper than my GPU. I have changed my GPUs three times since I bought my CPU/board combo, and I still see no reason to change the CPU and motherboard for maybe another 2 years; I will probably go through one or two more GPUs before this board is retired.

I think spending $300-plus on a CPU with the budget you have is not a problem. If you had a smaller budget and no way to get a good card at all unless you cut the cost of the CPU to make room for the GPU, then that's the only case where I see it being worth saving $100 on a CPU. The CPU is the central part of the entire PC.
 

Premium Member · 4,265 Posts
If you're playing games on one monitor and want to have programs open on the other (like a movie playing) then hyper-threading IS potentially useful to you, EVEN in games that don't use more than 4 cores.

Do you intend to overclock this machine? (CPU? GPU?)
 

Registered · 12 Posts · Discussion Starter · #8
Quote:
Originally Posted by mdocod View Post

If you're playing games on one monitor and want to have programs open on the other (like a movie playing) then hyper-threading IS potentially useful to you, EVEN in games that don't use more than 4 cores.

Do you intend to overclock this machine? (CPU? GPU?)
No intention to overclock, not for a while; the options I am looking at at the moment are an i7 or the FX 9590.
A bigger question arises, though: just how much sway do consoles have over game development? Checking quickly, it looks like both are rocking AMD CPUs as well as GPUs, and I have a few friends who are adamant that this will be the biggest factor in how many high-end games perform on PC systems a few years from now.
 

Registered · 1,166 Posts
Quote:
Originally Posted by UNOE View Post

*snip*

People throw $1000 at GPUs and then try to save $100 on a CPU; that doesn't make sense to me. When I build a system that will be used for some years, I get the best motherboard and CPU I can and change GPUs whenever I want.

I have a 3930K. Yes, an expensive CPU, but it was cheaper than my GPU. I have changed my GPUs three times since I bought my CPU/board combo, and I still see no reason to change the CPU and motherboard for maybe another 2 years; I will probably go through one or two more GPUs before this board is retired.

I think spending $300-plus on a CPU with the budget you have is not a problem. If you had a smaller budget and no way to get a good card at all unless you cut the cost of the CPU to make room for the GPU, then that's the only case where I see it being worth saving $100 on a CPU. The CPU is the central part of the entire PC.
Yeah, except that at the moment our friend is only using a single graphics card. Some people won't mind throwing thousands at their GPUs; I'm not among those people, so my advice reflects that.

Your processor is indeed beefy, but it costs $550, and it's got 6 cores, a soldered IHS and LGA 2011 (which means the motherboard is also going to cost $300+). Unfortunately, it doesn't boast the great IPC that newer Haswells have, which helps in poorly optimized games like SC2. An overclocked 4670K will run SC2 better than an overclocked 3930K, because SC2 is poorly written.

I don't see how you can justify getting an Intel hexacore when a current-generation quad-core, for a gaming rig, does just as well for way less money (this is especially true if you take motherboard costs into account). I'm all for buying great hardware, but I don't like wasting extra money just because it's there to be spent.

Unless our friend has the intention of streaming games at 1080p48 or something. In which case, sure, you'll need a beefier processor.

That's why I don't like i7 over i5 for a gaming rig. It's a $100 premium just for hyperthreading and some cache. It's seriously just that. Not to mention that an i7 will run hotter than an i5 with hyperthreading on. I'm also somewhat skeptical as to hyperthreading helping with multiple monitors, though if given a source that says otherwise, I'll eat my words.

Overclocking with a budget such as this one is pretty much recommended; above a certain budget you really do get a performance boost from overclocking things, not to be neglected at all. I wouldn't recommend AMD CPUs for a gaming rig. I could be wrong, check out some benchmarks, but I'm pretty sure that an overclocked 4670K blows any AMD out of the water in terms of single-thread performance, which is what you want for games. That includes poorly optimized games like those on the Source engine.

Well, whatever; this is just my advice. I don't like spending money without reason, so I don't like recommending anything more than an i5, but that's just me. It's not like a 4670K with the overclocking equipment to go with it is "budget" either way. A single R9 290 should be enough for two monitors, and if not, an i5 can surely handle two of them in Crossfire.
 

Premium Member · 4,265 Posts
Hyper-threading doesn't help with multiple monitors; it helps with what people RUN on multiple monitors. Hyper-threading increases average parallel compute throughput. It's [partly] the reason an i3-4130 can trade blows with an i5-2400S in parallel workloads. It's also the reason that some people get better results with an FX-8350 than with a similarly priced i5. If I'm gaming on one monitor and want to keep stock tickers and a movie playing on the other, then I may have a way to leverage the additional parallelism afforded by hyper-threading.

Once you get past the FX-6300 (which represents the entry-level tuner/gaming performance platform) or the i3-4130 (a chip that delivers 65% of the performance of an i5-4670 for half the price), every rung of the performance ladder is potentially expensive and could be rationalized as poor value. (Note: you can buy the FX-6300 plus a very well made motherboard for less than the cost of the i5-4670K, so they are separated by well spaced rungs on the cost ladder.) Considering the performance advantages of the 4670K over the FX-6300, I could just as easily make a statement to the effect that it "isn't worth the $100 premium for so little", just like the i7-over-i5 consideration. One could very reasonably and legitimately stop at EVERY imaginable rung of the ladder and make the case that it isn't worth another $100+ to step up one more. As you go up the ladder, the rungs get further apart, and the performance gains get smaller and smaller for desktop machines. The i5-4670K costs 100% more than an FX-6300 yet will struggle to achieve 50% better performance; the i7-4770K costs half again as much as an i5, yet will rarely return more than 30% better performance; the 4930K is *almost* double the price of the 4770K and, again, will struggle to deliver 30% performance improvements.

Once you get out of the material economics in the basement, where you're paying for the packaging, the raw materials and the convenience of being able to purchase it at all, more than the actual performance (like the Celerons/A4/A6, most junker GPUs, etc), and work your way up the desktop consumer ladder, you are always "approaching" the transition zone to an entirely different hardware economy: compute density. The closer you get to this separate "zone" of computer hardware economics, the less performance/$ you will get for a desktop machine. In the compute density economy, $1000-2000 CPUs are the norm, because they don't have to compete with each other directly, but with what it would cost to IMPLEMENT each other (more machines vs fewer machines, more space vs less space, overall compute efficiency and density, etc). The $2000 server CPU that is only 15% faster than the $1000 CPU in the compute density economy is actually competing with what it would cost to implement the additional slower CPUs, ALL THINGS CONSIDERED (including the real estate).

If you look at a machine as the price of the individual components, i5 vs i7 etc, it's easy to rationalize any stopping point. However, if you step back and look at the machine *as a whole* (the same way the compute density economics work), those $100 "rungs" on the ladder are often only going to represent a 5-15% or so increase in the entire cost of the machine. A 10% cost increase, for up to 30% performance increases (depending on workload), and suddenly the i7 vs i5 has been re-legitimized. The cost of the monitor/keyboard/mouse/speakers/motherboard/PSU/chassis/HD/SSD/ODD/RAM/OS (unless Linux) can't be avoided. You're going to be "out" that money one way or another to build a rig. At that point, with $600-1000+ already invested, there are a lot of legitimate reasons to consider "getting the most" out of that semi-static investment by gracing that useless stack of stuff with a decent CPU and GPU, turning it into a computer.
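To put that whole-machine framing in rough numbers, here's a toy Python sketch (all prices are made-up round figures for illustration, not quotes from this thread or any retailer):

```python
# Toy illustration of the "machine as a whole" framing.
# All prices are hypothetical round numbers, not real quotes.

rest_of_rig = 900                     # case, PSU, board, RAM, storage, OS, etc.
cpu_prices = {"i5": 220, "i7": 320}   # illustrative street prices

i5_total = rest_of_rig + cpu_prices["i5"]
i7_total = rest_of_rig + cpu_prices["i7"]

# The same $100 premium, viewed two ways:
cpu_premium = (cpu_prices["i7"] - cpu_prices["i5"]) / cpu_prices["i5"]
rig_premium = (i7_total - i5_total) / i5_total

print(f"CPU viewed alone:    +{cpu_premium:.0%}")   # +45%
print(f"Whole rig viewed:    +{rig_premium:.0%}")   # +9%
```

Same $100 either way, but it reads as a 45% CPU premium in isolation and as under 10% once the unavoidable rest of the rig is counted in.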

Perspective changes everything. Furthermore, for many, there is *value* in the novelty of owning particularly nice things regardless of their performance/$ ratio.

DrMetal,

The hardware configuration of the new game consoles isn't really "available" on the desktop right now, except in Kaveri (the closest similar architecture), which represents a fraction of the GPU configuration of the console design. Any decent gaming rig built today is not going to accept any sort of "direct port" that utilizes hardware the same way it works on the console anyway. AM3+ has been relegated to legacy status, and certainly does not represent AMD's "platform of the future" that will narrow the gap from consoles to PCs (easier porting, etc). If that's what you are after, then I would suggest building a high-value rig for now, then re-checking the hardware landscape in 2 years to see what has happened.
 

Registered · 4,919 Posts
Don't skimp on the PSU, and I don't mean power but quality.


Seasonic is my favourite, but there are other good brands out there, and I am sure the PSU gurus will help you.

Then decide on the graphics card you really want and work backwards for the rest.


Why do I say backwards? Because it gives you a better idea of what's left and whether it fits the bill - i.e., an SSD over a better card, and so on!

AMD vs Intel - well, that is no man's land at the moment; I'm not sure everything AMD is going for is going to be on the table, working 100%, any time soon.


Only you can decide on that - but I'm sure you will be satisfied with either.

The FX 9590 is £216 at the mo at OCUK - but mind what others say: the FX 9590 is just an overclocked FX 8350, and the latter can be bought much cheaper. It's a lottery.
 

Registered · 1,166 Posts
Quote:
Originally Posted by mdocod View Post

*snip*
Those are some very interesting points. Never thought about it that way.

Nonetheless, correct me if I'm wrong, but I don't believe hyperthreading actually helps with running multiple poorly optimized programs at once. Rather, hyperthreading helps a lot when it comes to very well optimized tasks: encoding (video, sound, streaming). Assuming our friend wants to stream at high resolution/quality then, sure, the i7 pays for itself. But if he's going to turn off hyperthreading to get a higher overclock out of the i7, since hyperthreading produces extra heat, it's kind of silly to get an i7 over an i5.

$100, even if it's less than 10% of the total system cost, still isn't nothing. $100 is the difference between an H81 and a good Z87. It's also pretty much a free PSU (with some money left over). I also think it's wrong to say that an i7 performs 30% better than an i5; the situations where that's true are pretty limited, especially for a gaming rig. Call me stingy, but knocking 10% off the total system cost seems like a pretty damn nice deal to me. Even 5% is good. There are only a handful of tasks where the i7 beats the i5 hands down, and if I'm not going to be doing those tasks on a regular basis, I'll just keep my money and accept slightly lower performance whenever I do.

Well, at this point I'm just playing devil's advocate, really. Thank you for the big post; no matter what you think is best, there are some interesting ideas in there I hadn't thought of before.
 

Registered · 1,533 Posts
Instead of arguing, why not price out different options?


Just my 2 cents, haha. I'm going to sleep now; it's GMT-10 here, so it's 2am.

For budget gaming go for the 6300/6350 - boards are cheaper now too - and spend your money on a nicer GPU and power supply, plus all the other stuff like buying Windows, which will cost $100.
 

Premium Member · 4,265 Posts
Quote:
Originally Posted by incog View Post

Those are some very interesting points. Never thought about it that way.
I've had a lot of time to think about it. I've been infatuated with computer hardware, comparing and contrasting it, since I was about 12 years old (I'll be turning 30 in April).
Quote:
Nonetheless, correct me if I'm wrong, but I don't believe hyperthreading actually helps with running multiple poorly optimized programs at once. Rather, hyperthreading helps a lot when it comes to very well optimized tasks: encoding (video, sound, streaming). Assuming our friend wants to stream at high resolution/quality then, sure, the i7 pays for itself. But if he's going to turn off hyperthreading to get a higher overclock out of the i7, since hyperthreading produces extra heat, it's kind of silly to get an i7 over an i5.
Hyper-threading is often misunderstood. In simplest terms, it can be summed up as follows: the ability to execute an integer instruction while simultaneously performing a floating point calculation.
It means that rather than having to choose between the two operations on each "cycle" (like a Pentium or i5 does), it can do both at the same time, within the same core. This improves IPC and compute efficiency when leveraged (improved saturation minimizes losses; note: an i7 under a fully saturated workload will only use about 15% more power than an i5, while producing 30% higher compute throughput).

When hyper-threading first came out, leveraging it was more difficult, and the implementation had many flaws. Software wasn't compiled for it, and the OS's scheduling was garbage. Over a decade of refinements to the technology and software-side changes (both in the OSs and in the way that software is compiled to take advantage of it) have led to a very mature technology that can offer performance scaling in almost any mixed workload that has spawned enough threads (whether from different programs or not). Any mixed parallel workload (integer/FPU) can theoretically scale into hyper-threading these days. Hyper-threading does indeed improve performance when a computer is multi-tasking. If this were not the case, they wouldn't bother with it at the server level, where the workload is often hundreds of separate services running. Only a handful of entry-level Xeons (the cheapest in each class) have hyper-threading disabled.
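As a toy sketch of that "mixed parallel workload" idea in Python (the task functions are made up for illustration; and note CPython's GIL means threads here mainly illustrate spawning and scheduling across logical cores, not a true compute speedup):

```python
from concurrent.futures import ThreadPoolExecutor

# Heterogeneous integer-heavy and float-heavy tasks spawned as separate
# threads. The OS scheduler is free to spread threads like these across
# all logical cores, hyper-threads included.

def int_task(n: int) -> int:
    # integer work: sum of the first n integers
    return sum(range(n))

def fp_task(n: int) -> float:
    # floating point work: crude Leibniz approximation of pi
    return 4.0 * sum((-1.0) ** k / (2 * k + 1) for k in range(n))

with ThreadPoolExecutor(max_workers=8) as pool:
    int_results = list(pool.map(int_task, [10_000] * 4))
    fp_results = list(pool.map(fp_task, [10_000] * 4))

print(int_results[0])           # 49995000
print(round(fp_results[0], 2))  # 3.14
```

A compiled language (or Python's `ProcessPoolExecutor`) would let the two task types genuinely run in parallel and exercise each core's integer and FP units at once.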

If you want to see how well hyper-threading scales these days when the workload is parallel enough, look at the i3-4130 in gaming benchmarks (since almost all games leverage up to 3-4+ threads these days). Effectively an i7 that has been cut in half, it is able to keep up with the i5 better than its "2-core" class might suggest. It is hyper-threading that closes the gap. In this particular comparison, the i3 is placed in workloads that are forced to scale into hyper-threading wherever they can because there is nowhere else to go. Often we see minimal scaling from the i5 to the i7 in these same games because the i5 has already offered all the parallelism that the game engine can leverage. Appreciating the i7 in this case demands that we add additional workload (like a second monitor with other apps running).

The only time hyper-threading increases thermal dissipation is when it is being utilized. And when it is being utilized, it will always return more performance scaling than the additional overclock that would be afforded with it disabled. As I already pointed out, hyper-threading under full saturation can improve throughput up to 30% at a respectable 15% increase in power consumption. The same increase in power consumption applied towards an overclock would buy approximately a 5% clock speed improvement; increasing clock speeds by 30% would require something like a 60% increase in power consumption. If the workload in question doesn't have a way to leverage hyper-threading, then having it on or off should have very little effect on thermals at max overclocks, because it is going unused anyway. Turning it off would just be a way to "optimize" an overclock for a specific workload (getting that extra 5% overclock for the desired workload while preventing the chip from overheating under an unexpected parallel workload, or one that the user does not care to optimize for anyway).
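That power arithmetic can be sanity-checked with a back-of-envelope model. Assuming dynamic power scales roughly with the square of clock speed (a first-order assumption; the real exponent depends on the chip's voltage/frequency curve, so the 5% and 60% figures quoted above won't be matched exactly):

```python
# Back-of-envelope check of the HT-vs-overclock trade-off,
# assuming power ~ frequency**2 (a rough first-order model).

HT_GAIN = 0.30   # up to ~30% extra throughput with HT saturated
HT_POWER = 0.15  # at ~15% extra power draw
EXP = 2.0        # assumed power/frequency exponent

# Clock speed the same 15% power budget would buy instead:
clock_for_same_power = (1 + HT_POWER) ** (1 / EXP) - 1

# Power a 30% clock increase would cost instead:
power_for_same_gain = (1 + HT_GAIN) ** EXP - 1

print(f"same power as HT -> ~{clock_for_same_power:.0%} more clock")
print(f"same gain as HT  -> ~{power_for_same_gain:.0%} more power")
```

Under this rough model the 15% power budget buys only ~7% clock, and matching HT's 30% throughput via clock alone costs ~69% more power - the same ballpark as the figures above.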
Quote:
$100, even if it's less than 10% of the total system cost, still isn't nothing. $100 is the difference between an H81 and a good Z87. It's also pretty much a free PSU (with some money left over). I also think it's wrong to say that an i7 performs 30% better than an i5.
I believe you may have inadvertently taken me out of context. I said "up to 30%".

Interestingly enough, I was trying to be as conservative as I could with the 30% number to prevent this. There are some workloads where the scaling is even better due to a combination of hyper-threading AND the larger, faster cache. Either way, I probably should have placed greater emphasis on the "up to" part. (I'll try to remember to put such a point in bold in the future.)

The ~$220-260 E3-1230V2/V3 series chips on a B85 board can offer performance competitive with an i5 "K" on a Z board for less money, if the workloads are parallel enough - especially for users who are questioning whether they want to overclock or support multiple GPUs (this thread may apply). The E3 can offer that "i7"-class performance to people who don't need an iGPU. An E3 + B85 may afford the opportunity to buy into more GPU, or an SSD.
Quote:
The situations where that's true are pretty limited, especially for a gaming rig. Call me stingy, but knocking off 10% on the total system cost seems like a pretty damn nice deal to me. Even 5% is good. There are only a handful of tasks where the i7 beats the i5 hands down and if I'm not going to be doing those tasks on a regular basis, I'll just keep my money and accept slightly lower performance whenever I will do those tasks.
The neat thing about it is that if you go through a process of rationalizing the i5, and then actually buy the i5, you will have achieved a harmony with your rationalization that has value in and of itself. If someone else can rationalize the i7, or the E3, or the FX chip, and purchases it, then they will have achieved the same fulfillment that you have in your purchase decision.

I'm also a cheap-A$$ (ask my wife), which is why I have a $110 CPU, a $50 mobo, and a $50 HSF (less combined cost than an i5). I was able to rationalize that this would be the best value for me, and I love it. The novelty of overclocking it to the same performance as an E3-1230V2 in parallel workloads (a $226 chip) has been extraordinarily fun and rewarding.
Quote:
Well, at this point I'm just playing the devil's advocate really. Thank you for the big post, no matter what you think is best, there are some interesting ideas in there I hadn't thought of before.
I can appreciate this! Thank You!
 

Registered · 12 Posts · Discussion Starter · #17
Thanks for the nice post, mdocod.
If I understand your post correctly, it's worth investing in a 4770K or 4930K for hyperthreading given the way I am going to use the system.
Heat should not be a problem; I am looking into investing in a custom water cooling loop or a few (I'll need to find out, once all the hardware is confirmed, what exactly I'll need to loop the CPU and eventually 2 GPUs - no point worrying about that yet).
Just looking at it, it will have to be the slowest of them, the 4930K, but I should be able to afford it; I might have to borrow some money, but as I said, I don't mind that much for a good investment.
The question now becomes what motherboard and memory to go with it. On the motherboard, I'll agree it is one of the 2 components I do not want to cut any corners on (the other is the PSU). But the motherboard is probably the hardest component to understand (in my opinion at least). As for memory, at the moment my plan is a Corsair 8GB 1866MHz Vengeance Low Profile kit (twice, obviously), but I'm open to anything more optimal.
 

Registered · 12 Posts · Discussion Starter · #18
Quote:
Originally Posted by SanguineDrone View Post

$100 is a lot of hotdogs.
Those hotdogs are a lot more important than the hyperthreading.
That's how I feel about beer; that's why after building this PC I will go and live with my friend for a week. Mooch food, beer and utility bills? Why not.
If worst comes to worst on the budget, I always have the option of selling my motorbike (and associated gear) for quick cash. It was a silly thing to buy, but it was necessary at the time, and I don't use it anymore.
Quote:
Originally Posted by mdocod View Post

Hyper-threading is often misunderstood. In simplest terms, it can be summed up as follows: the ability to execute an integer instruction while simultaneously performing a floating point calculation.
Hey! I remember learning about that in the CPU Architecture module in my second year. Being a dropout and never hearing about it again didn't improve my memory of it =3
The question about a mobo for the 4930K still stands though.
 

Premium Member · 4,265 Posts
I like the Asus X79 Deluxe and Crucial Ballistix Tactical 1.35V (the yellow sort of matches the board, and it's great quality RAM; low voltage and tight timings mean all sorts of overclocking headroom).

The 2011 socket platform is largely underutilized in gaming workloads. In the same way that the move from the i5 to the i7 on the 1150 socket platform has questionable scaling (though I think you might appreciate it), scaling up to a hex-core on the 2011 socket is even less return on investment. At that range, for a gaming build it is mostly about the novelty and fun of owning the behemoth flagship workstation/server platform. Having 14 SATA ports and over 50GB/s memory bandwidth becomes mostly just about having excess for the sake of having excess (Which can be fun, especially if you like to tinker, performance tune, or compete in performance tuning/benching etc).

Ask anyone who bought into X58 back in its prime, and most will tell you that it was:

A: A lot of money, that could have been spread out into incremental upgrades on the regular consumer platform over the years instead.
B: A really fun platform to own and tinker with.
C: Still so powerful that there's no real compelling reason to upgrade, even 5 years later.
 

Registered · 12 Posts · Discussion Starter · #20
Options B and C sound fun to me.
And there's nothing wrong with excess for the sake of excess; that's one of the principles that makes capitalism work, after all.
Besides, if I have extra capability, it's not like there's nothing to do with extra processing power:
streaming/Skype, an HD movie and a game playing, all at the same time. And if the setup is indeed that excessive, I don't have to shut down the antivirus and all the little visual OS UIs just to free up a little extra processing power.
Also, colour coordination doesn't matter much to me. For the water cooling I'm planning to get as many different colours of tubing as I can, with the same liberal attitude applied to any other lighting I end up placing inside it =3
 