Quote:
Originally Posted by iRUSH View Post

What type of testing would you suggest? There are already a ton of synthetic tests being done.

I'm all for new testing methods ☺
F@H, BOINC, 3D rendering tests, or other software that uses the CPU fully.

At least these are a few I can think of.
 
Quote:
Originally Posted by Avant Garde View Post

Of course I will not play or work at 1080p; 1440p should be the new standard resolution. It's hard because I'm going with 1440p 100Hz, but for some odd reason there are only a few game benchmarks at this resolution. They are mainly at 1080p, where the 7700K is the clear winner.
If you are going with 1440p, there's little to no difference between the Ryzen 5 and Core i5 CPUs. They all seem to perform about the same at 1440p and above.

http://www.relaxedtech.com/reviews/amd/ryzen-5-1500x-1600x/3
http://www.relaxedtech.com/reviews/amd/ryzen-5-1500x-1600x/4
 
Quote:
Originally Posted by caswow View Post

People need to stop comparing strictly cores vs. cores. It doesn't make any sense at all. They can have 300 cores as long as it competes in perf/watt.
For you maybe, but personally, if my CPU uses twice the watts and is 10% better than the competition, I want that 10%. This is OCN, where watts are put aside for a fraction of extra performance.

Heck, maybe that's why we're all subconsciously using oversized power supplies.

If that's all that counts, then by your logic my HTPC would have the best CPU on the market: the i5-7300U. It's Skylake i7 single-core performance with 2 cores and 4 threads at a whopping 15W.
 
Quote:
Originally Posted by Duality92 View Post

For you maybe, but personally, if my CPU uses twice the watts and is 10% better than the competition, I want that 10%. This is OCN, where watts are put aside for a fraction of extra performance.

Heck, maybe that's why we're all subconsciously using oversized power supplies.

If that's all that counts, then by your logic my HTPC would have the best CPU on the market: the i5-7300U. It's Skylake i7 single-core performance with 2 cores and 4 threads at a whopping 15W.
Guess you haven't been around when people constantly complained about wattage. The crowd concerned about power consumption is way bigger than the OCN enthusiast crowd that cares about performance only. It's one of the biggest motivators behind all the new CPU tech, and probably also the biggest reason performance has come in small bumps.
 
Quote:
Originally Posted by dagget3450 View Post

Guess you haven't been around when people constantly complained about wattage. The crowd concerned about power consumption is way bigger than the OCN enthusiast crowd that cares about performance only. It's one of the biggest motivators behind all the new CPU tech, and probably also the biggest reason performance has come in small bumps.
But why? Power is like $0.06 per kWh here (CAD pricing). That means a 200W CPU running at full load 24/7 would cost about $8.64 per month.

Now, let's be realistic. Overclocked, these are pulling in the 150W range, so we're already down to $6.48 per month. If we assume a normal user games 4 hours a day (quite a bit more than it probably is), it becomes $1.08 per month, even less if you're not overclocking. A locked processor under the same conditions is going to draw maybe half that, so about $0.54 a month. We're only at $0.54 more a month then. What's the fuss about? Even if you're paying four times as much for electricity, a 150W processor only costs you about $2 more per month than a 75W one.

This is for the USA and Canada, as I don't know the prices elsewhere.

I'm honestly trying to understand.
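
For anyone who wants to poke at the math, here's a rough Python sketch of the calculation I'm doing (the wattages, hours, and the $0.06/kWh rate are just the example numbers above, nothing measured):

Code:
# Rough sketch of the monthly electricity cost math used above.
def monthly_cost(watts, hours_per_day, rate_per_kwh, days=30):
    kwh = watts / 1000 * hours_per_day * days  # energy used over the month
    return kwh * rate_per_kwh                  # cost at the given rate

print(monthly_cost(200, 24, 0.06))  # 200W at full load 24/7 -> 8.64
print(monthly_cost(150, 24, 0.06))  # 150W overclocked, 24/7 -> 6.48
print(monthly_cost(150, 4, 0.06))   # 150W, 4 hours a day    -> 1.08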
 
Quote:
Originally Posted by caswow View Post

23.74 ct/kWh is the cheapest I can get here, so it does matter how much my CPU burns.
That's still just $2 more a month if you compare a 150W processor to a 75W one...

150W, 4 hours a day, 30-day month, $0.2374 per kWh = $4.27 a month.
75W, 4 hours a day, 30-day month, $0.2374 per kWh = $2.14 a month.

Is saving $2.14 a month really worth the performance you're losing?

I'm looking for real facts as to why it actually matters. I'm not trying to start a flame war or anything; I'm genuinely looking for justifications because I fail to see them.

Edit: I think your GPU has a much greater influence here. If you're all that worried about power consumption, you'd be better off staying at 1080p 60Hz and using low graphics settings so you can run a low-power GPU, something like an underclocked 1050. That would mean you'd pull 60-70W from the GPU instead of the 200W+ high-end cards can pull, on top of what high-end monitors draw from the wall.

Cutting that 200W from your GPU/monitor, you'd save $6+ a month, three times as much as the CPU.
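
Same little sketch plugged into your rate (reusing the hypothetical monthly_cost function from my earlier post; 0.2374/kWh, 4 hours a day):

Code:
rate = 0.2374  # your cheapest rate, per kWh
print(monthly_cost(150, 4, rate))  # ~4.27 a month
print(monthly_cost(75, 4, rate))   # ~2.14 a month
print(monthly_cost(200, 4, rate))  # ~5.70 a month saved by dropping ~200W on the GPU side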
 
Quote:
Originally Posted by budgetgamer120 View Post

Do you run your CPU at 100%?
That's one of my points too: my example assumes a CPU at 100% load for 4 hours, but no CPU is at 100% load consistently while playing a game. The actual cost in my example above is much less, yet even with that exaggeration, I can't see any justifiable argument for prioritizing performance per watt.
 
Wattage/power consumption has been a staple argument of Intel advocates ever since they were ahead in that department. Now Ryzen is competitive in this regard and I rarely see it mentioned; instead it's been "the 7700K is still the best gaming CPU." I personally don't care about CPU wattage as much as performance. I'm just stating how it seems to have been playing out on forums like OCN.

If I get anything Ryzen, wattage will be my last concern as well. I'm waiting to see what they do at the workstation level, though, and that's probably end of Q2 knowing AMD.
 
While Intel chips still have the edge in terms of maximizing GPUs, Ryzen is just about equal everywhere else. For the majority of mobile x86 users, both processors are going to be equal. Intel is about to feel the burn in the mobile space. Forget all this Intel loyalty nonsense: with an ever-shrinking market, PC OEMs will give AMD more business if it means making more profit while providing a similar or even better experience.
 
Quote:
Originally Posted by AmericanLoco View Post

No you won't. 75W is nothing.
75W is roughly 1.3 incandescent light bulbs' worth of heat (none of it going to light). If you don't have an intuition for how much heat (and delta T) incandescent light bulbs put out, you probably don't live in a hot climate.

Q = 75 W * 1 hr = 270 kJ

Standard bedroom size: 12 ft x 12 ft x 8 ft -> 32.6 m^3; times air density 1.225 kg/m^3 = 39.9 kg of air

Specific heat of air: ~1.0 kJ/(kg K)

delta T = 270 kJ / (1 kJ/(kg K) * 39.9 kg) = 6.77 K

Of course it's not going to be nearly that high because of air circulation, but that's only for one hour of use and even a few degrees difference is going to feel like a massive increase in the summer heat.
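
If anyone wants to rerun it, here's the same back-of-the-envelope calc as a quick Python sketch (room size, air density, and specific heat are the assumed values above; it ignores walls, furniture, and any air exchange):

Code:
# Back-of-the-envelope: temperature rise from 75W dumped into a sealed bedroom for 1 hour.
FT3_TO_M3 = 0.0283168                      # cubic feet to cubic metres

room_m3 = 12 * 12 * 8 * FT3_TO_M3          # ~32.6 m^3 bedroom
air_mass_kg = room_m3 * 1.225              # ~39.9 kg of air at 1.225 kg/m^3
heat_kj = 75 * 3600 / 1000                 # 75 W for one hour -> 270 kJ
cp_air = 1.0                               # specific heat of air, kJ/(kg*K)

delta_t = heat_kj / (cp_air * air_mass_kg)
print(round(delta_t, 2))                   # ~6.8 K, matching the estimate above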
 
The 7600K beats the 6800K in all the SolidWorks, Catia, and Creo benchmarks!
 