
Registered · 67 Posts · Discussion Starter #1
To be clear, the most intensive thing I do is play video games on my Acer XB270HU bprz (G-Sync, 144 Hz, 1440p).

System specifications:

i7 2600K @ 4.4 GHz
GTX 980 Ti @ 1.5 GHz
8 GB DDR3-1600 RAM
Samsung 850 Pro SSD
ASUS Maximus V Gene

Lately, I feel like my CPU is bottlenecking my GPU, but I don't really have any benchmarks to support that claim since there are almost no Skylake reviews that touch on 1440p gaming with an overclocked 2600K. I think I saw one YouTube video on it, but it was inconclusive because it seemed like Skylake would only benefit me by about 3%. I know the Digital Foundry video shows huge gains, but they tested at 1080p and I game at 1440p, where I assume the bottleneck would be my GPU. Anyone have any advice for me?
 

Indentified! On the Way!! · 2,928 Posts
I'm also curious, though I would not get a 5820k. Don't misunderstand, it's the proc I'm using now, but with Broadwell-E just around the corner, I'd go Skylake or Broadwell-E if I was coming from Sandy.
 

Premium Member · 20,153 Posts
With your current setup you can check yourself by using Afterburner to monitor usage. Uncheck other stuff you don't need, like CPU temp, during the test.

Other things you might have missed are . . . unlocking your cores, setting the Power options to High performance in Windows, and disabling any power savings in the BIOS.

One other thing is . . . overclock the CPU a bit more, to something like 4.7 GHz. Like you said, 1440p should put the load more on the GPU, BUT check with AB.

My i7 Sandy at 4.5 is just enough to push both my 290s (about equal to your GPU). Here is an example . . .



Now, when all your cores are pegged at 100%, the threads should do their part.
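
If you'd rather log the numbers than eyeball the overlay, here's a minimal sketch of the same check in Python. It assumes you have psutil installed and an NVIDIA card with nvidia-smi on the PATH; the 90%/95% thresholds are just a rough rule of thumb, not anything official.

Code:

# rough CPU-vs-GPU load logger (assumes psutil + nvidia-smi are available)
import subprocess

import psutil

def gpu_utilization():
    """Read overall GPU utilization (%) from nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"]
    )
    return float(out.decode().strip().splitlines()[0])

while True:
    # per-core CPU load averaged over the last second
    cores = psutil.cpu_percent(interval=1.0, percpu=True)
    gpu = gpu_utilization()
    hint = ""
    # rule of thumb: GPU well below full load while a core is pegged -> CPU-bound
    if gpu < 90 and max(cores) > 95:
        hint = "  <- looks CPU-bound"
    print(f"GPU {gpu:5.1f}%  CPU cores {cores}{hint}")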
 

Registered · 67 Posts · Discussion Starter #5
Quote:
Originally Posted by LancerVI View Post

I'm also curious, though I would not get a 5820k. Don't misunderstand, it's the proc I'm using now, but with Broadwell-E just around the corner, I'd go Skylake or Broadwell-E if I was coming from Sandy.
Yea, totally forgot about that...
Quote:
Originally Posted by rdr09 View Post

With your current setup you can check yourself by using Afterburner to monitor usage. Uncheck other stuff you don't need, like CPU temp, during the test.

Other things you might have missed are . . . unlocking your cores, setting the Power options to High performance in Windows, and disabling any power savings in the BIOS.

One other thing is . . . overclock the CPU a bit more, to something like 4.7 GHz. Like you said, 1440p should put the load more on the GPU, BUT check with AB.

My i7 Sandy at 4.5 is just enough to push both my 290s (about equal to your GPU). Here is an example . . .



Now, when all your cores are pegged at 100%, the threads should do their part.
Thanks, I'll keep that in mind when I hit some games next weekend.
 

Registered · 109 Posts
I upgraded from a 2500k to 6700k and am very happy with the results.
Quote:
Originally Posted by Wijkert View Post

I would like to report back on my experience with my new hardware. As I stated in my original post, the goal was to remove a CPU bottleneck. I am happy to say that my 6700K at 4.6 GHz is doing very well in games which had my 2500K struggling to keep up with my GPU. For example, when playing BO3 multiplayer or Assassin's Creed Unity (I actually like this game, even though after several patches it is still buggy), the CPU cores no longer go up to 100% utilization while my GPU drops below 99%. This is clearly noticeable in game, and I don't think it is because it now runs at a higher frame rate, but mostly because there is far less hitching of the kind previously caused by a CPU bottleneck. I realize I keep saying CPU bottleneck although the platform and the higher-speed RAM also changed and might play a part in running the games more smoothly.

Let's say that clock for clock the 6700K is about 25% faster than the 2500K (you guys can correct me if I am too far off); I would expect the 6700K cores to run at between 75-100% load in the same areas of AC Unity where my 2500K was running at 100% on all 4 cores. This is not what I am seeing. Instead it is more like 50-60%. I realize that hyperthreading is also a factor and AC Unity is using all 8 threads. And the new platform might also play a role, but I would not have predicted this big an increase in performance.
The above quote is from this thread.

The difference might be a little less for your system, because you game at a higher resolution (a CPU bottleneck is less likely) and your CPU is already hyperthreaded.

If you have any questions, please let me know.
 

Registered · 67 Posts · Discussion Starter #7
Quote:
Originally Posted by Wijkert View Post

I upgraded from a 2500k to 6700k and am very happy with the results.
The above quote is from this thread.

The difference might be a little less for your system, because you game at a higher resolution (a CPU bottleneck is less likely) and your CPU is already hyperthreaded.

If you have any questions, please let me know.
I assume you are on 1080p? I know for a fact that at 1080p the CPU bottlenecks the GPU, but my concern is whether it's also bottlenecking the GPU at 1440p. Because if not, I would like to just wait another year and jump in next cycle. Thanks for your input though, I appreciate it.
 

Registered · 51 Posts
Interesting. I also have an i7 2600K, but I'm considering moving to an i5 6600K simply because an i7 6700K + mobo would be outside my budget and it's not really necessary.

I'm thinking faster cores but fewer threads. Using Cinebench, I found the i5 6600K is around 40% faster than my i7 2600K per GHz per thread. I do a bit of encoding and rendering but mainly gaming, so HT isn't really needed.
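
If anyone wants to redo that per-clock comparison with their own runs, here's a rough sketch of the math; the Cinebench single-thread scores and clocks below are made-up placeholders, so plug in your own numbers.

Code:

# per-GHz comparison from single-thread Cinebench runs
# (scores and clocks are hypothetical placeholders - substitute your own results)
def score_per_ghz(single_thread_score, clock_ghz):
    return single_thread_score / clock_ghz

sandy = score_per_ghz(single_thread_score=140, clock_ghz=4.4)  # hypothetical 2600K run
sky = score_per_ghz(single_thread_score=195, clock_ghz=4.4)    # hypothetical 6600K run

print(f"Skylake advantage per GHz: {(sky / sandy - 1) * 100:.0f}%")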

Another option I'm considering is waiting for AMD Zen.

Sent from my m8 using Tapatalk
 

Registered · 67 Posts · Discussion Starter #9
Quote:
Originally Posted by fatboyslimerr View Post

Interesting. I also have an i7 2600K, but I'm considering moving to an i5 6600K simply because an i7 6700K + mobo would be outside my budget and it's not really necessary.

I'm thinking faster cores but fewer threads. Using Cinebench, I found the i5 6600K is around 40% faster than my i7 2600K per GHz per thread. I do a bit of encoding and rendering but mainly gaming, so HT isn't really needed.

Another option I'm considering is waiting for AMD Zen.

Sent from my m8 using Tapatalk
If you upgrade, definitely go for the i7 6700K, because newer games will inevitably be multithreaded since both of the consoles run 8-core Jaguar CPUs.
 

Registered · 109 Posts
Quote:
Originally Posted by b4db0y View Post

I assume you are on 1080p? I know for a fact that at 1080p the CPU bottlenecks the GPU, but my concern is whether it's also bottlenecking the GPU at 1440p. Because if not, I would like to just wait another year and jump in next cycle. Thanks for your input though, I appreciate it.
I play at 2560x1080 on a Z35, which has a maximum refresh rate of 200 Hz. When playing competitive games I lower the graphical settings to increase the framerate, which also increases the likelihood of a CPU bottleneck. I am pretty sure that most modern game engines spawn multiple threads and can use more than 4 cores/threads effectively.

As rdr09 mentioned, you can tell if your CPU is bottlenecking your GPU while gaming. The stutter caused by a CPU bottleneck is pretty pronounced and, at least for me, more annoying than stutter caused by frame rate fluctuations. I would advise playing with an overlay and, if you encounter stuttering, checking whether your GPU is not running at full load (which might indicate a CPU bottleneck).
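
If you log the overlay data instead of just watching it, the check amounts to something like this toy Python snippet; the 30 ms spike and 90% GPU thresholds, and the sample values, are made up purely for illustration.

Code:

# toy check on logged samples of (frame time in ms, GPU load in %)
# thresholds and the sample data are illustrative, not measured values
def looks_cpu_bound(frametime_ms, gpu_util, spike_ms=30.0, gpu_floor=90.0):
    """A frame-time spike while the GPU sits below full load hints at a CPU bottleneck."""
    return frametime_ms > spike_ms and gpu_util < gpu_floor

samples = [(6.9, 99), (7.1, 98), (41.0, 62), (7.0, 99)]  # fabricated example log
for i, (ft, gpu) in enumerate(samples):
    if looks_cpu_bound(ft, gpu):
        print(f"frame {i}: {ft:.1f} ms with the GPU at {gpu}% -> likely a CPU-bound hitch")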
 

RPG Gamer · 4,224 Posts
Upgrading your RAM to 2133 is a good way to help your CPU.
If you can, overclocking a little higher, to 4.6 GHz, can help too.

Even though the DF video you saw was gaming at 1080p, at 1440p it would still be much the same story.

I'd prefer 2133 as you can use 1.5 V, compared to 2400 needing 1.65 V.
 

Registered · 223 Posts
The 2600K is limited to PCIe 2.0 as well. While there isn't much of a difference between PCIe 2.0 x16 and 3.0 x16 performance, it is there.
 

Premium Member · 4,166 Posts
Quote:
Originally Posted by Kana Chan View Post

Better to just upgrade the RAM for now and do a full system upgrade later.
~$66 for 2x8 GB 2400 C11

http://www.eurogamer.net/articles/digitalfoundry-2016-is-it-finally-time-to-upgrade-your-core-i5-2500k

Up to 18% faster with 2133 MHz vs 1600.

http://wccftech.com/fallout-4-performance-heavily-influenced-by-ram-speed-according-to-report/

22%, but they used 2400 MHz vs 1600.
This.

You'll see some performance gains if you switch to a Skylake i7 (which is preferable to X99 if you're only looking for gaming performance), but you can get just as much by taking advantage of cheap DDR3 prices and extend the life of your Sandy through at least one more generation.
 

Registered · 801 Posts
On a 2600K and a 980 Ti at 1080p, I usually get about 50% or less CPU usage while playing Doom. It usually stays in the 20-30% range.
 

RPG Gamer · 4,224 Posts
Let us know how it all works out for you.
 

Registered · 51 Posts
I've always thought CAS latency was important for gaming. The MHz figure is simply the bandwidth, and I can't imagine bandwidth is the limiting factor.

I use 1600 MHz CL8 RAM on my i7 2600K for this reason.

Sent from my m8 using Tapatalk
 

Premium Member · 10,755 Posts
Bandwidth usually matters more for the games that benefit from RAM performance, but some of them like low latencies more than bandwidth.
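
As a rough back-of-the-envelope: first-word latency in nanoseconds is roughly CAS latency x 2000 / data rate (MT/s), while per-channel bandwidth scales with the data rate, which is why a faster kit with looser timings can end up at about the same real latency. A quick sketch:

Code:

# first-word latency (ns) ~= CAS * 2000 / data rate (MT/s); bandwidth = rate * 8 bytes per channel
def latency_ns(cas, data_rate_mts):
    return cas * 2000.0 / data_rate_mts

kits = {
    "DDR3-1600 CL8":  (8, 1600),
    "DDR3-2133 CL11": (11, 2133),
    "DDR3-2400 CL11": (11, 2400),
}
for name, (cas, rate) in kits.items():
    print(f"{name}: {latency_ns(cas, rate):.1f} ns, {rate * 8 / 1000:.1f} GB/s per channel")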
 