
Still kinda lost · 3,342 Posts · Discussion Starter #1 (Edited)
I know it's subjective, but I feel that the people who got a 7700K and a 1080 Ti at release had the best (strictly gaming) future-proof builds, especially if it was a good clocker (5 GHz?) paired with a high-speed RAM kit,
considering how small the IPC gains have been over the past six years, how long the 10 series dominated, and how disappointing the 20 series GPU launch was.

So for anyone looking to future-proof, it's all about RNG luck. :p (An educated guess can help, but it rarely works in this crazy market.) The only given is that the next two years are going to be an open war, with AMD holding Intel by the balls and creeping toward Nvidia.


Just wrote this to hear your opinions and start a discussion.
 

Avid Memer · 5,940 Posts
The i7-7700K was objectively one of the worst buys the day it released. It was only negligibly better than the i7-6700K it replaced. Kaby Lake was one of the biggest wastes of time in the last decade. We would have been much better off if Intel had skipped Kaby Lake altogether and gone straight to Coffee Lake.

The GTX 1080 Ti was a bargain when it was around $700. The price eventually ballooned past $1,000. The initial Turing lineup was considered bad because the increase in traditional rasterization performance was underwhelming, so we were paying for RTX features that weren't well supported at the time. When Nvidia came out with the Super refresh and dropped the price of the RTX 2060, it was much easier to justify a Turing card. The RTX 2070 Super performs similarly to the GTX 1080 Ti at the $500 price point.
 

Bench, Dead, Squat · 421 Posts
I agree with @chessmyantidrug. I might be of a different mindset if I were able to upgrade my CPU without having to replace my $600 motherboard. Can't argue with the value of the 1080 Ti at launch, though. I bought mine at $999.99 CAD, and a year later the same card was worth more at retailers thanks to the crypto craze.
 

Avid Memer · 5,940 Posts
Hardware Unboxed didn't think the i7-7700K was a very good buy, either.

 

Still kinda lost · 3,342 Posts · Discussion Starter #5
I won't disagree with that point. In the end the 8700K was eight months away with slightly higher clocks and two extra cores, but a lot of people upgrade their PCs in cycles, so based on a lot of sigs on OCN and a lot of people I know, the 10 series with a 7600K or 7700K was a really popular combo (although they launched a few months apart), and the 7700K + 1080 Ti was the de facto high-end gaming build for a few months.

In the end, upgrading to a 1080 Ti while sticking with a 4770K, for example, and then getting the 8700K would've been a better deal all around (as it would survive even longer), but even in the HU video (which was released at the perfect time :D ) the 7700K still shows its strength in a lot of games, as we're only just starting to benefit from the extra cores.

My point, since it's not really that clear in the first post, is that for $1,500 at release those two parts would've made a killer PC that could easily survive until the release of the 3000 series plus 4th gen Ryzen or 11th gen Intel CPUs, whereas $1,500 today isn't as great of a value as it used to be.

So I feel that people who got those at release for pure gaming in 2017 would've watched a lot of disappointing releases over the following three years (with a lot of hype for whatever Ryzen is about to do to the market :D ).
 

Vandelay Industries · 1,937 Posts
Looking back on "future proofing" (something I don't necessarily think is a valid concept), I'd have to compare it to other examples. In this instance I would compare it to an i7-2600K and a 7970. Thinking about how long that combo was relevant, the i7-7700K and 1080 Ti are okay but not in the same league. I definitely agree future proofing is a gamble and not worth striving for.
 

Avid Memer · 5,940 Posts
The i7-7700K wasn't pointless because of how soon the i7-8700K came out; it was pointless because of how similar it was to the i7-6700K. It offered literally no performance increase. There was no reason for Kaby Lake to exist. Intel would have been better off skipping it and going straight to Coffee Lake.
 

Hardware Aficionado · 819 Posts
I'd agree the 1080 Ti has been the most future-proof / best long-term "bang for your buck" GPU of the past 10 years.

For CPU, I wouldn't go with anything on LGA 1151. That award would have to go to the original 6-core i7s (970, 980, 980X, 990X) and the X58 platform generally.
 

Overclock the World · 1,904 Posts
Looking back at those days,
I think the 6700K to 7700K was about as much of an upgrade as the 3600 to a 3600XT is today.
Sadly I can't remember the prices back then, but I think it's comparable, including B550 prices, to today's scenery.

It should be quite comparable even in gaming performance, where only with fine-tuned memory do CPUs like the 3300X make sense and deliver very similar results.
Considering Zen 1 was out back then, with a fine-tuned 1700X having around the same single-core performance as a 4770K, it's questionable.
I don't think it was a bad CPU at all at the time; LGA1151, like Zen, was a big milestone, at least for memory OCers.
People back then, and even today, could educate themselves a lot on that part, while I think memory OC on Zen at least showed big differences in results,
even when we have to take into account the architectural bottleneck of >72 ns memory latency for two whole generations.

Questionable really, but I think both CPUs were great for learning use cases:
Intel on the core OC side of things and timings beyond 4000 MT/s,
AMD on the importance of signal integrity and combining low voltage with combating EMI. :)

For gaming, the 7700K was a good option, although today it's pretty much beaten by anything from AMD's 3rd gen.
For productivity and media encoding, the 1920X/1950X was a very valuable learning experience.
 

Registered · 2,722 Posts
I can't speak for GPUs since I'm sort of out of the loop on the last few years of those.

For CPUs, I'd say you'd have to give that to the second or third generation Core i5s and i7s (the 2500K or 2600K especially, but also the 3570K or 3770K, etc.). While there were some IPC and clock speed improvements to be had after that time, they were small and incremental, which was a departure from the bigger jumps seen beforehand. Intel also got really stingy about adding cores after that point (unless you went HEDT, and even then the first-generation Core i7 already had 6-core offerings), so Sandy Bridge/Ivy Bridge represents the beginning of a plateau of sorts in my mind. Those are therefore the CPUs that I believe offered the best value and performance longevity in recent times, no question. The newer quad-core CPUs are at the other end of that spectrum, and I think they won't end up being quite as good because they won't enjoy the same longevity to such an extent.

I think the current and upcoming 8-core/16-thread CPUs are the most likely to be the next ones to enjoy longevity like that, barring some breakthrough in IPC and/or clock speed within the next couple of generations (I'm not holding my breath). But really, hardware, or at least CPUs in general, is often just lasting longer, and that becomes more likely the more cores you have (though it's still important not to get excessive and buy what you don't need, or the value proposition goes down).

Edit: Another factor I just thought of, which is probably overlooked but actually very important, is the pricing of platforms and especially RAM. X58 wasn't the cheapest platform at the time, and it also arrived when DDR3 was new and more expensive (furthered by needing 50% more modules for triple channel). Sandy Bridge was enough of an improvement over the prior generation in most aspects, with decreased power draw and heat (I think; my memory might be failing me on this one, but I recall Nehalem being a NetBurst 2.0 of sorts when it came to heat, at least compared to the cool-running Core 2 Duos of the time), but certainly in performance with increased IPC, clock speed, and overclocking headroom. And it landed around the time DDR3 became incredibly cheap, so much so that 16 GB of DDR4 was barely, if at all, cheaper than 16 GB of DDR3 nearly ten years later! That's crazy. Sandy Bridge really was just the perfect storm of timing and everything else.
 

Avid Memer · 5,940 Posts
There's no argument that can be made for the i7-7700K. It was an i7-6700K on a more refined node, so better clocks out of the box. Performance between the two was indistinguishable. There was no reason for the i7-7700K to exist.

I have a hard time calling a $700+ GPU the best bang for the buck. The R9 290X was available for under $300 at times, which was a much better value. The best value among GPUs is usually around the $300 price point. Even among Pascal cards, the GTX 1070 represented a better value.
 