FWIW, my upgrade path was...
Intel Celeron 466MHz
Intel Celeron 533MHz
AMD Athlon XP 2600+
AMD Mobile Athlon XP 3200+
AMD Opteron 144
AMD Opteron 165
Intel Core 2 Quad Q9650
Intel Core i7-3930K
Intel Core i7-5960X
AMD Ryzen Threadripper 1950X
I'm not a fanboy of either manufacturer. That said, the 5960X @ 4.7GHz is about on par with, if not a smidgen faster than, the TR 1950X @ 4.15GHz in single-threaded loads. TR obviously pulls ahead in multithreaded apps that can properly use all the threads, but I've hit hiccups where too many threads don't work well. One example was OBS H.264 encoding: unless I change the setting to limit it to 8 threads or fewer, it drops tons of frames and seemingly can't keep up with encoding. It should be able to encode 1440p60 @ medium with ease, but for whatever reason thread count has a huge impact, and more is not better. That's only one instance, though.
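For anyone hitting the same thing, the cap can be set right in OBS via the encoder's custom x264 options field (this is a standard x264 parameter; the exact field name varies a bit between OBS versions):

```
threads=8
```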
There's also the issue of UMA vs NUMA. While UMA can be quicker for apps that like wide memory access, it comes with a latency hit, which some games like Arma 3 really hate. I'm talking to the degree of 70FPS on NUMA but 45FPS on UMA. I haven't yet used one of the new high-core-count Intels with their new mesh L3 cache, but since it's all one contiguous die with no modules, it shouldn't suffer from thread breakpoints like TR does. It also has wide memory access that maintains low latency, but then you're really digging into your wallet. TR has great multithreaded performance per $.
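The NUMA mode basically keeps latency-sensitive work on cores next to their local memory. On Linux you can approximate that idea by hand; a minimal sketch, assuming cores 0-7 belong to node 0 (this only pins CPUs, not memory pages, which would need numactl/libnuma; Ryzen Master's Local mode does the real thing at the firmware level):

```python
import os

# Hypothetical node-0 layout: assume its cores are 0-7.
# Clamp to the actual core count so this runs anywhere.
node0_cpus = {c for c in range(8) if c < (os.cpu_count() or 1)}

# Pin this process (pid 0 = self) to those cores, so its threads
# stay on one node and keep their memory accesses local.
os.sched_setaffinity(0, node0_cpus)

print(sorted(os.sched_getaffinity(0)))
```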
Each has its own pros and cons. This is just a brief rundown of what I've seen.
Intel Core i9-9900K (Silicon Lottery binned 5.1GHz, running 5.3GHz @ 1.36V VCore)
ASUS ROG MAXIMUS XI APEX
G.SKILL F4-4000C19D-32GTZSW (2x16GiB 4000MHz 19-19-19-39 2T, tuned to 4133MHz 14-17-17-31 2T)
EVGA NVIDIA GeForce GTX 1080 Ti FTW3
Seasonic PRIME 1300W 80+ Platinum