Originally Posted by ORL
You won't notice anything on the front end of performance in day-to-day operation unless you run heavy compute loads. At best you'll see roughly 1 FPS gained for every 10 FPS in gaming, as a general figure. So basically, if you're getting 100 FPS in a game, you might see 110 FPS in that same game, and so forth. Some CPU-intensive games may yield better gains if they're bottlenecking you, though. The CPU in your current rig is still decent enough that, honestly, you likely won't even notice the upgrade unless you watch frame rates closely.
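That rule of thumb can be sketched as a quick back-of-the-envelope calculation. The ~10% figure here is just the quoted poster's ballpark estimate, not a measured benchmark, and the function name is mine for illustration:

```python
# Sketch of the rough rule of thumb above: ~1 extra FPS per 10 FPS,
# i.e. about a 10% uplift. The 10% is a ballpark guess, not a benchmark.

def projected_fps(current_fps: float, uplift: float = 0.10) -> float:
    """Project post-upgrade FPS from a fractional uplift estimate."""
    return current_fps * (1 + uplift)

for fps in (60, 100, 144):
    print(f"{fps} FPS -> ~{projected_fps(fps):.0f} FPS")
```

Of course, as the reply below argues, a flat percentage hides the real story: CPU-bound titles can see far bigger jumps, especially in minimum FPS.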
I'm in the same boat as you: an FX-8350 @ 5GHz pushing a just-purchased RTX 2060. When I stepped back and researched it, I discovered exactly what was outlined above. I'd almost say tweak your system a bit, eke out a bit more performance wherever you can, and wait for at least another generation. Unless, of course, the itch has got you.
No offence, but please don't spread bad info, as this is just not true.
Going from a 2500K, which is a 4-core/4-thread CPU, to a new-gen CPU is a huge difference. Most games will see a decent performance boost.
I went from a 2600K at 4.8GHz to this 8700K at 4.8GHz, and while FPS were still high and decent on the 2600K, in CPU-bound games like Mafia 3, Crysis 3, Assassin's Creed, Watch Dogs, etc., I noticed a decent jump in minimum FPS. Need for Speed, for example: cruising around town would dip to 55-60 FPS from 75, and after the upgrade it stayed rock solid at 75 FPS. And that was coming from the 2600K, which has 8 threads.
Upgrading from a 2500K, regardless of how fast it runs, even at 5GHz, to a modern-day CPU, say along the lines of a 9700K, 8700K, 2700X or even 9900K, will be a simply massive improvement.