First question:
Why do you need 64GB RAM?
The reason I ask? I think that you probably don't.
If your reason for the RAM is "multitasking", I'll tell you straight up: with enough things open to use over 32GB of RAM at once, your OS will become unstable, unresponsive, or periodically lock up for anywhere from many milliseconds to many seconds.
I have a 5800X3D as well - your best bet is to get two kits of: Patriot Viper Steel DDR4 16GB (2 x 8GB) 4400MHz - PVS416G440C9K
They're Samsung B-die and will allow you to run 3600-3800MHz C14 - right up near the 5800X3D's comfortable maximum.
If you can't get (or don't want) that kit specifically, you want something that'll do 3600-3800 C14.
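If you're wondering why C14 at those speeds is the target, here's a quick true-latency sketch. The formula is the standard one; the 4400 CL19 figure is (as far as I know) the Viper Steel kit's rated profile, and the other configs are common B-die daily settings I'm assuming for illustration:

```python
# Rough sketch: "true" CAS latency in nanoseconds for a few DDR4 configs.
# latency_ns = 2000 * CL / transfer_rate (the 2000 covers DDR's two
# transfers per clock plus the MHz -> ns conversion).
def true_latency_ns(mtps: int, cl: int) -> float:
    return 2000 * cl / mtps

# 4400 CL19 is the Viper Steel kit's rated XMP profile (assumption);
# the rest are common B-die daily settings.
for mtps, cl in [(3200, 16), (3600, 14), (3800, 14), (4400, 19)]:
    print(f"DDR4-{mtps} CL{cl}: {true_latency_ns(mtps, cl):.2f} ns")
# DDR4-3200 CL16: 10.00 ns  (typical off-the-shelf kit)
# DDR4-3600 CL14:  7.78 ns  <- the 3600-3800 C14 sweet spot
# DDR4-3800 CL14:  7.37 ns
# DDR4-4400 CL19:  8.64 ns  (faster clock, but looser timings)
```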
Your 5800X3D configured this way will keep up with your 7900XTX and beyond - well beyond! Most likely it'll do a more than adequate job of driving the top-end card of AMD's next-next generation!
That is, unless you plan to game at obscenely low resolutions.
That wouldn't be smart though, because GPU makers are adding more shaders now, not faster shaders. More shaders allow for higher resolutions up to a certain framerate; higher framerates come from faster GPU core clocks, not more shaders (given the way game engines use them).
Consider the performance of a couple cards at 1080P Ultra settings: specifically comparing the 3080 (with shaders that run at ~1700MHz) to the 4090 (~2500MHz).
The 4090 is about 25/17 (~1.47) times faster than the 3080. If it were possible to run the 3080 at 2500MHz, it would almost tie the 4090 at 1080P (sans gimmicks).
Why? Because the 3080's 8704 shaders already cover a 1080p frame, so the work doesn't need to be run through them twice (or more) per frame, which is what drops the framerate. This is an oversimplification, and I'm not the person to describe every detail, but most basically, this is how it is.
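If you want that as arithmetic, here's a minimal sketch of the clock-scaling model (using the rough clocks above, and deliberately ignoring architecture, memory bandwidth, and everything else):

```python
# Toy model of the claim above: at 1080p, where shader count is already
# sufficient, performance scales roughly with shader clock, not shader count.
clock_3080 = 1700  # MHz, rough figure from the comparison above
clock_4090 = 2500  # MHz, rough figure

speedup = clock_4090 / clock_3080
print(f"Predicted 4090-vs-3080 speedup at 1080p: ~{speedup:.2f}x")  # ~1.47x
# Flip it around: a 3080 hypothetically running at 2500MHz would roughly
# tie the 4090 at 1080p, which is the point being made.
```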
Theoretically, if AMD's next-next-gen flagship has 4x the shaders of the 7900XTX (which, admittedly, is a bit unlikely IMO, but go with it), and they run at 3200MHz, then instead of running a game at 100fps at 4K, you could run the same game at 8K (4x the pixels), with the framerate 32/25 times higher: 128fps. The 32/25 comes from 3200MHz / 2500MHz (yes, the 7900XTX and the 4090 both "boost" {on paper} to 2500).
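Same toy model, run on the hypothetical numbers above (all assumptions from the example, not real product specs):

```python
# Shader count sets the resolution you can cover at a target framerate;
# clock sets the framerate. Numbers are the hypothetical ones above.
shader_ratio = 4                      # next-next-gen flagship vs 7900XTX (hypothetical)
pixels_4k = 3840 * 2160
pixels_8k = 7680 * 4320
assert pixels_8k == shader_ratio * pixels_4k  # 4x shaders -> 4x pixels (4K -> 8K)

fps_4k_now = 100                      # 7900XTX at 4K, per the example
clock_now, clock_future = 2500, 3200  # MHz: paper boost vs hypothetical
fps_8k_future = fps_4k_now * clock_future / clock_now
print(f"Hypothetical 8K framerate: {fps_8k_future:.0f} fps")  # 128 fps
```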
Now...
Do games evolve and get more complicated? Yes.
Do engines change in ways we don't expect? Yes.
But consoles are here, and they're here to stay. And games need to run on them, too.
Because of this, the 5800X3D should be good enough to run most games very, very well from 2022-2030, and if you pop in a GPU upgrade in 2026-2027? You're golden at an even higher resolution (with maybe some new forms of eye candy, too) 'til 2030!
IF
and this is a big IF
IF your system ends up needing 64GB RAM to run games in 2026-2027 (maybe a 0.1% chance)
all you have to do is buy some used DDR4 from someone on here! There'll be plenty, because the 5800X3D is the best gaming chip that runs it! As people retire their old systems, you'll get it for a very reasonable price - probably $200, which shouldn't be a concern if you're dropping $2k on a GPU at the same time. You could probably sell your amazing B-die here for $100+ to recover half the cost.
I think it's a waste of money to buy more than 32GB RAM for your system right now. Unless, like I said, you specifically need it for something that isn't general multitasking.
edit: almost forgot. In 2030 this machine isn't toast - it'll still be fast enough to play all those games at mid-tier GPU level 'til 2034!
At or around 2030, a new console might come out with 16 cores, making 16 cores the new standard. Just like you can still run games on 4-core CPUs today, games will still run on 8 cores in the future - probably very playably, just not state-of-the-art.
edit2: before you go thinking it's ridiculous that any chip could last 12 years, it's not! The 5800X3D is uniquely positioned to last an extra 4 years because of its cache and extremely well-designed microarchitecture (almost unchanged in the 7800X3D - just faster RAM and higher clocks).
It's like my 2500K - it was uniquely positioned because it could OC its cores by 40%, came with a 10-30% IPC increase, and had 60+% RAM OC potential. And AVX!
2500K (2011) @ 4.8GHz & 16GB DDR3 2133 C8 (EIGHT) + 1080Ti (2017)? EXCEPTIONALLY MATCHED
2080Ti? A little bit of bottlenecking, and some games started being optimized for more than 4 cores, which disadvantaged it a little.
But literally: that 6-year-old CPU (2500K @ 4.8) paired with a brand-new GPU (1080Ti), placed side by side in an A/B comparison against a 7600K/8350K with a 1080Ti? Even the very experienced would have a hard time telling them apart.
That'll be your 5800X3D, almost. Sandy Bridge (2500K/2600K) was a unique situation: AMD was soooo far behind Intel that Intel lazed around. Now, with the duopoly so close in performance, and needing to compete with ARM etc., things will move faster. But now we're running up against density/power problems. Can't know the future for sure! But the 5800X3D with 32GB (for now at least) of DDR4 3800 C14 is a solid choice.