I don't have a GDDR6X card, but I have some info that may be of use! You can calculate the gradient across a thermal interface if you know thickness, K value, heat output, and area of the heat source. The results surprised me when I was shown the equation.
gradient (°C) = (thickness in meters X watts of heat) / (area of heat source in square meters X conductivity in W/mK)
With memory chips, the die is much smaller than the plastic package. I'm not sure of the die size of GDDR6X, but GDDR5 is roughly 5x5mm. Since the package is a poor conductor of heat, the die size is the important dimension to consider.
So for a rough example: a 1.0mm 6W/mK pad and 5W of heat coming from a 5x5mm die:
(0.001m x 5W) / ((0.005m x 0.005m) x 6W/mK) = 33.3°C gradient
Meaning the chip will theoretically always run 33°C hotter than the heatsink. Obviously a lot of heat is also dissipated into the board, so this is a very rough example; calculating it all accurately is above my pay grade.
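If you want to plug in your own numbers, here's a quick sketch of that formula (the pad thickness, conductivity, die size, and wattage are just the example values from above, not measured GDDR6X figures):

```python
def gradient(thickness_m, power_w, area_m2, k_w_mk):
    """Temperature rise (deg C) across a thermal interface:
    (thickness * heat) / (area * conductivity)."""
    return (thickness_m * power_w) / (area_m2 * k_w_mk)

# Example from above: 1.0mm pad, 6 W/mK, 5 W from a 5x5mm die
die_area = 0.005 * 0.005  # 25 mm^2 in m^2
print(round(gradient(0.001, 5, die_area, 6), 1))  # → 33.3
```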
If you decrease the gap to 0.5mm, you cut the temperature difference in half. If you were to use 15x15mm copper shims with quality thermal paste directly on the memory chips, the gradient between chip and copper should be relatively small, and the shim would spread the heat over 9x the area in this example. So, ignoring the inefficiency between chip and shim, that's an 18x improvement. Again, this is extremely oversimplified as far as actual thermodynamics goes, but you can at least see why thermal pads suck for cooling memory.
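Running the same formula for both setups shows where the 18x comes from (again, hypothetical example numbers, and it ignores the paste layer between chip and shim):

```python
def gradient(thickness_m, power_w, area_m2, k_w_mk):
    """Temperature rise (deg C): (thickness * heat) / (area * conductivity)."""
    return (thickness_m * power_w) / (area_m2 * k_w_mk)

# Baseline: 1.0mm pad directly on the 5x5mm die
pad = gradient(0.001, 5, 0.005**2, 6)     # ~33.3 deg C
# With a 15x15mm shim: half the pad thickness, 9x the contact area
shim = gradient(0.0005, 5, 0.015**2, 6)   # ~1.9 deg C
print(round(pad / shim, 1))  # → 18.0
```

Half the thickness doubles the conductance and 9x the area multiplies it again, which is where the 2 x 9 = 18x factor comes from.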
It would be awesome if someone with a 3080/3090 could try shims directly on the chips, with thinner pads between the shim and the cooler/backplate.
There's also TG-PP10, a 10W/mK thermal putty you can get from Digikey. No need to worry about having the perfect pad thickness if you use that, and you could fill more of the gap with a thicker shim for better cooling.