Show me a native 1440p sharpness test that this display can't pass. I'm going to go a bit off-topic in this reply so the rest of it makes more sense.
Usually, when post-processing is disabled, sharpness can't be adjusted via display controls on most LED/LCD displays, whether monitors or TVs. Sharpness is post-processing, adjusted either through the display's OSD (hardware) or through tools like ReShade / NVIDIA Control Panel (software), and post-processing adds input lag. "Neutral" sharpness is 0 sharpness, i.e. no adjustment at all. Without any processing, the display's hardware-based image is 1:1 RGB with full 4:4:4 chroma (no subsampling). You get there by disabling every gimmick: sharpness, black stabilizer, dynamic contrast, certain "game" modes. Once all of that is turned off and the display is calibrated, you finally see the image the hardware itself produces, and only then can you assess its quality.
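To illustrate what I mean by sharpness being post-processing rather than a panel property, here's a rough sketch in Python (an unsharp-mask style filter; this is my own illustration, not any display's actual firmware algorithm): at strength 0 the pixels pass through 1:1, which is exactly what 0/neutral sharpness means, and anything else rewrites the signal and costs extra processing time.

```python
# Sketch of sharpness as post-processing (unsharp mask style).
# Illustrative only, not any display's actual firmware algorithm.
import numpy as np
from scipy.ndimage import gaussian_filter

def apply_sharpness(image: np.ndarray, strength: float) -> np.ndarray:
    """Unsharp mask: add back a scaled high-frequency component.

    strength == 0.0 is 'neutral' sharpness: pixels pass through 1:1.
    Any other value alters the signal (and the extra pass costs time,
    i.e. input lag when a display's processor does it).
    """
    if strength == 0.0:
        return image  # true pass-through, no processing
    blurred = gaussian_filter(image, sigma=1.0)
    detail = image - blurred                      # high-frequency content
    return np.clip(image + strength * detail, 0.0, 1.0)

# Tiny demo: a hard vertical edge between two gray levels
img = np.full((8, 8), 0.2)
img[:, 4:] = 0.8
assert np.array_equal(apply_sharpness(img, 0.0), img)  # neutral = untouched
oversharp = apply_sharpness(img, 1.5)  # over/undershoot (halos) around the edge
```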
Example: an OLED with a true, near-infinite static contrast ratio and beautiful deep blacks will outshine some LED/LCD display advertising a gimmick "dynamic contrast ratio" of 10000000000000000000:1 when both show a standard image with gimmicks disabled. But a poorly calibrated OLED may look dim and ugly next to some bright, crappy LED LCD running at high brightness in "vibrant" mode. That's something home-theater enthusiasts know well; if you don't believe me, check out the AVS forums. It's also why it makes far more sense to keep up with display technology and specific models than to walk into Best Buy and look at TVs.
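For the contrast point, a quick back-of-the-envelope calculation (the luminance figures below are illustrative assumptions, not measurements of any particular set):

```python
# Static contrast is simply peak white / black level, in cd/m².
# The numbers here are illustrative assumptions, not measurements.
def static_contrast(white_nits: float, black_nits: float) -> float:
    return float('inf') if black_nits == 0.0 else white_nits / black_nits

lcd_va  = static_contrast(300.0, 0.10)  # ~3000:1, typical VA-class panel
lcd_ips = static_contrast(300.0, 0.30)  # ~1000:1, typical IPS-class panel
oled    = static_contrast(300.0, 0.0)   # pixel-off black -> effectively infinite

print(f"VA ≈ {lcd_va:.0f}:1, IPS ≈ {lcd_ips:.0f}:1, OLED = {oled}:1")
# A "dynamic contrast" spec divides the brightest frame's white by a
# different frame's backlight-dimmed black, so it says nothing about
# what a single frame can actually show.
```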
Another example: plasma TVs had, and still have, superior image quality to LED LCDs (not OLEDs) and remain superior in motion. But because of their lower brightness they looked dim and awful next to bright LED LCDs on a store floor, which was one of the reasons plasma was abandoned: clueless average shoppers killed it, along with store employees who demonstrated the sets in brightly lit environments. In a dark, movie-theater-like environment, plasmas were unmatched in image quality until OLED and HDR arrived.
G-Sync has reliability and input-lag advantages over FreeSync, and as part of that it can't be combined with the panel's post-processing the way FreeSync can. Less (or no) post-processing is an advantage for input lag, stability and reliability, but a disadvantage for people who like post-processing. You can look at a comparison here: https://gapintelligence.com/system/pictures/1141/content_freesync_vs_g-sync.jpg

I don't want to be offensive in the spirit of the holidays, but given that such advantages are undesirable to you, and sharpness post-processing is desirable to you along with more vibrant (often called Dynamic/Vibrant, or "cartoon mode" on TVs) yet inaccurate colors, there can't possibly be a monitor better than whichever one you currently think is the best. There's no standard for you, only pure preference; your opinion has nothing objective in it and can't be validated with quantitative or qualitative data.

Both panels are the same, with G-Sync vs. FreeSync being the #1 difference, and it's that difference that keeps the G-Sync version from having sharpness controls: sharpness is pre-set/forced at 0/neutral. If the G-Sync panel, in your opinion, has sharpness issues, then so does the FreeSync version; the FreeSync one just allows post-processing changes such as sharpness manipulation and wider gamut support.

A wider colorspace does not improve image quality either. Non-HDR films and games stick to the sRGB / Rec. 709 colorspace. If a developer tried to draw a life-like apple with unsaturated reds, seeing it with saturated, bright reds because of a wider gamut simply means you do NOT see what the developer intended you to see. If you don't care about that, then the concept of color accuracy is meaningless to you and there's no reason to even look into it. There's nothing wrong with that either: look at how many ReShade / SweetFX presets there are that change a game's visuals to the point where you definitely don't see the game as the developers made it, regardless of display calibration and color accuracy. The thing is, you never know whether those ReShade presets would even have been created if their creators had good displays with accurate colors; perhaps they were just trying to compensate for incorrect gamma and grayscale. I want to see films and games the way the developers made them, so I can at least judge them with less bias. You obviously don't.
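To put a rough number on the gamut point, here's a sketch using the published CIE 1931 xy primaries (the distance-from-white figure is only a crude stand-in for saturation, not a proper colorimetric metric): an unmanaged wide-gamut panel maps a fully saturated sRGB green code value onto its own, more saturated green primary, so the color lands further from white than the developer authored it.

```python
# Sketch: why sRGB / Rec. 709 content looks oversaturated on an unmanaged
# wide-gamut display. Uses published CIE 1931 xy primaries; Euclidean
# distance from the D65 white point is only a crude proxy for saturation.
from math import dist

D65         = (0.3127, 0.3290)
SRGB_GREEN  = (0.30, 0.60)   # what the developer authored (sRGB / Rec. 709)
ADOBE_GREEN = (0.21, 0.71)   # where an unmanaged Adobe-RGB-class panel puts that same code value

authored  = dist(SRGB_GREEN, D65)
displayed = dist(ADOBE_GREEN, D65)
print(f"authored green sits {authored:.3f} from white, "
      f"displayed green sits {displayed:.3f}: "
      f"about {displayed / authored:.0%} of the intended saturation")
```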
I prefer better hardware without gimmicks, because I can compensate with known tools such as ReShade for post-processing, which I use regardless of display to enforce DisplayCAL 3DLUTs and, for example, to mix LumaSharpen with NVIDIA's blurry TXAA so I get rid of aliasing while keeping good sharpness. For playback, I use madVR, which comes with a ton of high-quality post-processing that no display hardware can even mimic. High-quality hardware + high-quality software = win-win.
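For anyone wondering what "enforcing a 3DLUT" actually does per pixel, here's a rough sketch of the idea (this is not DisplayCAL's or madVR's actual code; real pipelines use bigger LUT cubes, e.g. 65^3, and run the lookup on the GPU):

```python
# Rough sketch of what a calibration 3D LUT does per pixel: the RGB value
# indexes a small cube of corrected colors, with trilinear interpolation
# between the nearest entries. Not DisplayCAL's or madVR's actual code.
import numpy as np

def apply_3dlut(rgb: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """rgb: (..., 3) floats in [0, 1]; lut: (N, N, N, 3) correction cube."""
    n = lut.shape[0]
    pos = np.clip(rgb, 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo                                  # fractional position in the cell
    out = np.zeros_like(rgb, dtype=float)
    # trilinear blend of the 8 surrounding LUT entries
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                idx_r = np.where(dr, hi[..., 0], lo[..., 0])
                idx_g = np.where(dg, hi[..., 1], lo[..., 1])
                idx_b = np.where(db, hi[..., 2], lo[..., 2])
                w = (np.where(dr, f[..., 0], 1 - f[..., 0])
                     * np.where(dg, f[..., 1], 1 - f[..., 1])
                     * np.where(db, f[..., 2], 1 - f[..., 2]))
                out += w[..., None] * lut[idx_r, idx_g, idx_b]
    return out

# Sanity check: an identity LUT (no correction) leaves colors untouched.
n = 17
grid = np.linspace(0.0, 1.0, n)
identity = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
pixels = np.random.default_rng(0).random((4, 3))
assert np.allclose(apply_3dlut(pixels, identity), pixels)
```

A real calibration LUT is just this with the corrected values baked in by the profiling software; the display hardware never has to know it's happening.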

I can't make any of it any clearer than that. It is what it is.