These are just my observations/research and could be incorrect. If you have any concerns or corrections, please share. The point of this thread is to collect as much useful information as possible, as I found this monitor to be a total pain in the rear when it first arrived, and was planning to return it, until eventually coming to love it.
[Specs]
Size: 34" Ultrawide
Resolution: 3440x1440 (with a LOT of extra black pixels around the edges for pixel shifting)
Refresh Rate: 175Hz (8-Bit + Dithering), 144Hz (10-Bit)
Panel type: Samsung QD Oled (New diamond shaped subpixel arrangement)
Coating: Glossy-looking AG coating, though it doesn't produce the true glossy image you get on LG OLED.
Peak HDR brightness: ~1060 nits (1% window), ~260 nits (100% window)
Video Inputs: This monitor uses the old Gen 1 G-Sync Ultimate module, which I believe is an Intel FPGA. This limits the device to HDMI 2.0 and DP1.4. DP1.4 is required for G-Sync, as well as for the maximum refresh rate. It also has a fan, similar to the PG27UQ and X27. Depending on your specific unit and your specific sensitivities, you may or may not hear the fan. I personally haven't in my environment. But my environment is not dead silent.
(subpixel layout)
[Price/Availability]
Please use my affiliate links below. JK. Lol. All regions are backordered and availability is generally end of July/early August at the time of writing this post.
US: $1299 Alienware 34 Curved QD-OLED Gaming Monitor - AW3423DW | Dell USA
Canada: $1649 Alienware 34 Curved QD-OLED Gaming Monitor - AW3423DW | Dell Canada
UK: £1,099.00 Alienware 34 Curved QD-OLED Gaming Monitor - AW3423DW | Dell UK
eBay: Currently lowest sell price for buy it now is $2300 USD.
[Color depth]
As shared by @CallsignVega on HardForum, you can create a custom resolution at 177Hz, which will report a 10-bit signal to both Windows and the DisplayHDR app from the Windows app store. However, the Nvidia Control Panel still shows 8-Bit as the only option under RGB. (Native 175Hz also shows 8-Bit here, but then reports 8-Bit + Dithering under Windows.)
However, I'd need to see further testing on this, as nothing actually changes to achieve it: it uses all the same settings, but reduces the total vertical pixels by a small amount to keep the pixel clock below 1000MHz, which buys you 2 extra Hz. The Nvidia Control Panel still shows 8-Bit as the only color depth option, and when testing with novideo_srgb with dithering disabled, or with dithering manually set to 10-bit, banding showed up in Cyberpunk in HDR with the DCI-P3 color space. Changing it to 8-Bit dithering removed the banding. Again, I'm not an expert on this, so my takeaway could be incorrect, but despite Windows reporting a 10-bit display, there is still color banding that doesn't go away unless 8-Bit dithering is enabled, which leads me to believe 10-Bit color isn't actually being produced. If you have any information you can share regarding this, it would be greatly appreciated. Please share and I'll update.
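For reference, the pixel-clock ceiling that the 177Hz trick works around can be sketched with some quick math. The blanking values below are illustrative CVT-RB-style guesses, not the monitor's actual EDID timings:

```python
# Rough pixel-clock and bandwidth sanity check for 3440x1440 timings.
# Blanking values are illustrative guesses, NOT the monitor's real EDID.

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=62):
    """Pixel clock in MHz = total pixels per frame (active + blanking) * refresh rate."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

def signal_gbps(pclk_mhz, bits_per_channel):
    """Uncompressed RGB bandwidth in Gbit/s for a given pixel clock."""
    return pclk_mhz * 1e6 * 3 * bits_per_channel / 1e9

clk = pixel_clock_mhz(3440, 1440, 175)
print(f"~{clk:.0f} MHz at 175 Hz")              # stays under the 1000 MHz ceiling
print(f"8-bit:  ~{signal_gbps(clk, 8):.1f} Gbit/s")
print(f"10-bit: ~{signal_gbps(clk, 10):.1f} Gbit/s")
# DP1.4 HBR3 carries about 25.92 Gbit/s of payload after 8b/10b coding,
# which is roughly why 10-bit only fits at 144 Hz on this link.
```

This also makes it plausible that trimming a few lines of vertical blanking frees up just enough clock budget for 2 extra Hz.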
[Refresh/Input Lag]
While I don't have any hardware to test with, I have eyes to see with. A lot of reviewers, including highly respectable ones like Vincent from HDTV Test, did their input lag testing using a device that works through HDMI (which, to his credit, he mentioned as a limitation in his ability to fully test the unit). So their comparisons were done at lower than native refresh rate. This monitor is designed to be used through DP1.4 on the G-Sync Ultimate module. When testing games and switching from 144Hz to 175Hz at the same in-game FPS (around 80 in my test), I saw a very noticeable difference in smoothness, which I attribute to the response time/input lag of the monitor: 175Hz was very clearly smoother and quicker. It was evident when moving the mouse back and forth quickly and repeatedly. It should be something anyone can test and feel, not entirely subjective like the "sound stage", "timbre", and "fullness" you hear about on audiophile forums (sorry not sorry if that's triggering for some of you). So keep in mind that the actual response time/input lag should be better than what most reviews reported if they didn't test at 175Hz through DP. Make sure you always stick to 175Hz.
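For a rough sense of why 175Hz can feel quicker even at the same FPS, here's the refresh-interval math. This is a simplified model (real input lag has more components than scanout timing), but the shorter wait until the next refresh is one plausible contributor:

```python
# Refresh interval difference between 144 Hz and 175 Hz. Even at the
# same in-game FPS, a faster refresh shortens the worst-case wait
# between a frame being ready and it being scanned out.

def refresh_interval_ms(hz):
    """Time between consecutive refreshes, in milliseconds."""
    return 1000.0 / hz

delta = refresh_interval_ms(144) - refresh_interval_ms(175)
print(f"144 Hz: {refresh_interval_ms(144):.2f} ms per refresh")
print(f"175 Hz: {refresh_interval_ms(175):.2f} ms per refresh")
print(f"Worst-case scanout wait shrinks by ~{delta:.2f} ms")
```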
[Changing Color space/Gamma]
This monitor ships with ALL STANDARD PRESETS locked to DCI-P3, and those modes do NOT have gamma control, outside of the "Dark Stabilizer" feature which, from my understanding, just changes the bottom of the curve to prevent black crush. Again, I'm not an expert. But DCI-P3 can produce oversaturated and exaggerated colors, apparently even more so in SDR. There are 2 ways to fix this:
1) Use the "Creator Mode" preset and you'll be able to switch the color space between DCI-P3 and sRGB. You'll also be able to change gamma from 1.8 to 2.6. Again, not an expert, but I don't believe these gamma values truly reflect what you'd expect. Regardless, the control is there, and it does change the gamma.
2) There is an app on GitHub called novideo_srgb. It uses an existing, unused Nvidia API to convert color information before it's sent to the display, supposedly without any performance cost. There is a toggle that allows you to clamp colors, so you can stay on a DCI-P3 profile and use it to rein in the oversaturation/exaggeration. It's as simple as the screenshot below.
Download here: GitHub - ledoge/novideo_srgb: Calibrate monitors to sRGB or other color spaces on NVIDIA GPUs, based on EDID data or ICC profiles
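To illustrate what a gamut "clamp" actually does mathematically: convert the wide-gamut color through a common connection space (XYZ) into sRGB, then clip anything that lands outside [0, 1]. This is NOT novideo_srgb's source code, just a sketch of the underlying color math using the commonly published D65 matrices for Display P3 and sRGB:

```python
# Sketch of a gamut clamp: linear Display-P3 -> XYZ -> linear sRGB,
# then clip out-of-range components. Matrices are the commonly
# published D65 conversion matrices (assumed, not taken from the app).

P3_TO_XYZ = [
    [0.48657, 0.26567, 0.19822],
    [0.22897, 0.69174, 0.07929],
    [0.00000, 0.04511, 1.04394],
]
XYZ_TO_SRGB = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def mat_vec(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def p3_to_srgb_clamped(rgb_linear):
    """Convert linear Display-P3 to linear sRGB, clipping to gamut."""
    xyz = mat_vec(P3_TO_XYZ, rgb_linear)
    srgb = mat_vec(XYZ_TO_SRGB, xyz)
    return [min(1.0, max(0.0, c)) for c in srgb]

# Pure P3 red sits outside sRGB: the unclamped result overshoots 1.0
# in red and goes slightly negative in green/blue, so after clipping
# it lands on sRGB's own pure red instead of an oversaturated one.
print(p3_to_srgb_clamped([1.0, 0.0, 0.0]))
```

The real tool works at the GPU LUT/matrix level rather than per-pixel in software, but the effect on colors is the same idea.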
[Monitor Driver/ICC Profile]
Windows will automatically queue up a driver install for this monitor. The driver also installs an ICC profile from Dell. According to Neowin's article (Review: Meet the world's first 34" QD-OLED ultrawide monitor (AW3423DW) by Dell Alienware), this ICC profile is garbage, and it could be part of the reason I hated the monitor when I first tested it. So it's recommended to uninstall it. In Windows 11, you can do it like this:
- Settings -> Display -> Advanced Display -> "Display Adapter Properties for Display 1 (or whatever number)" -> Color Management tab -> Color Management button -> Tick the box that says "Use my settings for this device" -> Highlight the ICC profile -> Click Remove -> Continue.
PLEASE NOTE: EVERY WINDOWS UPDATE WILL REINSTALL THIS ICC PROFILE. So this process will have to be repeated. Not sure if there's a way to install a neutral ICC profile and set it as default to prevent this from being an issue after every single Windows update, which can be quite frequent. If you have any tips or tricks here, please share.
(image from different monitor/pc. just to illustrate where you need to go)
[Calibration/Color Correction]
This part is going to be highly subjective and contentious. To get the monitor looking the way I wanted, I ended up using the Game 1, Game 2, and Game 3 preset options. They don't let you change the color space or gamma, but they do let you manually adjust RGBCMY colors. I used this to make the image more punchy without being overly saturated.
Please note: Once HDR is enabled, you can no longer change the "Brightness" setting in the monitor OSD. So, for myself and my usage, I set the Brightness setting to 100% for all 3 game modes in advance. You'll still be able to change the contrast and RGBCMY color settings later, in HDR, while in game. I progressively increased the color levels across the three profiles, because I felt some games really benefited from it, while it could make other games look bad. Again, this is all personal preference, but this is basically how you'd manage it to get the look that makes you happy.
[Vesa DisplayHDR400 vs HDR1000]
From my observations, HDR400 is a lot more stable, particularly on the desktop. Online reviews have shown the same thing, including an oddly higher full-screen brightness with HDR400. But since you're dealing with a more limited range, specular highlights in games won't pop quite as much. So I'm currently using HDR1000 for all my gaming, and I can just disable HDR on the desktop if I'm going to be doing any work.
The HDR400 and HDR1000 profiles also each carry their own contrast/brightness/color settings, so switching back and forth between them is likely not going to be an ideal solution. Pick one, and stick to it. Especially because switching between them actually activates an entirely different monitor profile: as far as your computer is concerned, you've just plugged in a brand new monitor, meaning no settings, including custom resolutions, carry over. Also, if you have an AVR connected to your PC like me, Windows will switch to it as your main display. I had to connect with TeamViewer on my phone to fix that.
[Image Quality]
Once more, a subjective area. Compared to my PG27UQ (4K 27" HDR1000/QD LCD/FALD/144Hz), there was a massive drop in clarity and a massive increase in jagged edges. This makes sense, as the PG27UQ has a PPI of 163, while this monitor has a PPI of 109. I was going to return it, until I learned to use a mix of DLDSR and DLSS. Running DLDSR 2.25x with DLSS Quality results in a significant drop in performance, but it also makes the image absolutely pristine. You can also get away with DLDSR 1.78x with DLSS Performance, which is a less clear image but a good trade-off between image quality and performance: textures are improved and distant objects become more visible, while jagged edges are greatly reduced.
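A quick sketch of the PPI figures above, plus the internal render resolution you end up with when stacking DLDSR and DLSS. The DLSS per-axis scale factors below are the commonly cited approximations (Quality ~0.667, Performance 0.5), not exact values:

```python
# PPI comparison plus the effective internal render resolution when
# stacking DLDSR (an area multiplier) with DLSS (a per-axis scale).
import math

def ppi(w, h, diagonal_inches):
    """Pixels per inch from resolution and panel diagonal."""
    return math.hypot(w, h) / diagonal_inches

def dldsr_dlss_internal(w, h, dldsr_factor, dlss_axis_scale):
    """Internal render resolution: DLDSR upscales the output target
    by sqrt(factor) per axis, then DLSS renders at a fraction of it."""
    axis = math.sqrt(dldsr_factor) * dlss_axis_scale
    return round(w * axis), round(h * axis)

print(f"PG27UQ:   {ppi(3840, 2160, 27):.1f} PPI")   # matches the ~163 figure
print(f"AW3423DW: {ppi(3440, 1440, 34):.1f} PPI")   # matches the ~109 figure
print("DLDSR 2.25x + DLSS Quality  ->", dldsr_dlss_internal(3440, 1440, 2.25, 2/3))
print("DLDSR 1.78x + DLSS Perf.    ->", dldsr_dlss_internal(3440, 1440, 1.78, 0.5))
```

Notably, DLDSR 2.25x with DLSS Quality lands the internal render right back at native 3440x1440, so the image quality gain comes from the AI up/downsampling passes rather than from rendering more raw pixels.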
Currently, depending on game and performance headroom, I experiment with different levels of DLDSR with DLSS and I've been very happy with the image presentation when running both at max. This gives me great hope for the ability of this monitor to continue delivering top notch visuals when next gen video cards come out.
Important to note: I couldn't imagine keeping this monitor and using it without DLDSR. There is too great a loss of visual detail, and tons of aliasing, compared to my previous monitor, and even compared to my 77" OLED, which technically has a lower PPI. Due to the distance I sit from it, and perhaps other factors, 4K at 77" from a couch looks much better than 3440x1440 at 34" if you're not using DLDSR.
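To put the couch-vs-desk comparison in numbers: angular resolution (pixels per degree) accounts for viewing distance, which is why a lower-PPI TV can still look sharper. The distances below are my illustrative assumptions, not measurements from this post:

```python
# Pixels per degree (PPD): how much resolution your eye actually
# receives, given pixel density AND viewing distance. Distances are
# assumed for illustration (~28" at a desk, ~8 ft to a couch).
import math

def pixels_per_degree(w, h, diagonal_inches, distance_inches):
    pixel_size = diagonal_inches / math.hypot(w, h)   # inches per pixel
    # Angle one pixel subtends at the eye, in degrees
    angle = 2 * math.degrees(math.atan(pixel_size / (2 * distance_inches)))
    return 1 / angle

desk = pixels_per_degree(3440, 1440, 34, 28)
couch = pixels_per_degree(3840, 2160, 77, 96)
print(f'34" ultrawide at desk: ~{desk:.0f} PPD')
print(f'77" 4K from couch:     ~{couch:.0f} PPD')
```

Under these assumptions the big TV comes out well ahead in angular resolution, consistent with it looking cleaner despite the lower PPI.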
[Anti-Glare Coating Controversy]
It's important to note that the coating, which looks glossy, can lighten to gray (similar to how most plasma TVs looked) when bright light hits it at certain angles. I have 6 LED bulbs on the ceiling in my media room, and none of them cause this. So I can have all my room lights on and not have an issue. But if I turn on a light in the hallway directly behind my monitor, it instantly turns the black panel gray. Even in that situation, I haven't noticed a change in contrast in most gaming, as long as the panel brightness is high enough. It could be a problem for content that relies on a lot of pitch-black scenes without any bright elements keeping your pupils adjusted to the brightness.
[Text Quality/Fringe]
Due to the diamond subpixel layout, and Windows not being designed to render for such an unusual format, text can look quite poor. Display scaling can help, and so can tuning ClearType, but it still won't be as good as an IPS/VA/WOLED panel. Several people have mentioned an app called Better ClearType Tuner, which you can download here: GitHub - bp2008/BetterClearTypeTuner: A better way to configure ClearType font smoothing on Windows 10. It may help, but remember that none of these tools are designed around this panel's diamond subpixel layout, so they're basically "hacks" that may be somewhat effective. A full solution won't arrive unless Windows adds support for it. Display scaling of 125% makes readability acceptable for me, but your mileage may vary.
Regarding the pink/green fringes around high-contrast edges that have been reported... honestly, I haven't seen them in my day-to-day use. I'm sure they're there. I took a close-up picture of a car in Cyberpunk 2077 to show the bad aliasing to a friend, and in the close-up I saw pink lines that I couldn't actually see otherwise. My general opinion is that if I haven't noticed them from a normal distance during normal use, I'd like to keep it that way. I don't want them to become one of those things that, once seen, can't be unseen.
[Conclusion]
Good monitor if you're coming from 4K but only if you can afford to run DLDSR. If you're already on a 27" 1440P monitor, then you're accustomed to this PPI and the only downside will be a reduction in text clarity. Beyond that, colors and brightness are superb and much better than on my LG OLED, as is the perceived responsiveness even at lower FPS (while running the 175Hz refresh rate). I found that I could reliably play Cyberpunk 2077 locked to 65fps, without any of the delay/lack of responsiveness/target tracking issues I'd normally have at that frame rate.
So overall...it can be a great monitor. It just takes a lot of tweaking to make it work, imo. Will update/change/add as required and based on interest.
[External Reviews]
Dell Alienware AW3423DW review - TFTCentral
34" ultrawide based on a new QD-OLED panel. With 175Hz refresh rate, 0.1ms G2G, NVIDIA G-Sync and excellent HDR capabilities
There are other reviews but they're either very basic and lacking, or have problematic testing (for example, using HDMI test equipment on a panel that requires DP). I tend to stick to tftcentral and rtings, but if you've come across another noteworthy technical review of the monitor that you think would benefit people, please share and I'll add it here.