
Alienware AW3423DW QD OLED Owner's Thread

#1 · (Edited)
These are just my observations/research and could be incorrect. If you have any concerns or corrections, please share. The point of this thread is to collect as much useful information as possible, as I found this monitor to be a total pain in the rear when it first arrived and was planning to return it, before eventually coming to love it.




[Specs]
Size: 34" Ultrawide
Resolution: 3440x1440 (with a lot of extra black pixels around the edges for pixel shifting)
Refresh Rate: 175Hz (8-Bit + Dithering), 144Hz (10-Bit)
Panel type: Samsung QD-OLED (new diamond-shaped subpixel arrangement)
Coating: Glossy-looking AG coating, though it doesn't produce a true gloss-like image as you get on LG OLED.
Peak HDR brightness: ~1060 nits (1% window), ~260 nits (100% window)
Video Inputs: This monitor uses an old Gen 1 G-Sync Ultimate module, which I believe is an Intel FPGA. This limits the device to HDMI 2.0 and DP1.4. DP1.4 is required for G-Sync, as well as for the maximum refresh rate. The module also has a fan, similar to the PG27UQ and X27. Depending on your specific unit and your sensitivities, you may or may not hear the fan. I personally haven't in my environment, but my environment is not dead silent.
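
The 144Hz/175Hz bit-depth split above falls out of DP1.4 bandwidth arithmetic. Here's a rough sanity check; the blanking values are my assumptions in the style of reduced-blanking timings, not this monitor's actual EDID values:

```python
# Rough DP1.4 bandwidth check for 3440x1440. Blanking figures are assumed
# reduced-blanking-style values, not the monitor's actual EDID timings.
H_ACTIVE, V_ACTIVE = 3440, 1440
H_BLANK, V_BLANK = 80, 62
DP14_GBPS = 32.4 * 0.8   # HBR3 x4 lanes, minus 8b/10b encoding overhead

def payload_gbps(refresh_hz, bits_per_channel):
    pixel_clock = (H_ACTIVE + H_BLANK) * (V_ACTIVE + V_BLANK) * refresh_hz
    return pixel_clock * bits_per_channel * 3 / 1e9   # 3 channels (RGB)

for hz, bpc in ((175, 8), (144, 10), (175, 10)):
    gbps = payload_gbps(hz, bpc)
    verdict = "fits" if gbps < DP14_GBPS else "too much"
    print(f"{hz}Hz {bpc}-bit: {gbps:5.1f} Gbit/s ({verdict} for {DP14_GBPS:.1f})")
```

With these numbers, 175Hz 10-bit needs roughly 27.8 Gbit/s against about 25.9 Gbit/s available, which is why 10-bit tops out at 144Hz.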

[Image: subpixel layout]

[Price/Availability]
Please use my affiliate links below. JK. Lol. All regions are backordered and availability is generally end of July/early August at the time of writing this post.
US: $1299 Alienware 34 Curved QD-OLED Gaming Monitor - AW3423DW | Dell USA
Canada: $1649 Alienware 34 Curved QD-OLED Gaming Monitor - AW3423DW | Dell Canada
UK: £1,099.00 Alienware 34 Curved QD-OLED Gaming Monitor - AW3423DW | Dell UK
eBay: Currently the lowest Buy It Now price is $2,300 USD.

[Color depth]
As shared by @CallsignVega on HardForum, you can create a custom resolution at 177Hz, which will report a 10-bit signal to both Windows and the DisplayHDR app from the Windows app store. However, the Nvidia Control Panel still shows 8-Bit as the only option under RGB. (Native 175Hz also shows 8-Bit here, but then reports 8-Bit + Dithering under Windows.)


However, I'd need to see further testing on this, as nothing is actually changed in order to achieve it. It uses all the same settings, but reduces the total vertical pixels by a small amount to keep the pixel clock below 1000MHz and get you 2 extra Hz. The Nvidia Control Panel shows 8-Bit as the only color depth option, and when testing with novideo_srgb, with dithering disabled or manually set to 10-bit, banding showed up in Cyberpunk in HDR with the DCI-P3 color space. Changing it to 8-Bit dithering removed the banding. Again, I'm not an expert on this, so my takeaway could be incorrect, but despite Windows reporting it as a 10-bit display, there is still color banding that doesn't go away unless 8-Bit dithering is enabled, which leads me to believe that 10-bit color isn't actually being produced. If you have any information you can share regarding this, it would be greatly appreciated. Please share and I'll update.
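
As a side note on why 8-bit + dithering can look band-free, here's a toy numpy sketch; this is a generic illustration of dithered quantization, not NVIDIA's actual algorithm:

```python
# Toy sketch: over a small patch, dithered 8-bit values average out to the
# true level, while plain quantization snaps whole regions to one step
# (a visible band). Not NVIDIA's actual dithering implementation.
import numpy as np

rng = np.random.default_rng(0)
ramp = np.linspace(0.0, 8 / 255, 4096)        # very shallow gradient
plain = np.round(ramp * 255) / 255            # straight 8-bit quantization
dither = np.round(ramp * 255 + rng.uniform(-0.5, 0.5, ramp.size)) / 255

patch_avg = lambda x: x.reshape(-1, 64).mean(axis=1)   # 64-pixel patches
print("plain    max patch error:", np.abs(patch_avg(plain) - patch_avg(ramp)).max())
print("dithered max patch error:", np.abs(patch_avg(dither) - patch_avg(ramp)).max())
```

The dithered patch averages land far closer to the source gradient, which is what your eye integrates when it blends neighboring pixels and frames.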


[Refresh/Input Lag]
While I don't have any hardware to test with, I have eyes to see with. A lot of reviewers, including highly respected ones like Vincent from HDTV Test, did their input lag testing using a device that works through HDMI (which, to his credit, he mentioned as a limitation in his ability to fully test the unit), so they did their comparison at lower than the native refresh rate. This monitor is designed to be used through DP1.4 on the G-Sync Ultimate module. When testing games and switching from 144Hz to 175Hz at the same in-game FPS (around 80 in my test), I saw a very noticeable difference in smoothness, which I attribute to the response time/input lag of the monitor. It was evident when moving the mouse back and forth quickly and repeatedly. It should be something anyone can test and feel, and not entirely subjective like the "sound stage" and "timbre" and "fullness" you may hear about speakers/headphones on audiophile forums (sorry not sorry if that's triggering for some of you). So keep in mind that the actual response time/input lag should be better than what most reviews reported if they didn't do their testing at 175Hz through DP, and make sure you always stick to 175Hz.
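
For a sense of scale, the difference in refresh interval alone is small but real:

```python
# Time per refresh at each rate; at 175Hz a finished frame can reach the
# panel up to ~1.2ms sooner than at 144Hz, independent of the game's FPS.
for hz in (60, 120, 144, 175):
    print(f"{hz:>3} Hz: {1000 / hz:6.2f} ms per refresh")
```

That ~1.2ms is on top of whatever the panel gains in scanout and response time, which may be why the difference is feelable even at a capped 80 FPS.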

[Changing Color space/Gamma]
This monitor ships with ALL STANDARD PRESETS locked to DCI-P3, and those modes do NOT have gamma control, outside of the "Dark Stabilizer" feature which, from my understanding, just changes the bottom of the curve to prevent black crush. Again, I'm not an expert. But DCI-P3 can have oversaturated and exaggerated colors, apparently even more so in SDR. There are 2 ways to fix this:

1) Use the "Creator Mode" preset and you'll be able to change the color space between DCI-P3 and sRGB. You'll also be able to change gamma between 1.8 and 2.6. Again, not an expert, but I don't believe these gamma values truly reflect what you'd expect. Regardless, it's there, and it does change the gamma.

2) There is an app on GitHub called novideo_srgb. It uses an existing, unused Nvidia API to convert color information before it's sent to the display, supposedly without any performance cost. There is a toggle that allows you to clamp colors, so you can stay on a DCI-P3 profile and use this to clamp down the oversaturation/exaggeration of the colors. It's as simple as the screenshot below.

Download here: GitHub - ledoge/novideo_srgb: Calibrate monitors to sRGB or other color spaces on NVIDIA GPUs, based on EDID data or ICC profiles
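
For the curious, the heart of a gamut clamp like this is a single 3x3 matrix applied in linear light (the real tool also handles the gamma decode/encode through GPU LUTs). A minimal sketch of the math, using the standard sRGB and DCI-P3 D65 primaries rather than anything read from this monitor's EDID:

```python
# Derive the 3x3 matrix that maps linear sRGB content onto a P3 panel so
# sRGB colors land on their correct (less saturated) P3 coordinates.
import numpy as np

def rgb_to_xyz(xy_r, xy_g, xy_b, xy_w):
    """Linear RGB -> XYZ matrix from chromaticity coordinates."""
    def xyz(x, y):
        return np.array([x / y, 1.0, (1 - x - y) / y])
    prims = np.column_stack([xyz(*xy_r), xyz(*xy_g), xyz(*xy_b)])
    return prims * np.linalg.solve(prims, xyz(*xy_w))  # scale to white point

D65 = (0.3127, 0.3290)
M_srgb = rgb_to_xyz((0.640, 0.330), (0.300, 0.600), (0.150, 0.060), D65)
M_p3 = rgb_to_xyz((0.680, 0.320), (0.265, 0.690), (0.150, 0.060), D65)

clamp = np.linalg.inv(M_p3) @ M_srgb   # apply per pixel, in linear light
print(np.round(clamp, 4))
```

The off-diagonal terms mix a little of each primary back into the others, which is the desaturation you see when the clamp is enabled.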



[Monitor Driver/ICC Profile]
Windows will automatically queue up a driver install for this monitor. When the driver is installed, it also installs an ICC profile from Dell. According to Neowin's article (Review: Meet the world's first 34" QD-OLED ultrawide monitor (AW3423DW) by Dell Alienware), this ICC profile is garbage, and it could be part of the reason I hated the monitor when I first tested it. So it's recommended to uninstall it. In Windows 11, it can be done like this:

- Settings -> Display -> Advanced Display -> "Display Adapter Properties for Display 1 (or whatever number)" -> Color Management tab -> Color Management button -> Tick the box that says "Use my settings for this device" -> Highlight the ICC profile -> Click Remove -> Continue.

PLEASE NOTE: EVERY WINDOWS UPDATE WILL REINSTALL THIS ICC PROFILE. So this process will have to be repeated. Not sure if there's a way to install a neutral ICC profile and set it as default to prevent this from being an issue after every single Windows update, which can be quite frequent. If you have any tips or tricks here, please share.
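
If you want to check whether the Dell profile has reappeared after an update, the installed profiles can be listed from the standard Windows color store; the path below is the default location, adjust if your install differs:

```python
# List installed ICC/ICM profiles from the default Windows color store.
from pathlib import Path

color_dir = Path(r"C:\Windows\System32\spool\drivers\color")
for profile in sorted(color_dir.glob("*.ic[cm]")):
    print(profile.name)
```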
[Image: Color Management dialog from a different monitor/PC, just to illustrate where you need to go]


[Calibration/Color Correction]
This part is going to be highly subjective and contentious. In order to get the monitor to look how I wanted, I ended up using the Game 1, Game 2, and Game 3 preset options. They don't let you change color space or gamma, but they do let you manually adjust RGBCMY colors. I used this to make the image more punchy without being overly saturated.

Please note: once HDR is enabled, you can no longer change the "Brightness" setting in the monitor OSD. So, for my usage, I set the Brightness setting to 100% for all 3 game modes in advance. You'll still be able to change the contrast as well as the RGBCMY color settings later, in HDR, while in game. I set progressively increased color levels across the three profiles because some games really needed/benefited from it, while it could make other games look bad. Again, this is all personal preference, but this is basically how you'd manage it to get the look that makes you happy.

[VESA DisplayHDR400 vs HDR1000]
From my observations, HDR400 is a lot more stable, particularly on the desktop. Online reviews have shown the same thing, including an oddly higher full-screen brightness with HDR400. But since you're dealing with a more limited range, specular highlights in games won't pop quite as much. So I'm currently using HDR1000 for all my gaming, and I just disable HDR on the desktop if I'm going to be doing any work.

The HDR400 and HDR1000 profiles also each carry their own contrast/brightness/color settings, so switching back and forth between them is likely not going to be an ideal solution. Pick one and stick to it, especially because switching between them actually activates an entirely different monitor profile. As far as your computer is concerned, you've now plugged in a brand new monitor, meaning any settings, including custom resolutions, don't carry over. Also, if you have an AVR connected to your PC like me, Windows will switch to it as your main display; I had to connect with TeamViewer from my phone to fix that.

[Image Quality]
Once more, a subjective area. Compared to my PG27UQ (4K 27" HDR1000/QD LCD/FALD/144Hz) there was a massive drop in clarity and a massive increase in jagged edges. This makes sense, as the PG27UQ has a PPI of 163 while this monitor has a PPI of 109. I was going to return it, until I learned to use a mix of DLDSR and DLSS. Running DLDSR 2.25x with DLSS Quality results in a significant drop in performance, but it also makes the image absolutely pristine. You can also get away with DLDSR 1.78x with DLSS Performance, which is a less clear image but a good trade-off between image quality and performance: textures are improved and distant objects become more visible, while jagged edges are greatly reduced.
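
The PPI figures and DLDSR render resolutions are easy to sanity-check:

```python
# PPI = diagonal pixels / diagonal inches; DLDSR factors multiply total
# pixel count, so each axis scales by sqrt(factor).
from math import hypot

def ppi(w, h, diagonal_inches):
    return hypot(w, h) / diagonal_inches

print(f"AW3423DW 34in 3440x1440: {ppi(3440, 1440, 34):.0f} PPI")   # ~110
print(f"PG27UQ   27in 3840x2160: {ppi(3840, 2160, 27):.0f} PPI")   # ~163

for factor in (1.78, 2.25):
    s = factor ** 0.5
    # NVIDIA rounds to standard timings (e.g. 1.78x is listed as 4587x1920)
    print(f"DLDSR {factor}x renders at about {round(3440 * s)}x{round(1440 * s)}")
```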

Currently, depending on game and performance headroom, I experiment with different levels of DLDSR with DLSS and I've been very happy with the image presentation when running both at max. This gives me great hope for the ability of this monitor to continue delivering top notch visuals when next gen video cards come out.

Important to note: I couldn't imagine keeping this monitor and using it without DLDSR. There is too great a loss of visual detail, and tons of aliasing, compared to my previous monitor, and even compared to my 77" OLED, which technically has worse PPI. Due to the distance I sit from it, and perhaps other factors, 4K at 77" from a couch looks much better than 3440x1440 at 34" if you're not using DLDSR.

[Anti-Glare Coating Controversy]
It's important to note that the coating, which looks glossy, can lighten to gray (similar to how most plasma TVs looked) when bright light hits it at certain angles. I have 6 LED light bulbs on the ceiling in my media room, and none of them cause that to happen, so I can have all my room lights on and not have an issue. But if I turn on a light in the hallway directly behind my monitor, it instantly makes the black panel turn gray. Even in that situation, I haven't noticed a change in contrast levels in most gaming, as long as the brightness of the panel is high enough. It could be a problem if I were consuming content that relied on a lot of pitch-black scenes without any bright elements forcing your pupils to adjust.

[Text Quality/Fringe]
Due to the diamond subpixel layout, and Windows not being designed to render for such an unusual format, text can look quite poor. Display resolution scaling can help, and so can ClearType, but it still won't be as good as an IPS/VA/WOLED panel. Several people have mentioned using an app called Better ClearType Tuner, which you can download from here: GitHub - bp2008/BetterClearTypeTuner: A better way to configure ClearType font smoothing on Windows 10. It may help, but remember that none of these tools are designed around the diamond subpixel layout of this panel, so they're basically "hacks" that may be somewhat effective. A full solution won't become available unless Windows updates for it. Using a display scaling of 125% makes readability acceptable for me, but your mileage may vary.
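
For what it's worth, tuners like this work by flipping the stock Windows font-smoothing settings. You can inspect your current state with the standard registry values (value names per the stock Windows layout; treat them as an assumption to verify on your machine):

```python
# Read the current font-smoothing settings that ClearType tuners modify.
import winreg

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop") as key:
    for name in ("FontSmoothing", "FontSmoothingType", "FontSmoothingGamma"):
        try:
            value, _ = winreg.QueryValueEx(key, name)
            print(f"{name} = {value}")   # Type 2 = ClearType, 1 = standard
        except FileNotFoundError:
            print(f"{name} not set")
```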




Regarding the pink/green lines around high-contrast edges that have been reported... honestly, I haven't seen them in my day-to-day use. I'm sure they're there. I took a close-up picture of a car in Cyberpunk 2077 to show the bad aliasing to a friend, and in the close-up saw these pink lines that I couldn't actually see otherwise. My general opinion is that if I haven't noticed them from a normal distance during normal use, I'd like to keep it that way. I don't want it to become one of those things that, once seen, can't be unseen.

[Conclusion]
Good monitor if you're coming from 4K but only if you can afford to run DLDSR. If you're already on a 27" 1440P monitor, then you're accustomed to this PPI and the only downside will be a reduction in text clarity. Beyond that, colors and brightness are superb and much better than on my LG OLED, as is the perceived responsiveness even at lower FPS (while running the 175Hz refresh rate). I found that I could reliably play Cyberpunk 2077 locked to 65fps, without any of the delay/lack of responsiveness/target tracking issues I'd normally have at that frame rate.

So overall...it can be a great monitor. It just takes a lot of tweaking to make it work, imo. Will update/change/add as required and based on interest.


[External Reviews]

There are other reviews, but they're either very basic and lacking, or have problematic testing (for example, using HDMI test equipment on a panel that requires DP). I tend to stick to TFTCentral and RTINGS, but if you've come across another noteworthy technical review of the monitor that you think would benefit people, please share and I'll add it here.
 
#5 ·
The Nvidia G-Sync module 2.0 is way too expensive. I'm glad Alienware dropped it to make this somewhat affordable. I am currently rocking a G9, a G7, and a 34-inch ultrawide from Acer. I'm happy with them so far but could scoop this up later in the year to replace my 34-inch.
 
#6 ·
Arguable decision. I understand why they did it from a mass-market sale price position. But speaking just for myself: if I'm paying $1300 for 175Hz 8-bit because motion clarity, response time, and image quality are important to me, would I not pay an extra $300 if it were 240Hz 10-bit? The fact that they're backordered 3 months indicates that demand is well above current supply, so they could have gone with a higher price point and made a more compelling premium top-end product that didn't come with caveats like "but if you run 175Hz it's 8-bit, so stick to 144Hz."
 
#8 ·
(Quoting the [Image Quality] section from the OP above.)
This is something that particularly interests me. I've been considering upgrading from 38" 3840x1600 to 40" 5120x2160 in the future, since I've been sitting at ~110 PPI for about 7 years now and I certainly don't find it to be enough without heavy AA or downsampling. But given how unlikely gaming 40" 5K2K monitors are to release in any kind of reasonable future, and how a lot of the older games I play would become unplayable above 110 PPI due to the interface getting too small, I think I'm going to have to stay with 3840x1600, especially now that DLDSR is available.

I was already using DSR in almost every game, combined with DLSS, but it had a lot of drawbacks, like loss of clarity and poor scaling of fonts and text, so it was more like a heavy third-party forced AA solution with some notable drawbacks rather than a resolution increase. With DLDSR things are a lot different: it has none of those issues, and it's actually so good that even fonts and UI look better, so it very much feels like a real resolution increase. But I haven't compared a native 4K screen with 4K DLDSR on 1440p yet. From what you're saying, does 4K DLDSR on a 110 PPI 1440p screen get you close enough to native 160 PPI 4K? Say that 1440p is 66% quality and 4K is 100% quality; how much would you give to 4K DLDSR on a 1440p display? It's certainly a huge increase over native 1440p, but I'm wondering where it slots in exactly vs native 4K.
 
#9 ·
I would say it depends on what aspect of image quality matters to you most. For me, jagged edges and shimmering completely ruin the immersion; I would rather have a softer image with no aliasing than a sharper image with aliasing and shimmering. Pixel density/PPI is definitely underrated, because beyond the effect it has on edges and aliasing, it can also make textures look "real," especially with HDR. There will be more detail present than your eye can see, which is natural for our vision: you can move closer to the monitor and reveal even more detail that your eye wasn't able to clearly pick up at normal viewing distance. Whereas on this 109 PPI monitor at the same distance as my 163 PPI monitor, my eyes can clearly tell that I'm looking at a representation of an object. It can look good for an image, but my eyes will never see something on it and think "wow, this looks real."

What DLDSR and DLSS can do is enable the monitor to show images as cleanly as possible at that resolution, without the downfalls normally present with on-the-fly rendering (gaming) at that resolution. You use it to make textures present themselves at their best, up to the maximum that 109 PPI can resolve/display, to hide the aliasing and shimmering, and to reveal fine detail that would otherwise be missing at a distance. Basically, you try to get it closer to what a CGI-rendered movie would look like on that 109 PPI 1440p display. But it's no different than watching a 4K movie on a 1440p display: it won't have aliasing or shimmering even at 1440p, but it's still 1440p, not 4K, and that movie will look much better on a 4K display.

The only reason to consider this lower resolution monitor and use it with DLDSR + DLSS instead of sticking to a high PPI 4K display is simply because of the other features it has. Such as the amazing pixel response times and contrast ratio. So even though DLDSR + DLSS helps improve the image quality greatly, it's still a very clear and obvious sacrifice compared to my native 4K panel in terms of resolution/clarity/detail. But I'm giving that up for better motion, lower input lag, as well as higher contrast, and lack of FALD bloom which was especially bothersome in dark scenes.

Personally, I can say that I will definitely be upgrading to a ~38" 5040x2160 ultrawide model if/when one comes out, as that will give a larger screen area as well as a PPI of 143, which is 32% higher than this monitor and only 12% lower than my PG27UQ. But in the meantime, this is the best/only option available.
 
#11 ·
As I commented in the other thread, this monitor has some pretty wicked VRR flicker during unstable framerates. It also has some fundamental scan-like flicker and I occasionally pick up a kind of flicker or shimmer in greys. It's particularly noticeable if you set the refresh rate down to 50-60Hz (as an experiment).

Blacks and generally dark colors are great coming from a prior 34" UW 3440x1440 IPS panel, but the tradeoff appears to be some kind of flickering.
 
#12 ·
The major warning is that TÜV certification only gives this monitor Eye Comfort for low blue light, but not for flicker-free.

Even worse, due to the low average brightness, it has to be used with dim ambient light.

The combination of these two leaves it in a grey area as to whether long-term use results in eye strain or damage.

I don't recommend this monitor as a daily driver, especially for a gamer who uses one monitor in a basement for 3 years.

If you have multiple monitors and tend to replace them quite often, this monitor should probably be fine.

Another issue is that the monitor has less impressive HDR performance compared to true HDR 1000 monitors. HDR editing won't work well on it. It is strictly a gaming monitor, nothing less and nothing more.
 
#13 ·
(Quoting #12 above.)
Somehow it got a "flicker-free" certification. Imo, the flicker and shimmering around VRR really ding this monitor's gaming creds too. It's annoying how near-perfect it is, and then the negatives are so big that they compromise the whole experience. Nothing takes you out of the moment like having some pixels shimmer, or the whole screen flickering.
 
#18 ·
It's nice to see they are finally working on decent OLED monitors. I haven't been paying attention to computer hardware in about a year, so this monitor is a nice surprise.

It's hard to enjoy gaming on my IPS display anymore. OLED just looks so much better and I find myself putting up with controller gameplay just to play on my LG C1. Any time a game turns to night time or you enter a dark area, I cringe at the way it looks on IPS. So it's time to upgrade.

My employer has access to Dell's MPP. I called their sales team and they offered me 5% off and one-day shipping, so I placed an order. They explained that this monitor is made to order and isn't readily available. The next batch has an estimated delivery on or before August 8th.
 
#19 ·
The monitor is growing on me. But I do find that in some games, I have to manually adjust the brightness/gamma/contrast settings in Nvidia Control Panel. I don't know why it's so hard to "set it and forget it" with this monitor. My PG27UQ was perfect and rarely needed any adjustments in any games. But with this, I pretty much always have to do per-game adjustments to make it look right. Otherwise I get areas that are too blown out, or too dark, or too washed out.

I'd say that's the biggest problem with the monitor, for me. A friend of mine ended up returning his monitor.
 
#20 ·
Is it really the monitor's fault? Saying it changes from game to game seems to indicate it's not your monitor.

I've had days where I watch several hours of HDR content on my LG C1 OLED and get used to the way it looks. Then if you switch to SDR content, it's jarring how bad it looks. It looks like your screen just died and everything is faded or washed out. It's not the fault of the display; you just got used to having a larger color space and now it's gone.

This is significantly more noticeable on OLED screens than any other. I've had HDR-capable IPS and quantum dot displays, and neither was as noticeable as OLED.

Games being art rather than photos or videos is another issue. Maybe one game chose to have super vibrant colors, while another is muted and dark. So one of them looks bad to you, when it's quite possible they are both being displayed exactly as intended.

If you set your monitor to sRGB calibrated mode while playing SDR game, there will be no mistake that it's being displayed as intended.
 
#26 ·
I just received my monitor today ahead of time. Estimated was August 5th.

First impressions: this thing is a significant upgrade over my LG 27GL850 variant. I honestly can't stand using IPS monitors after owning an OLED TV, and this monitor doesn't disappoint.

Only played one round of BF 2042 so far and it felt very nice to game on. This is my first time using an ultrawide, and it brings me back to how I enjoyed 3x 1080p Eyefinity on BF3.
The only thing that kind of sucks is game UI not properly scaling. I thought Battlefield was always up to pace on ultrawide and unusual aspect ratios, but this game is deficient in a lot of ways, so I guess it's to be expected.

Are there any suggestions for which FOV setting works best on 21:9? I usually played 85-90 on my 16:9 to avoid a weird stretched/zoomed-out feeling. I've only tried 90 and 100 in BF 2042, but it doesn't seem right yet.
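
One way to translate an FOV value between aspect ratios, assuming the setting is a horizontal FOV and the game holds vertical FOV constant across aspect ratios (Vert- scaling; worth verifying how your particular game defines its slider):

```python
# Horizontal FOV on the target aspect that preserves the same vertical FOV.
from math import atan, degrees, radians, tan

def equivalent_hfov(hfov_deg, src_aspect, dst_aspect):
    return degrees(2 * atan(tan(radians(hfov_deg) / 2) * dst_aspect / src_aspect))

# 90 degrees horizontal on 16:9 works out to ~107 degrees on 3440x1440
print(f"{equivalent_hfov(90, 16 / 9, 3440 / 1440):.0f}")
```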
 
#31 ·
I just got mine and love it. I knew about the VRR OLED flicker beforehand (which every OLED does, and it just doesn't bother me as much as blooming). Coming from the Predator X27, it feels like a huge upgrade, and that was a pretty good display for what it was.
 
#32 ·
(Quoting the [Refresh/Input Lag] section from the OP.)
It seems like this is a great thread with some experienced users. If anyone has time to comment, please see my thread on some issues I'm experiencing with the AW3423DW. It seems that the input lag sucks on this monitor, or I'm using the wrong settings. And yes, I am playing at max 175Hz. Not sure if there is any other way to reduce input lag.


 
#33 ·
Hello all,

Long time since I visited the forum. I recently purchased an AW3423DWF (I know this is the DW thread) and I was wondering if the pixel refresh on my monitor is behaving normally. If I let it sleep, it will surely do the pixel refresh when needed (after 4+ hours of use) and I see the green light blinking. But does this also work if I turn it off with the ON/OFF button? I am not used to letting the monitor idle on standby 24/7; I'd rather turn it off when I go to sleep. But I haven't seen it do the refresh (or at least the green light does not start flashing) if I turn it off, even at 4+ hours of usage after the last refresh.
 
#34 ·
(Quoting #33 above.)
You mean the button in the right corner of the screen, right? If so, yes, that's fine, but that puts it in a low-power standby mode like most modern electronics.

If you turn it off via a power strip or unplug it, then it will NOT be able to do the pixel refresh. The standby power draw is worth it to avoid issues down the line from skipped pixel refreshes. That's why a lot of OLED TVs in stores end up with more burn-in than you'd get at home: instead of turning them off like a normal user, stores hit a big breaker or switch that cuts power to all of them at once.
 
#38 ·
Just wanted to add that I do have image retention on my display now. Ability/skill buttons from a game I play way too much. I never turn the monitor off and let it go into standby on its own so it can do its pixel refresh. Not as impervious to image retention/burn in as one would expect based on the marketing and warranty. Not bad enough to swap it out under warranty yet.
 
#39 ·
What HDR settings did you use? I don't know for sure, but I would suspect that HDR1000 would be much more prone to burn-in than HDR400, as you would be driving those pixels to far higher levels than the HDR400 profile would.

Also to note: if you are just letting the display go into standby mode, you aren't letting the display do its pixel refresh, unless your display's firmware behaves differently than mine. I've noticed standby mode does not activate the pixel refresh cycle; only hitting the power button does, or selecting the refresh cycle when the notification pops up.
 
#44 ·
I just got the DWF version of the monitor, and I want to know what I should do for maintenance. I usually let my computer turn off after 15 minutes of no use. Is that OK? Also, how do I check for things like dead pixels?
 
#45 ·
There are so many settings on this thing. I am so confused. Everyone who has this monitor has Windows 11, while I'm here with Windows 10 trying to figure out how to change my settings.
 
#46 ·
I use my 34" ultrawide as my primary work monitor. Typically there would be the Windows task bar, Teams, and WhatsApp largely static, as well as a good chance of browser title bar and/or Excel assets being displayed for 8-10 hours daily. It seems like this monitor has some sort of technology built-in to try to prevent burn in, but does anyone has personal experience with long-term static images in something like a WFH scenario who can share their experience? It seems like for $1,000 the OLED upgrade is worth it over the best high-refresh IPS panel for color, image, and refresh rate/motion. The only real hesitation I have (other than saving up) would be the thought that it will spend 50 hours a week largely displaying static content.
 
#47 · (Edited)
I'm debating between the DW and the DWF. Besides the 10Hz difference and firmware upgradeability, is the G-Sync module in the DW worth the premium?
 
#48 ·
To me it depends. From testing, the original DW has better HDR tone mapping, etc., but whether that matters to you is debatable. It depends on the price difference to me.

If you have an Nvidia GPU, I'd go with the DW.

If you have AMD, go with the DWF.
 
#55 · (Edited)
PSA for the DW

I love this monitor and have used it for around a year now. However, as some of you know, it would sometimes intermittently not turn back on out of sleep/standby. The running theory is that the fix is a firmware update, which cannot be applied locally (allegedly due to the G-Sync Ultimate module), so an RMA with Dell is needed. I've spoken briefly about my experience in this thread, and various people in the later pages of that thread have reported that the new firmware seems to fix it.


After getting frustrated by needing to unplug and replug the power cable several times a week, and reading in that thread that the alleged resolution is the new firmware, I decided to pull the trigger with Dell Care yesterday. After a lengthy conversation, I managed to get a new unit sent out, which arrived today. Talk about a quick turnaround (I'm in Melbourne, Australia)...

My previous defective unit's firmware was M0B102, with a manufacture date of May 2022.
The new monitor arrived with firmware M0B204 and a manufacture date of June 2023.

The jury is still out, but I've yet to experience the issue today... I suspect this probably started happening after the first full cycle of the pixel/panel refresh was done. Just my theory, but anyway, fingers crossed.
 
#56 ·
Hi
Bought this AW3423DW today
Production date August 2023
Does this mean it must have the newest firmware?

And by the way what is the latest firmware?

I did not open it yet, but I'm excited to do so.

I am returning the LG OLED C3; the main issue was the relatively low 120Hz refresh rate, and 4K is difficult to drive at high fps. But I was otherwise OK with the TV.

I have a 3080ti