July 1, 2022 at 10:13 am #68937TetriX
Hi there,
I currently use the LG 34GP83A, which for me replaced the older LG FreeSync UltraWide you reviewed. I really like the 21:9 aspect ratio and I’m happy with the pixel pitch. But your review of the Alienware has me really salivating at the thought of a QD-OLED version. That contrast. Those viewing angles and colours. The responsiveness. Give it to me! 🙂
I understand MSI has their own version coming out as well, the MEG 342C QD-OLED. I’m wondering how you think this will compare to the AW3423DW? I know only the Alienware has G-SYNC and the MSI is FreeSync with G-SYNC support, but does this really matter? At this point I suppose I would go for whichever I can get and would be super happy with either model. Maybe later this year when I’m ready to make the switch. I know Samsung is supposed to have their own version (Odyssey G8QNB?) but I don’t really trust their quality control and just personally prefer the other manufacturers.
Any advice you can give? Thanks!
July 1, 2022 at 10:47 am #68939PCM2Hi TetriX,
I’m glad you’re enjoying the 21:9 experience so much. That’s quite some dedication to want to go that route for your third monitor, but it just goes to show some people really do enjoy 34″ ultrawides and are happy to stick with them. I agree that QD-OLED would be a significant upgrade for you from your current IPS model. You can expect much stronger contrast (especially in dimmer lighting), more direct emission of light from the glossy screen surface, improved viewing angle performance and colour consistency, plus of course upgraded pixel responsiveness. And far superior HDR performance, which brings many of these advantages together along with higher peak brightness. Of course you’ve meticulously read and watched the review, so you’ll know all that. 🙂
As for how the AW3423DW would compare to the MEG 342C, it’s tough to say without testing the latter, and with it not yet released there’s also a lack of data from other reviews and users. In fact they don’t even have a full spec sheet or product page available yet, which is why we don’t currently have a news piece up for this one. Having said that, you can certainly expect many more similarities between the two than differences. There are some obvious aesthetic differences – you may prefer the contrasting black and white look of the Dell Alienware or alternatively the black with golden touches of the MSI. You can of course see the Alienware from various angles in the review, with the MSI news piece showing a front and rear shot of the MSI.
You’re correct that the AW3423DW is a G-SYNC Ultimate model, which includes a G-SYNC module that can be used via G-SYNC as well as Adaptive-Sync (including FreeSync). The main advantage people would tout for this on an LCD is variable overdrive, with careful re-tuning and optimisation for a range of different refresh rates to avoid excessive overshoot as the refresh rate (and hence frame rate, under VRR) drops. OLEDs are natively extremely responsive, so this shouldn’t have an effect and isn’t really an advantage for one or disadvantage for the other. VRR flickering on VA models is generally lower with G-SYNC modules as well due to tighter voltage regulation. But the AW3423DW still had a form of VRR flickering even with the module – I wouldn’t expect the MSI to be better or worse in that respect (edit: but it could be) and this flickering isn’t intense or frequent enough to bother everyone. Some other slight and to some entirely negligible advantages of the G-SYNC module are covered in the review. There’s seamless operation throughout the VRR range, without a defined LFC (Low Framerate Compensation) boundary – which is likely to be 48Hz on the MSI. There’s slight momentary stuttering if that boundary is crossed in either direction. The module also reduces some instances of micro stuttering, but that’s anecdotal, highly subjective and can also be caused by various things which G-SYNC can’t ‘fix’.
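If it helps to picture what that boundary involves, here’s a rough sketch of the frame-repeat logic an LFC-style implementation generally follows. This is purely illustrative – the 48Hz floor and 165Hz ceiling are assumed figures rather than confirmed specs for the MEG 342C, and in reality the GPU driver handles the hand-over with more finesse than this.

```python
# Illustrative sketch of LFC (Low Framerate Compensation) behaviour.
# The 48-165Hz VRR window is an assumption, not a confirmed spec for the MEG 342C.

VRR_MIN_HZ = 48
VRR_MAX_HZ = 165

def lfc_refresh(frame_rate_fps: float) -> tuple[float, int]:
    """Return (panel refresh rate, frame repeat count) for a given frame rate."""
    if frame_rate_fps >= VRR_MIN_HZ:
        # Inside the VRR window: the refresh rate simply tracks the frame rate.
        return min(frame_rate_fps, VRR_MAX_HZ), 1
    # Below the floor: each frame is sent multiple times so the panel's
    # refresh rate stays inside its supported window.
    multiplier = 2
    while frame_rate_fps * multiplier < VRR_MIN_HZ:
        multiplier += 1
    return frame_rate_fps * multiplier, multiplier

for fps in (30, 40, 47, 48, 100, 200):
    hz, repeats = lfc_refresh(fps)
    print(f"{fps:>3} fps -> {hz:.0f}Hz refresh ({repeats}x frame repeat)")
```

The momentary stutter people report is that hand-over around the floor: one moment the panel is tracking the frame rate directly, the next it’s repeating frames, and the transition isn’t always seamless on models without the module.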
Some advantages I can see for the MSI, as it lacks the module, include an assumed lack of cooling fans, which I know will please some people. OLED screens don’t pump out enough heat to require active cooling like this, but the G-SYNC module does include such measures. It also has the potential to be a bit cheaper due to the lack of module, though I don’t expect them to throw in a 3-year warranty with ‘burn-in’ cover like Dell did. Incidentally, I still haven’t had any particular issues of that nature with the AW3423DW despite some extensive use of the monitor after the review. Some MSI models also have multiple gamut emulation settings (not just sRGB but also DCI-P3 and Adobe RGB), but whether that’s implemented on the MEG 342C and how much flexibility you’d have with such settings remains to be seen. So overall, I’d say ‘wait and see’ and perhaps go for whichever is available to you or depending on your local pricing. As you say, you’re likely to be happy with either model.
July 1, 2022 at 5:16 pm #68947TetriXWow. Thanks for such a detailed reply! I will heed your advice and base this decision on pricing and which model I can actually get later in the year. Do you plan to review the MSI?
July 1, 2022 at 5:21 pm #68949PCM2It’s a possibility, but as usual it depends. It will need MSI to provide a sample, which hopefully they can at some point, and the monitor will also need to be readily available. I did things a bit differently with the AW3423DW as I purchased the monitor myself with the intention of using it as one of my own monitors if I liked it enough (I did). I wouldn’t usually focus on models which can’t be purchased in a way that supports our work, either, but in this case I had a strong personal interest in the monitor and, with it being the first showcase of a new technology, felt it really needed to be reviewed. 🙂
July 6, 2022 at 1:05 pm #68983Nikorasu081One thing to note is that MSI is a heavy player in the console marketplace. MSI has a console mode which basically takes a 4K signal and is able to downscale it to 1440p. This will be perfect for me as someone who also has a PS5, which doesn’t support 1440p natively. https://www.msi.com/blog/msi-monitor-console-mode-designed-to-answer-your-concern
July 6, 2022 at 1:10 pm #68985PCM2That’s a good point. It’s likely the MEG 342C will include a ‘4K’ UHD downsampling mode, which has been included on many of MSI’s other models including ultrawides. Not only would it be useful for the PS5 which can’t leverage a 1440p signal, but also for the Xbox Series X which can only run under HDR at the ‘4K’ UHD resolution. So users of that console could enjoy the HDR capabilities of the monitor, which isn’t possible on the AW3423DW.
September 30, 2022 at 5:20 pm #69495PCM2Another interesting option to be released in the near future is the Dell Alienware AW3423DWF. Like the MSI, this one features Adaptive-Sync support without a G-SYNC module, so it does have some disadvantages compared to the ‘DW’ as covered in my first post in this thread. On the plus side it is thinner and lighter than the AW3423DW, is cheaper and shouldn’t have an active cooling solution. There are some aesthetic changes such as darker matte plastics (‘Dark Side of the Moon’) in place of lighter matte plastics (‘Lunar Light’), and it seems the ‘AlienFX’ LED ring has been dropped in favour of an illuminated ’34’. It has 2 x DP 1.4 and 1 x HDMI 2.0 inputs, whereas the ‘DW’ features 1 x DP 1.4 and 2 x HDMI 2.0. It tops out at 165Hz rather than 175Hz, but this shouldn’t make a significant difference in practice as it’s only 10Hz. It also accepts a ‘4K’ UHD signal at 60Hz (downsampling), so you can run the Xbox Series X with HDR, whereas you can’t do that on the ‘DW’. 🙂
October 24, 2022 at 5:16 pm #69710PCM2Another model worth a quick mention here is the Philips Evnia 34M2C8600. It doesn’t currently have an official product page up, but it will be using the ~34″ QD-OLED panel and will be styled similarly to the 34M2C7600MV (Mini LED ultrawide model). The RRP I have in the UK for the Philips QD-OLED model is £1479.99, which seems a bit on the high side in my view, based on my understanding that it will also be one of the models without a G-SYNC module. We’ll see if there are any special features or surprises when the full product page and specs are available.
December 15, 2022 at 11:22 am #70203PCM2It has been confirmed (refer to Monitors Unboxed review) that the AW3423DWF does actually include an active cooling solution. But apparently it’s very quiet and unobtrusive – though personally I have no issues with the fans of the AW3423DW, either, and my preference would be for them to be eliminated completely if anything. The Monitors Unboxed review also confirms a reduction in input lag, with signal delay reduced from ~4.7ms to ~0.3ms. Based on feedback it seems most gamers are actually perfectly happy with that aspect of the G-SYNC variant, but for those sensitive to this aspect it’s nice to see it minimised.
Aside from the differences already mentioned, the performance of the two models seems very similar, and slight differences are within the variation expected between different units of the same model, let alone different models. An exception is the inclusion of a greater number of HDR modes to choose from (most of which just upset the image and go against the intended output). The ‘HDR Peak 1000’ mode was worse calibrated on the ‘DWF’ than the ‘DW’ in Tim’s testing, often too bright, but Dell were made aware of this and will hopefully address it in a firmware update.
January 20, 2023 at 3:37 pm #70376PCM2VRR flickering appears to be more of an issue on the ‘DWF’ and the Samsung Odyssey OLED G8 than on the ‘DW’, according to early reports. Some have also observed ASBL (Automatic Static Brightness Limiter) behaviour on the G8 OLED which the Alienware models don’t show. That means the screen may dim if a sufficient amount of static content is detected, which can often occur on the desktop – it apparently still maintains ~150 nits in its dimmed state.
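To illustrate the behaviour being described, here’s a very rough sketch of how an ASBL-style limiter works. Only the ~150 nit dimmed level comes from those reports – the trigger time, normal brightness and ramp rate below are assumptions purely for illustration.

```python
# Rough illustration of ASBL (Automatic Static Brightness Limiter) behaviour.
# Only the ~150 nit dimmed floor is from user reports of the G8 OLED; the
# trigger time, starting brightness and ramp rate are assumed for illustration.

STATIC_TRIGGER_S = 60        # assumed: static content time before dimming begins
NORMAL_NITS = 250            # assumed: typical SDR brightness in use
DIMMED_FLOOR_NITS = 150      # reported dimmed level
RAMP_NITS_PER_S = 5          # assumed: gradual dimming rate after the trigger

def asbl_target_nits(static_seconds: float) -> float:
    """Return the luminance target given how long the image has been static."""
    if static_seconds < STATIC_TRIGGER_S:
        return NORMAL_NITS
    dimming = RAMP_NITS_PER_S * (static_seconds - STATIC_TRIGGER_S)
    return max(DIMMED_FLOOR_NITS, NORMAL_NITS - dimming)

for t in (10, 60, 75, 90, 300):
    print(f"static for {t:>3}s -> ~{asbl_target_nits(t):.0f} nits")
```

Once the content changes the static timer resets and brightness returns to normal, which is why this behaviour tends to be most noticeable on the desktop or with other largely static content.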
February 20, 2023 at 2:11 pm #70558PCM2News piece for the MSI MEG 342C QD-OLED now available, based on MSI’s product page.
March 16, 2023 at 11:21 am #70985Miracle MikeHi!
I’ve been using the AW3423DW for about two months now, and yesterday, after doing some research about the “G-SYNC Ultimate” certification which that monitor has, a question came to my mind:
Is the HDR implementation (pipeline?) different when using G-SYNC vs G-SYNC off? I was trying to test this yesterday with my RTX 4090 and the HDR Peak 1000 mode on the monitor by switching G-SYNC on and off while playing Witcher 3, and I swear I noticed differences in bright highlights, e.g. torches. The differences were quite minor but still there.
Has anyone else noticed this or am I just imagining things? I probably am but I couldn’t find really any actual tests about this subject. So the main question is:
Does the monitor or the gpu handle HDR differently while using G-sync?
March 16, 2023 at 11:45 am #70987PCM2The HDR tuning on the AW3423DW should apply in much the same way whether G-SYNC is used or not, and whether you’re using an AMD GPU (with or without Adaptive-Sync active) for that matter. I haven’t observed or measured any specific differences. There may be some slight changes in ‘gamma’ or shade brightness with VRR active and doing its thing – the same ones which cause ‘VRR flickering’. On the AW3423DW these are focused on dark to medium shades and shouldn’t affect the highlights themselves or how the monitor displays them, though there could technically be some perceived differences related to that.
March 24, 2023 at 10:22 am #71090Miracle MikeHi,
Okay yeah I thought so since it would really not make much sense for the HDR tuning to be any different depending on the VRR solution.
I also noticed that the flickering I have been getting has a lot to do with the HDR implementation on Windows 10. For some reason, the HDR picture flickers more if you don’t turn the Windows HDR setting off every time you shut down your PC… Also, I think the original “issue” in my case was just down to the perceived smoothness and image clarity that G-SYNC allows compared to G-SYNC off.
Thanks for your reply!
March 26, 2023 at 6:54 am #71100dr mepeGreetings,
I’ve been looking at various monitors that use this QD-OLED panel. Right now I’m testing out the Samsung S34BG85.
This S34BG85 model is advertised as “G-Sync compatible” but with the latest drivers on a GTX 1080 Ti over DP 1.4, Nvidia Control Panel lists it as “not validated as G-SYNC Compatible.” This was corroborated when I found the G-Sync Pendulum test to be a stuttery mess. No settings I’ve found in the monitor OSD seem to enable VRR / G-Sync compatibility.
Is this a typical experience of ‘G-Sync compatible’ monitors using this panel? If so, I would expect this issue with the MSI and also the Philips. This particular problem is making me lean towards the Alienware DW model for its G-Sync Ultimate certification.
March 26, 2023 at 7:08 am #71103PCM2Hi dr mepe,
The Samsung OLED G8 is not certified as ‘G-SYNC Compatible’, but that doesn’t mean it shouldn’t give a decent experience whilst using it. Samsung ironically went all out with their ‘FreeSync Premium’ certification and branding, although their monitor gives a weak HDR experience for AMD users (as confirmed by Monitors Unboxed). When I had a GTX 1080 Ti I used to have various issues with ‘G-SYNC Compatible’ operation, but these were greatly reduced when I upgraded to an RTX 30 series GPU. Are you sure you’ve activated the technology fully (ticked the ‘Enable settings for the selected display model’ box) and that the G-SYNC Pendulum demo has ‘G-SYNC’ ticked at the top when you run it? It’s also worth clicking ‘FPS Sliders’ at the bottom left and changing the range to something like 70 – 175fps to see if it appears smoother. It could be that the frame rate keeps crossing the LFC boundary with your settings and you’re sensitive to the momentary stuttering when it does.
As an aside, now that most of these QD-OLED monitors have been professionally or ‘user’ reviewed, I maintain my preference for the AW3423DW and I continue to recommend that model specifically. As covered further up this thread and summarised in the conclusion of the review, they should all give a rich and rewarding experience, but the module can benefit those highly sensitive to micro stuttering (not the sort of more obvious stuttering you seem to be describing), alongside some other advantages: “Whilst the G-SYNC module of the ‘DW’ Alienware model imposes some restrictions (OSD features, lack of ‘4K’ downsampling and HDMI 2.1) it also offers a slightly smoother VRR experience. In addition to the advantages noted with respect to the module in this review, the models without G-SYNC module seem to suffer from more pronounced VRR flickering and don’t have the same tight HDR tuning.” Knowing what I’m personally sensitive to, I wouldn’t swap my own AW3423DW for one of the other models – and I don’t find the apparently louder fan on this one at all annoying or generally noticeable, either, though I firmly believe Monitors Unboxed got a particularly poor sample in that regard.
March 26, 2023 at 4:24 pm #71105dr mepeThanks for the response, PCM2! I did a bit more testing after reading what you wrote.
Regarding my S34BG85:
Previously the “Set up G-SYNC” option showed in NVIDIA Control Panel (NCP). I had ticked that “Enable settings…” box you mentioned but was unable to check the G-SYNC button inside the Pendulum demo. Fiddling with the monitor OSD settings, I’m sure I’d screwed something up, because the “Set up G-SYNC” option then disappeared from NCP.
So, now I’ve done a reset in the monitor OSD which wiped all my changes (though it’s retained the software update I did). Now that “Enable settings…” box is ticked in NCP and the Pendulum demo works as expected, with the G-SYNC box checked and smooth performance. It looks fantastic, actually. This works at both 120Hz and 175Hz. I’m planning to test some games this week, and I’ll also have access to an RTX 4070 Ti to check out HDMI 2.1.

I’m still not sold on the S34BG85 for $1500. I notice intermittent flashing and flickering, especially after opening/closing a fullscreen app. Additionally, I personally find the ‘smart TV’ interface obnoxious and confusing. The former issue possibly comes back to what TetriX mentioned about Samsung quality control. The mini/micro ports are also an annoyance for me because of my particular setup.
So, I’m still looking forward to the MSI and Philips models… if they come out soon. Given what you’ve written about the Alienware DW, I reckon I should check that model out. I skipped over it in the store because the Samsung on display appeared much brighter and glossier, though there were probably ambient lighting differences and/or modified display settings.
Regarding ports:
I’m only able to achieve 175Hz if color depth is dropped to 8bpc using the provided mini-DP cable with my GTX 1080 Ti. In the computer store, they had 175Hz @ 10bpc on the displayed Samsung using the same cable plugged into a 30-series GPU. The Alienware DW model was connected via DP to another 30-series GPU but could only get 175Hz @ 8bpc (or 120Hz @ 10bpc). I’m clueless as to why this is the case.
March 26, 2023 at 4:32 pm #71108PCM2Thanks for sharing your feedback on the Samsung – glad you’ve managed to sort the G-SYNC thing, but it is indeed a shame about those other annoyances. Getting 10-bit at 175Hz requires DSC, which the AW3423DW doesn’t support (and your GPU doesn’t support, regardless of monitor). In practice this is very much a non-issue, unless you specifically need to view 10-bit content under SDR. For HDR, GPU dithering is used to excellent effect, as covered in the review, providing an effective 10-bit output at 175Hz. For SDR, the vast majority of content you consume is 8-bit and there is no advantage to running the monitor in 10-bit; it’s mainly content creators who may wish to use 10-bit there.
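To put rough numbers on that bandwidth limit, here’s a quick back-of-the-envelope sketch. The blanking totals are assumptions in the CVT-RB ballpark rather than the panel’s exact timings, and secondary overheads are ignored, so treat the figures as approximate.

```python
# Rough check of why 3440x1440 @ 175Hz 10-bit RGB exceeds DP 1.4 bandwidth
# without DSC. The blanking totals are assumptions, not the panel's exact timings.

H_TOTAL, V_TOTAL = 3520, 1506        # 3440x1440 active plus assumed blanking
REFRESH_HZ = 175

# DP 1.4 (HBR3): 4 lanes x 8.1 Gbps, with 8b/10b encoding leaving 80% as payload.
DP14_PAYLOAD_GBPS = 4 * 8.1 * 0.8    # ~25.9 Gbps

pixel_clock_hz = H_TOTAL * V_TOTAL * REFRESH_HZ

for bpc in (8, 10):
    bits_per_pixel = bpc * 3         # RGB, no chroma subsampling
    required_gbps = pixel_clock_hz * bits_per_pixel / 1e9
    verdict = "fits" if required_gbps <= DP14_PAYLOAD_GBPS else "needs DSC or a compromise"
    print(f"{bpc}-bit: ~{required_gbps:.1f} Gbps needed vs ~{DP14_PAYLOAD_GBPS:.1f} Gbps usable -> {verdict}")
```

That works out to roughly 22 Gbps for 8-bit and roughly 28 Gbps for 10-bit against ~26 Gbps of usable DP 1.4 bandwidth, which is why the choice on the ‘DW’ ends up being 175Hz at 8-bit or a lower refresh rate at 10-bit.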
April 8, 2023 at 7:37 am #71257dr mepeMSI did a stream the other day to discuss the MEG 342C: https://www.youtube.com/watch?v=EiJFfD749o8
I believe the price they mentioned was 1499EUR. It’s not out in USA yet (as far as I can tell), otherwise I’d grab one for testing.
A quick update about the Samsung S34BG85 I’ve been using. I’ve noticed that when HDMI is used, no matter what refresh rate I set, the screen intermittently goes black during games. This happens in multiple games using a 48Gbps HDMI 2.1 cable off Amazon with an RTX 4070 Ti. Additionally, DP also has issues when used at 175Hz: the bottom 1-2 rows of pixels appear to be random noise (rapidly changing rainbow colors seemingly unrelated to the displayed content). This effect toggles when a game menu is opened or closed but can also toggle during normal gameplay. The issue goes away when I drop to 120Hz. Finally, I was unable to get USB-C video to work with either a Dell XPS laptop or a Steam Deck, though I didn’t try debugging for more than a few minutes. It’s possible I didn’t configure things properly, but both devices have output video over USB-C on other setups before.
With these issues, I’m planning to return the Samsung shortly. I hope I can find the MSI model soon as I would prefer that over the Alienware for the HDMI 2.1 ports and USB-C PD.
April 8, 2023 at 7:39 am #71261PCM2That sounds very annoying. If only Samsung focused on getting the basics right instead of integrating all of this ‘Smart TV’ junk most users have no interest in!
As for the MEG 342C, it was available on Amazon briefly for ~$1100 (which is where the price given in the news piece was from) but is temporarily out of stock. I’m not sure if that was an initial promotional price or the usual US sale price.