April 28, 2023 at 7:37 am #71410 PCM2
I’m glad you’re happy with your second X32 FP. All you’re doing by adjusting the contrast and brightness in Nvidia Control Panel is displacing and distorting the monitor’s PQ curve under HDR. You may not notice negative effects on the content you’re viewing, especially if you haven’t raised things a huge amount there, but you are crushing shades together and losing shade variety by setting things above 50%. Observe Lagom’s white saturation test, for example, and eventually you’ll see things ‘blow out’ and become more blended if you increase the brightness slider sufficiently. It’s gradual, so the effect can be more subtle with a smaller change, but above perhaps 70% you might see some more dramatic impacts. The same goes for brighter shades on the contrast gradients – and watch what happens to the darker shades there if you increase the contrast slider sufficiently. When you recorded <400 nits with your SpyderX, that's because the software is an SDR application, not an HDR application. If you correctly measure HDR brightness without messing around in Nvidia Control Panel, you'll see that the monitor correctly displays things within spec – as you discovered in games.
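To put some rough numbers on that crushing, here's a minimal Python sketch of the SMPTE ST 2084 (PQ) EOTF that HDR10 signals are encoded with. The 1000-nit clip point and the 1.2× 'boost' factor are illustrative assumptions for the demonstration, not measurements of this monitor:

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF, mapping a normalised
# signal value in [0, 1] to absolute luminance in nits.
m1 = 2610 / 16384           # PQ constants from the ST 2084 standard
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Luminance in nits for a PQ-encoded signal in [0, 1]."""
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

PEAK = 1000   # assumed panel clip point, purely for illustration
BOOST = 1.2   # stand-in for an aggressive brightness slider increase

for s in (0.70, 0.75, 0.80):
    intended = min(pq_eotf(s), PEAK)
    boosted = min(pq_eotf(min(s * BOOST, 1.0)), PEAK)
    print(f"signal {s:.2f}: intended {intended:4.0f} nits, boosted {boosted:4.0f} nits")
```

Unboosted, those three signals land at roughly 620, 985 and 1000 (clipped) nits – three distinguishable shades. Boosted, all three hit the clip point and merge into one, which is the 'blow out' you'd see in the white saturation test.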
The Windows desktop is an SDR environment, outside of specific HDR applications (including some games and movies). The monitor is not supposed to be showing such content with boosted brightness levels. If you prefer that much brighter look then you can achieve it, as you've discovered, by forcing the monitor to displace its PQ curve. But the idea is that HDR is something you activate in Windows or your game when you specifically want to use HDR, not SDR. For most people the brightness the monitor provides under SDR is more than enough for normal everyday usage in a range of lighting environments. It's very unusual to want a monitor to exceed ~500 nits – most people set their monitor to 100 – 200 nits. Everybody has their own preferences and sensitivity, but much beyond that I would consider excessive unless the room is really flooded with light. Gamma representation and colour accuracy are also far from perfect when feeding the monitor SDR content under HDR – that applies to any monitor really, not just this one. So my recommendation would be to use the monitor as intended for SDR content (i.e. using SDR). Your eyes should adjust to the 'dimmer' look if you give them time, and you may notice viewing comfort benefits such as reduced eye fatigue at the same time.
April 28, 2023 at 8:48 am #71413 GroundReaper94
Good morning everyone,
First of all, thank you for your awesome work, @PCM2.
Second, I would also like to ask about the NVIDIA Control Panel method. In SDR (desktop and old games) I just stick to the SDR mode, but newer games and the PS5 look much better with proper HDR. Here is the problem: the monitor is extremely dark in HDR mode. I already read all the responses and know that the monitor tends to be more dark biased to reduce blooming. The PQ curve is therefore also a joke for small bright areas, heavily reducing the brightness of certain scenes.
I also own a Neo G8 and an AW3423DWF, and neither of those monitors suffers from this behavior. The X32 only gets brighter in the HDR test patterns, but that is not enough for me. I know that the OLED cannot keep up with larger white areas, but it holds up pretty well in comparison to the Neo G8 and X32.
The PS5 is even worse with the X32. SDR looks breathtaking, but as soon as I enable HDR, the screen gets reaaally dim – it almost looks like 100 nits or less on the home screen. The test patterns in the HDR calibration app on the PS5 look really bright, yet the end result on the home screen is a catastrophe. Some games look alright, but some are almost unplayable. The worst one so far would be Battlefield 2042. It is so dim that I can’t even identify enemy players on certain maps and in certain environments.
The NVIDIA Control Panel at least mitigates this problem to a certain degree on PC, but it also overblows certain elements of the image in-game. The best example would be the Battle Pass overview in Modern Warfare 2. The PS5 completely misses out on this fix.
How should I adjust the in-game white point with this NVIDIA method?
I am so fed up with all those monitor manufacturers.
The Neo G8 suffers from scanlines, at best average color performance, and an aggressive ABL and curve.
The AW3423DWF has a broken HDR1000 mode, only HDMI 2.0 and a potential burn-in risk. The picture is great, but I miss the 4K pixel density.
The X32 FP has broken HDR, a bad OSD and a bad local dimming algorithm, which can lead to flicker in certain games.
Sorry for the very long post, but I had to get out all the things that accumulated over the last few weeks. 😅
April 28, 2023 at 8:53 am #71417 PCM2
You’re welcome! And I feel your frustration with the compromises you have to make regardless of monitor. As I mentioned earlier in the thread, the only way the Acer X32 FP would be able to display those scenes you describe as too dull with more pleasing, uplifted brightness would be for Acer to re-tune the local dimming algorithm. You can’t overcome the algorithm by calibration in Windows – it always wins in the end. 😉 You could only make the monitor display shades brightly enough that it bright biases, but then you’d be upsetting the image elsewhere.
Again, Acer is prioritising depth and atmosphere over brightness, and that’s the compromise that has been made. With the number of dimming zones and IPS-type panel this makes sense to me, but I still feel it would’ve been nice to give the user some flexibility to change that, or perhaps just a bit less heavy-handed dark biasing for some scenes. Obviously there will be further thoughts in the upcoming review, but don’t expect any way to ‘fix’ this so it’s more to your taste, as that would be up to Acer.
April 29, 2023 at 7:26 am #71428 4KGalaxy
I also ran into similar issues with the X32 FP appearing too dark in HDR (see my earlier posts in this thread). Highlights were good, but mid-to-low brightness content was very dark and did not seem accurate. I get why Acer might have done this, to minimize blooming, but I personally thought it was too much. Entire scenes would feel incredibly under-illuminated and darker grays would kind of blend together. I did not have test equipment, so my numbers are estimates based on some brightness videos on YouTube, but it felt like the range between 1 nit and 400 nits was far too dark. Above 400 nits it seemed great. I tried for quite some time to make adjustments, as I liked certain aspects of the X32 FP, but I ultimately ended up returning the monitor due to a number of issues (including bad interlacing artifacts at 60Hz).
In the end, I decided it was too hard to find one screen that could do it all and I ended up getting a more basic 4K 144Hz monitor for computer use + SDR gaming and a small OLED dedicated to HDR content. It is not ideal but it was still more cost effective than the really expensive mini-LED 32-inch monitors.
I’ve been very frustrated with the options for monitors. Not only are there few 32-inch options with good HDR, but I ended up with a lot of monitors with bad pixels, dust embedded in the panel, major uniformity issues, and other quality control issues. I’ve always been a little picky, but when I shopped for monitors a decade ago I didn’t run into this many quality issues.
April 29, 2023 at 7:06 pm #71434 4KGalaxy
Addition: To be clear – the quality issues I was running into were across multiple brands and often multiple units. It was very frustrating. I only tried one X32 FP and it was actually OK compared to some other monitors I tried (no dead pixels, decent uniformity), but the interlacing artifacts, flickering with local dimming, dark mid-to-low brightness HDR, OSD problems, and other issues were what broke the deal for me.
May 4, 2023 at 5:14 pm #71502 GroundReaper94
Exactly, the HDR brightness is very bad. I am sorry, but their handling of the EOTF curve is a joke. You’ve got dimming zones and a relatively bright panel – why not use them? Give us at least the option; I can live with a bit of blooming. My iPad Pro also has some blooming at high brightness levels, but it looks fantastic.
HDR is literally unusable with a PS5, and on PC you have to use numerous workarounds to fix it to a certain degree, which in turn destroys certain color shades.
It really looks like I’ll have to wait until the end of 2024 to get a proper 32 inch OLED monitor with 4K and proper brightness. I can live with the very few drawbacks of OLED.
Clearly, almost all monitor manufacturers do not have a clue when it comes to proper HDR. The only possible options now are two very expensive 32 inch monitors with very slow panels and a price tag from another dimension. Also, don’t forget the lack of HDMI 2.1 and the G-SYNC Ultimate fan.
Another option in the EU would be the Cooler Master GP27U. The thing is, I do not want another 27 inch monitor!
Just give me a proper monitor already with low input lag, good brightness, Mini LED or OLED, working HDR, somewhat functional 60Hz, a maximum refresh rate of at least 144Hz for the PC, HDMI 2.1 and a 4K resolution at 32 inches. I would easily spend 2000 or more euros for a proper monitor… ☹️
May 5, 2023 at 8:32 am #71507 Degrader
The XG321UG and PG32UQX are now cheaper; depending on where you live, you can buy them for around 2200 euros from Amazon. I have the XG321UG and you cannot hear its fan. You can also use DisplayPort, of course, for full support of 4K at 144Hz in RGB 4:4:4 format. For gaming consoles you can enable 120Hz via HDMI; the quality loss is only visible when sitting near the screen in games, but from a further distance it should not be a problem. The only real drawback, I think, is indeed the fairly slow response times. I don’t find them too distracting, but I have to say that I’m not very sensitive to ghosting, nor to the bit of overshoot when using the Advanced OD setting. I really like the very good color reproduction, PQ curve and dynamic look in HDR, which I find way better than on the X32 FP.
There is still no new firmware available; it’s taking a long time now. PCM2, I’m just curious: are you willing to wait for a potential update before publishing the review?
May 5, 2023 at 8:40 am #71512 PCM2
I won’t be changing my review schedule based on Acer’s firmware release schedule, no. I’ve already performed a lot of testing with the X32 FP using firmware 2.00.010 and won’t have time to ‘re-test’ too many things even if the firmware is released before the review is published. But I wouldn’t expect dramatic changes from upcoming firmware anyway, just things like unlocking ‘Over Drive’ with Adaptive-Sync. The effects of that can already be assessed by using HDMI 2.1 with an Nvidia GPU. Something interesting I’ve discovered with the current firmware is that the screen shows relatively low overshoot levels when you set it to 60Hz, even if VRR is enabled – but if it’s set higher and it’s running ~60Hz because of the frame rate under VRR, overshoot levels are significantly higher. That’s not useful to most people, unless they’re only running content at 60fps maximum with some drops below and want to use VRR, but still an interesting observation.
Speaking of observations, I agree with a lot of the observations you’ve made and various points made by others in this thread. I feel some of the decisions Acer has made here with the OSD and in some cases the tuning of the product are frustrating – some will really love the experience nonetheless, but for others it just won’t quite be what they’re after. I’ve certainly been getting quite mixed vibes in my testing so far, but it’s probably one of those monitors I’d recommend people give a chance and try for themselves. The review is still a few weeks away from being published but I think it will make for interesting reading or watching.
May 5, 2023 at 8:58 am #71513 Degrader
That’s strange! It looks like the monitor uses a different OD algorithm when set to a higher frequency? So the panel is able to show minimal overshoot at around 60Hz. Only Acer makes such strange decisions 🙂
May 5, 2023 at 2:18 pm #71515 sblantipodi
@PCM2 Acer hasn’t sent me the firmware yet, so I think we are out of time for the review, as you already said.
The issue you described with overshoot at 60Hz can be explained by the fact that this monitor does not have variable overdrive, can’t it? If you set the monitor to 160Hz, it always uses the 160Hz overdrive algorithm, whether it’s running at 60Hz or 160Hz; if you set the monitor to 60Hz, it probably uses another algorithm for the overdrive.
May 5, 2023 at 2:22 pm #71519 PCM2
You’re correct that the monitor lacks variable overdrive, so it doesn’t re-tune its pixel overdrive for lower refresh rates. But on most monitors that lack of re-tuning would apply to a static refresh rate as well. The pixel response behaviour is usually quite similar at a static 60Hz vs. 60Hz reached via VRR with the monitor set to a higher refresh rate, whereas in this case it’s significantly different – and that is unusual.
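To make the distinction concrete, here's a rough Python sketch of how overdrive lookup-table (LUT) selection differs with and without variable overdrive. Everything in it – the LUT descriptions, rates and selection logic – is invented for illustration and doesn't reflect Acer's actual firmware:

```python
# Rough sketch of overdrive LUT selection, purely illustrative.
# One overdrive tuning per refresh rate the scaler was calibrated for.
OD_LUTS = {
    60:  "gentle acceleration tuned for 16.7ms frame times",
    160: "aggressive acceleration tuned for 6.25ms frame times",
}

def select_lut_fixed(static_mode_hz: int, vrr_rate_hz: float) -> str:
    """No variable overdrive: the LUT follows the static mode only,
    regardless of the instantaneous frame rate under VRR."""
    return OD_LUTS[static_mode_hz]

def select_lut_variable(static_mode_hz: int, vrr_rate_hz: float) -> str:
    """Variable overdrive: the LUT tracks the instantaneous VRR rate."""
    nearest = min(OD_LUTS, key=lambda hz: abs(hz - vrr_rate_hz))
    return OD_LUTS[nearest]

# Monitor set to 160Hz, game running at ~60fps under VRR:
print(select_lut_fixed(160, 60.0))     # 160Hz tuning -> overshoot at 60fps
print(select_lut_variable(160, 60.0))  # 60Hz tuning  -> overshoot kept down
```

Going by the observation above, the X32 FP behaves like select_lut_fixed under VRR but does re-tune for a static 60Hz mode, whereas most monitors without variable overdrive keep the same tuning in both cases.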
May 5, 2023 at 2:25 pm #71520 sblantipodi
I think they tried to improve the tuning at 60Hz for the consoles, but lacking variable overdrive they aren’t able to do the same when the monitor is set to 160Hz.
I’ll write to them to report this issue. I think they need to answer for this 😀
May 5, 2023 at 4:46 pm #71522 GroundReaper94
Definitely a really good monitor so far, but it has got some quirks.
I really like to play a lot of shooters and a halfway decent response time is a must for me. I also need a great monitor for all my singleplayer games with all details on high combined with good HDR and Raytracing.
The thing is that there are almost no decent options with 4K, 32 inches and Mini LED/OLED.
The PG32UQX and XG321UG are great for HDR and singleplayer, but have got too much smearing (slow response times) for shooters.
The INNOCN 32 inch monitor also suffers from a really slow panel, the 27 inch panel is fine though.
The Neo G7/G8 have got great motion performance, but a rather lackluster HDR implementation, an aggressive curve, mediocre colors and scanlines.
The AW3423DWF is my current monitor in use and is a phenomenal piece of technology with perfect blacks, great response times and a really nice glossy panel with vivid colors. I only miss the 4K resolution here.
The X32 FP was almost my endgame monitor, because it delivered on almost all fronts, until a proper OLED with all the features of the X32 FP could replace it in the future. Only the broken HDR implementation scared me away, and I know that the panel can do a lot more in HDR. That makes it so sad. I just hope that Acer can at least retune the PQ curve a bit.
For testing, I ordered a Cooler Master GP27U. It is only 27 inches, but is otherwise almost identical to the X32 FP. So far the brightness in HDR looks very promising on the GP27U according to numerous reviews. If it can deliver a decent experience, I may stick with that monitor. Otherwise, I will have to choose between the AW3423DWF and the X32 FP.
The suffering we have to go through is really sad, but we will someday find our perfect monitor. I believe in it! 😁
May 5, 2023 at 4:47 pm #71523 sblantipodi
PCM2, can you give us a preview of your thoughts on the HDR performance compared to the 1152-zone monitors?
May 5, 2023 at 4:54 pm #71527 PCM2
Yeah, the overall HDR implementation (aside from bright-scene performance where bright shades dominate) is very solid indeed on the AW3423DW, and it’s the model I use as one of my main monitors when not reviewing, or for reference when I am. I also much prefer a higher pixel density, however.
As for the X32 FP, I agree they could’ve fine-tuned things better even with the relatively low dimming zone density it has. If they went for the level of bright biasing of the PG32UQX and XG321UG, with half the dimming zone density, I don’t doubt the result would’ve been rather messy. Somewhere between those models and what they went for with the X32 FP would’ve been good in my view, or at least an option to ease up on the dark biasing. Still, it depends what you’re after from your HDR experience, and I view what the Acer offers as vastly superior to most LCDs under HDR (even if it’s still compromised in some ways). Anyway, that’s the last I’ll say on this – I’ll save anything further for the review. 😉
May 5, 2023 at 6:16 pm #71528 spiroh
The panel is really good even without the dimming zones. However, the Acer implementation is bad.
May 5, 2023 at 6:20 pm #71530 PCM2
Yet unfortunately for those seeking alternatives, the Acer implementation is the only one that exists. It’s around half the cost of the PG32UQX (based on US pricing), for example, and there is no model that compares to it apples to apples on price and performance. Some people will (and I know from this thread and direct email communication, some do) enjoy the performance the X32 FP offers, even with an implementation that others will consider too compromised. All monitors are compromised in one way or another; it’s about picking which compromises you’re happy to live with.
May 5, 2023 at 6:50 pm #71531 spiroh
I agree there aren’t many alternatives, and I am enjoying the performance of the monitor, which is why I kept it, but it does have its issues. It’s a matter of whether you can live with them or not. I have the dimming function disabled because I do not like how it looks when not gaming (which is more a technology thing rather than an Acer one), but I still enjoy it for gaming and productivity.
May 8, 2023 at 11:44 am #71551 PCM2
Testing HDR on the X32 FP using both my AMD and Nvidia GPUs has revealed some significant differences. The AMD GPU bright biases a lot more – actually quite a bit too much in my view, as darker shades and some medium shades lack appropriate depth. There’s still a huge benefit compared to disabling ‘Adaptive-Dimming’, but I’m much more aware of the limited number of dimming zones and ‘haloing’ is more common (though not extreme in most scenes). On the flip side this means some scenes look more lively, and you get less of the dragging down of some shades which can give a duller than intended look on the Nvidia side. If I enable ‘AMD FreeSync Premium Pro’ it raises the black level significantly and gives a clearly washed out look to many scenes under HDR. ‘Adaptive-Dimming’ under SDR is exactly the same on the AMD and Nvidia side.
Now here’s my issue: I’m using an AMD Radeon RX 580, which is a dated GPU. It doesn’t support DSC, so it can’t fully support the monitor. It’s certainly possible that what I’m observing with ‘FreeSync Premium Pro’ enabled is just the GPU messing everything up, or it could be that the AMD FreeSync Premium Pro pipeline is messing things up on this monitor. So, AMD users with the monitor, here’s where you come in. Set the monitor up as you normally would under HDR (making sure it’s running an appropriate HDR preset, with ‘AMD FreeSync Premium Pro’ active in the OSD, ‘Adaptive-Dimming’ at your preferred level for HDR and HDR on in Windows). Then turn the monitor off and on again so it re-establishes the signal, and toggle ‘AMD FreeSync Premium Pro’ on and off in the graphics driver. Do you notice a difference in how it represents things? This should be apparent even in the SDR environment of the desktop, so as soon as you toggle the setting on or off. If not, do you notice any difference observing the black background of this HDR brightness test (running in HDR on Edge or Chrome)?
Edit: With FreeSync Premium Pro disabled on my AMD GPU, it’s very similar to the Nvidia GPU. So it’s just actual ‘FreeSync Premium Pro’ (i.e. VRR active on AMD GPU) that messes things up.
May 8, 2023 at 12:24 pm #71552 GroundReaper94
I read somewhere that disabling DSC on the monitor (which the RX 580 basically forces) can improve HDR performance, but that would be a really bad sacrifice for a monitor this expensive.
Maybe there is more to it and AMD handles things differently – perhaps like the HDR1000 mode on the AW3423DWF, where the tone mapping is handled differently on AMD and NVIDIA GPUs. Would be really nice to know. Thanks for the information @PCM2