Ultrawide 34-49 inch for gaming and productivity

Viewing 20 posts - 1 through 20 (of 28 total)

Buying a monitor? Please refer to this post before purchasing.


    Hi guys,

    I have tried to research this for days now and I’m probably just making things more complicated than I need to.

    Basically, daytime is for productivity – coding, PS, websites etc. – and night time is for fun, so gaming it is!

    I have an Alienware m15 (RTX 2070 Super, 32GB RAM, 1TB) and want to replace my desktop with this machine, running at least 2 displays. They have to run at a decent refresh rate even in PBP mode, which I know is apparently throttled to 60Hz on the new 49 inch monitor, for example.

    Some examples I’ve looked at are:

    27 – VG27AQL1A
    would be nice since it’s 10-bit, but I can’t find any to buy. I’d buy 2 screens if I did it this way: 1 for gaming, 1 for work…

    34 – LG 34GN850

    49 – C49G95T
    True 10-bit SVA (not IPS)

    The other thing was 10-bit panels – maybe worth trying to get? Or, at worst, with the fake 10-bit panels can you at least turn off the 2-bit temporal dithering and run them as 8-bit? My aim is to reduce eye strain, and so far I know that IPS and high refresh rates help. I’m not so sure about the temporal dithering, but I read it does have a small impact. I’m also not sure whether I’m missing details regarding brightness or the type of backlighting.

    The ideal would be true 10-bit / IPS / 144Hz+ / 100% sRGB / 4K / HDR 1000, but it’s looking unrealistic. The G9 does tick a few of those boxes, but I’m worried it’s too big and I won’t get used to it, or that for gaming it’s just crazy. The 21:9 aspect ratio looks fantastic – thanks for the research you’ve provided us, guys. But I’m all ears, not set in stone on any size yet.

    For gaming, is a response time of <2ms important to avoid motion blur?

    thanks in advance


    Hi wipe0wt and welcome,

    Good topic, but as with many things in the world of monitors the decision is largely subjective. I personally prefer having a single large screen which I can use for all necessary tasks. I don’t like switching between screens for gaming and productivity, although for multi-tasking multiple monitors can be more efficient and simpler. If you’ve got a multi-display setup I find it annoying if the monitor I’m not using for the main task is central in my field of vision (especially for gaming). That’s just my subjective take on things, not everyone shares those views.

    ~49″ VA thoughts and ’10-bit’:

    Regarding ~49″ VA models, I’m not personally a huge fan of such large models with that panel type. This panel type is prone to gamma and saturation shifts when comparing the centre of the screen to regions closer to the edges. With a screen as expansive as ~49″ 32:9 (like 2 x 27″ 16:9 models sandwiched together) these shifts are significant and noticeable. The 1000R curve can only go so far to offset these on a model like the C49G95T.

    I’m not sure where you got the idea this was a ‘true 10-bit’ panel from. I haven’t seen anything substantiating that, and it isn’t specified in that way. Just because dithering or FRC isn’t mentioned by a manufacturer or other resources, don’t assume it isn’t used – it’s not usually mentioned, especially not by Samsung. The C49RG90 was an 8-bit + FRC model. There are relatively few ‘true 10-bit’ models out there, and that’s because there’s little incentive for the manufacturers to make them: the consumer doesn’t really benefit much, if at all. From the perspective of viewing comfort, it would be very low down my list of considerations. The dithering is handled very finely indeed, and for most content you’d be using an 8-bit signal anyway (which shuts off most of the dithering stage). In practice it doesn’t completely shut it off, but what remains is nothing of concern. The tiny brightness fluctuations that occur for some shades are not to be confused with the full-on cycles of PWM for brightness regulation.
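    To illustrate what that dithering stage is actually doing, here’s a minimal sketch (illustrative only – the real panel logic, frame grouping and frame counts are assumptions) of how 8-bit + FRC approximates a 10-bit shade by alternating adjacent 8-bit levels across frames:

```python
# Illustrative sketch of 8-bit + FRC ('temporal dithering'): a 10-bit shade
# is approximated by alternating between two adjacent 8-bit levels, so the
# temporal average lands on the intended in-between value.

def frc_frames(level_10bit, n_frames=4):
    """Map a 10-bit level (0-1023) onto n_frames of 8-bit output (0-255)."""
    base, frac = divmod(level_10bit, 4)        # 4 ten-bit steps per 8-bit step
    # Show the higher 8-bit level on 'frac' out of every 4 frames
    return [min(base + (1 if i < frac else 0), 255) for i in range(n_frames)]

frames = frc_frames(601)                       # 601 = 150*4 + 1
print(frames)                                  # [151, 150, 150, 150]
print(sum(frames) / len(frames) * 4)           # 601.0 - average matches target
```

    The ‘tiny brightness fluctuations’ mentioned above are the per-frame difference between those adjacent levels – one 8-bit step, not the full on/off swing of PWM.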

    34″ IPS option:
    I’m more of a fan of the 34″ IPS-type models in comparison. We do currently recommend the LG 34GN850, incidentally, and are chasing LG for a sample. They seem a bit overwhelmed with the current situation and unable to supply one, so I may consider other avenues to acquire one. It’s a very interesting model, I feel, and probably the best all-round UltraWide you’ll get for that sort of price. Even though I have no hands-on experience yet, I do gather considerable user and review feedback and research such models very heavily. And I feel it’s a nice upgrade from the previous generation, as covered in this comparison thread. The richness and saturation levels are properly maintained throughout the screen, and I feel 34″ 3440 x 1440 is a nice resolution for gaming – it’s not super-difficult to drive at high frame rates. And I personally feel it gives you a decent enough workspace for productivity purposes on the desktop. It’s now relatively widely supported in games and works nicely for some video content as well. 32:9 is more ‘niche’ in comparison.

    Additional thoughts on C49G95T:
    A few points about the Samsung Odyssey G9 model which you may or may not be aware of. It has been more widely tested in Asia than elsewhere due to availability, so this is based on feedback from non-English sources:

    – An (apparently common) complaint is that the bottom bezel seems to ‘peel away’ to reveal light leakage in this area (one example – requires translation, there are others). Samsung reps apparently suggested this was due to the strong curve of the monitor and can’t be avoided. Doesn’t seem very reassuring and I hope this isn’t a widespread issue.

    – Some image retention has been observed, just from normal desktop work. Not sure if a very high brightness was used or the unit was faulty; hopefully not a widespread concern, but worth keeping an eye on.

    – ‘G-SYNC Compatible Mode’ (presumably this also applies to FreeSync) can only be used in 32:9 full screen mode; it isn’t possible to use it at the same time as 21:9 or 16:9 modes.

    – 21:9 resolutions apparently display with black borders at the top and bottom as well as the sides, not just at the sides as you might expect (i.e. going from 5120 x 1440 to 3440 x 1440 should only require black borders at the sides; vertically it should ideally be untouched).

    – There are 10 dimming zones. It’s nice having local dimming available, but this is a very limited number of zones for such a large panel. Same as the C49RG90. Couple this with the high brightness levels under HDR in particular and you can forget atmospheric dark scenes. But there are still some situational contrast advantages – the LG has no local dimming, in comparison.

    – Contrast is sacrificed for responsiveness. Pixel responsiveness appears to be very impressive, especially for the panel type, but contrast has taken a hit. I’ve seen some measurements <2000:1 – not terrible, but eating away at a key VA advantage. And it seems the specified 2500:1 might’ve been a bit ambitious.

    – Good luck getting the frame rates to make good use of 240Hz @ 5120 x 1440. This monitor looks more like a box-ticking exercise than most out there – some people will love it for the feeling of being ‘future proof’, but in reality there’s no such thing when it comes to monitors.

    Final thoughts, and specified response times:
    I’m not trying to ‘rubbish’ the product, especially as I have no experience with it myself. But I need to make sure you and others reading this look past the specifications on paper. The 34GN850 is just a more polished product, in my opinion. Speaking of on-paper specifications and looking past them, that doesn’t apply to anything as much as it does to specified response times. They are completely misleading and you should pay little attention to them – we always reinforce this in both our news pieces and our reviews. To get a good idea of how a monitor performs you really need to rely on reviews like ours; you’ll often see there’s limited correlation between specified response time and actual performance. It’s also important to appreciate the concept of ‘perceived blur’, which we explore in our reviews and in detail in our responsiveness article. Response time is only a minor contributor to this on modern sample-and-hold monitors, as we explore. So even if the figures specified by manufacturers were accurate (which they almost invariably aren’t), they wouldn’t necessarily make the difference you might expect.


    Wow, I’m truly blown away by this!! Thanks a million times over, fascinating depth and source material (which I read).

    I’ll just consider the 34GN850 out of those above although getting hold of one of those isn’t easy right now!

    Thanks for the note regarding FRC – I didn’t realise the G9 49″ wasn’t a true 10-bit panel, given the DS site was advising it….

    Just before we’re all set on this one: I made this little comparison with the slightly larger versions vs the 34GN850. Would you consider any of the slightly larger models, or do these higher resolutions make little practical sense? Thanks again.

    Edit: If none of the larger ones are any good, the only fallback I could see similar to the LG 34 was the Dell Alienware AW3418DW.


    When DisplaySpecifications doesn’t mention FRC specifically, it doesn’t mean the monitor doesn’t feature it – simply that they couldn’t verify it definitely exists (usually because the manufacturer hasn’t specified it, or the panel is relatively new and they don’t have access to its spec sheet). 8-bit + 2-bit FRC is still ‘10-bit’, so it isn’t incorrect to call it that without mentioning FRC.

    I don’t have any specific feedback on the 38WN95C to share. The 38GL950G offers strong pixel responsiveness, similar to the 34GN850, but contrast is somewhat weaker in comparison. I think the main thing to focus on would indeed be the size and resolution. It will appeal to some, especially for productivity. For games it’s quite widely supported, usually games that run properly at 3440 x 1440 will run properly at 3840 x 1600. The resolution is somewhat more demanding, so there’s that to consider.

    The Dell Alienware AW3418DW is two panel generations behind the LG 34GN850. The pixel responsiveness isn’t as good, although it has a G-SYNC module which means it performs relatively well at low refresh rates with variable overdrive (if your frame rate dips a lot you don’t get extra overshoot). It’s limited to 120Hz and uses an overclocked 100Hz panel – there are some issues with the overclock such as ‘dynamic interlace pattern artifacts’ (reference explaining what this means). It has a significantly narrower colour gamut. A better comparison with the 34GN850 would be the Dell Alienware AW3420DW or LG 34GK950F – both are one generation behind it. So they share the wide colour gamut, for more vibrant and saturated output. But pixel responsiveness isn’t as good for the higher refresh rates and you don’t get the 160Hz capability which is a nice little bonus. If you can drive your frame rate that high! All three models are mentioned and briefly compared in this thread, it focuses mainly on the two LG models.


    Just on the extra pixel demands: with the 1600 resolution, if it proved a little much, could the display resolution be lowered to 1440 with some borders? That said, I’m outputting to my LG55 at 2560 x 1440 without an issue. Daft question maybe, but couldn’t you turn the 49 inch into a 38 inch or less and just have borders? I say this because the HDR seems much better on this monitor than the others – maybe that’s worthy of consideration?

    I like the idea of the 38 inch as it’s the spot between the 49 inch behemoth and the 34, but not at the risk of losing quality. The most important consideration now is availability, and finally owner reviews, which incidentally Amazon will have – that fits nicely with the last part being to check via Amazon, given your affiliation:

    short’er’ list
    34GN850 – £1199, currently only available from 3rd party sellers who have slapped a £200 margin on top.

    Dell Alienware AW3420DW – £999, very favourable reviews, quite popular model.

    LG 34GK950G (not an F – not sure of the difference) – £1799. Good reviews, but at this price an overclocked 120Hz isn’t on my list.

    The 38GL950G has the G-SYNC module, but it’s £1799!

    I think it’s between the Dell Alienware AW3420DW and the LG 34GN850. The AW would be an amazing aesthetic match, as my laptop is AW and also Lunar Light – but that’s not a major thing. The fact the AW has a G-SYNC module for around £1000 seems good value, but it’s 120Hz. The ports are older, but it’s 120Hz so fine, I guess. So far I’ve output to my LG55 on ultra settings, hitting over 120fps on games like SOTR; Control with RTX was more demanding and ranged between 60-100fps on ultra settings, and the G-SYNC via HDMI 2.0 was buttery smooth. I wonder how this can be improved upon via a module……

    Just thinking about knowing whether it’s 8 or 10-bit: I’d imagine if it has an HDR mode then it would need to be 10-bit? Like this comparison shows – 2x 8-bit and 2x 10-bit, and the 10-bit ones have HDR (probably not an exact science or totally accurate!)

    So via Amazon the LG 34 is £200 currently – which one gets your vote?


    I don’t really think you can go wrong with either the AW3420DW or 34GN850. As mentioned in my previous post, the LG is more responsive at high refresh rates and goes up to 160Hz, but the Dell Alienware has a G-SYNC module which ramps down the pixel overdrive properly for lower refresh rates. With the LG you get some moderate overshoot at reduced refresh rates (<100fps if Adaptive-Sync is being used), whereas with the Dell the overshoot is weaker. But the pixel responses are a fair bit faster for some transitions on the LG, even at lower refresh rates. Whether or not you’ll appreciate the difference in either respect is difficult to say, but some would. I am a fan of the Dell Alienware aesthetics as well, and I appreciate it’s nice to have such a close match with your system.

    The LG 34GK950G has a G-SYNC module, hence the ‘G’. But the quality control has proven itself to be atrocious and it isn’t widely sold any more for that reason. The ‘F’ model is hardly brilliant in that respect either, early reports from the 34GN850 are far more promising. You could run the Samsung C49G95T at a non-native resolution with borders – I covered some potential issues there in my previous post as it doesn’t work as you might expect. Extra borders at the top and bottom and lack of ‘G-SYNC Compatible Mode’ for example. Speaking of G-SYNC, I’m confused by your comment: “g-sync via hdmi 2.0 was buttery smooth”. G-SYNC only works via DisplayPort.

    The HDR10 pipeline, used on all monitors being considered, makes use of 10-bit colour processing. With monitors that are ’10-bit’ (including 8-bit + FRC) the monitor itself does all of the necessary ‘processing’ for that, including any dithering stage. For monitors that are 8-bit and support HDR, the dithering stage is offloaded to the GPU. Some models, including a few you’re looking at, will actually offload the dithering stage to the GPU at the higher refresh rates they support, because there simply isn’t the bandwidth to supply a full 10-bit signal monitor-side at that resolution and refresh rate. As we’ve covered in several reviews of models that do this sort of thing (example), it makes very little difference to the end result whether the monitor or GPU is handling this under HDR. The far more important considerations for HDR are whether the colour gamut is suitable (good coverage of DCI-P3 is desirable as a short-term goal) and whether it offers a contrast bonus in HDR such as local dimming, plus the maximum luminance supported. An additional consideration would be how the local dimming is implemented – for example, 10 dimming zones on a 49″ screen is pretty limiting, per my first reply in the thread. But still better than none! The fact it’s VA helps, as it has natively stronger contrast as well. But if you compare it to something like the Acer X35, the HDR experience is still rather limited.


    It’s not often I’m able to give some new info to a master of displays, but here you go 🙂 : https://www.nvidia.com/en-gb/geforce/news/g-sync-compatible-2019-lg-tv-available-now/ I know it works as I’ve used it loads. I’m only using HDMI 2.0 and the G-SYNC options are all there outputting from my laptop. Maybe it reduces the chroma, but it’s 100% working very nicely.

    So on the 34GN850, it’s like the dithering will turn on at the higher refresh rates, say 144Hz+, but not at say 75Hz… The reason I’m concerned about this is that I met a chap in a similar situation who has also suffered eye strain, and he swears blind that once he got an 8-bit panel without any FRC (to his knowledge) the pain went away. It’s a long shot, but it would be an amazing bonus if I could reduce eye strain, given how many hours I have to work sometimes (I run my own website projects, so sometimes 16hrs on bad days).

    LG C9 Bit levels

    Out of curiosity I just spent the last 30 mins testing different bit depths when outputting to the LG C9 and found out some interesting stuff! The settings will always tell you if it’s 8-bit + dithering. I found that regarding HDR, I can only get high bit depth and a good resolution like 2560 x 1440 if I reduce the colour to 4:2:2/limited – but this yields 12-bit, interestingly… I can also select 10-bit. Same result for 4K also.

    The only full RGB setting I can get is if I pick 4K/30Hz or 1080p/120Hz, but not 1440p. I didn’t try 60Hz as I prefer the higher frames.

    For standard SDR settings outputting to the LG C9, 4K/60Hz @ 8-bit full RGB was fine. 1440p/120Hz @ 8-bit full RGB was fine also, as was 4K/60Hz – no dithering.

    Although I don’t fully understand FRC, I’d prefer to be able to set 8-bit for daytime and 10-bit for gaming with the LG 34. My suspicion would be that once you click HDR mode it switches to dithering, but SDR would be fixed at 8-bit. I could be wrong, though, and then I’m not sure if there’s an issue with the flickering of colours.

    I’ll stick to your recommendation. I’m trying to keep to Amazon; if I can’t buy via an affiliate link I’ll make a donation either way, as thanks for your endeavours in helping me. I have taken a series of photos of the results also, if you ever wish to see them.


    Haha. Well, I may be a master of monitors but when it comes to TVs I can fall a bit behind with things like this. 😉 Thanks for posting the link and re-jogging my memory – absolutely right! No monitor currently supports G-SYNC (‘Compatible Mode’ or with module) via HDMI, but for certain TVs Nvidia will make specific exceptions because they don’t have DisplayPort as an option.

    I think my explanation regarding dithering was a bit pedantic and probably caused some confusion and unnecessary concern, so it’s worth clarifying. With a monitor like the 34GN850, the default behaviour for Nvidia GPUs will be to use an 8-bit signal under SDR. The monitor doesn’t completely shut off its dithering stage, but it very nearly does. A very small amount of dithering with tiny brightness fluctuations may occur for some shades – but much less than would occur if you were feeding the display a 10-bit signal (under HDR, for example) and the driver was pumping out the full 10 bits by whatever means. It’s at a level that is very difficult to detect and is extremely unlikely to cause any eye strain or accelerate eye fatigue by itself. The individual you met will have been comparing two displays which differed in more ways than just the use of dithering, so there are many potential factors affecting his eye strain on one monitor but not the other.

    I should also clarify something about the Odyssey G9. The observation I shared about its inability to correctly display 21:9 resolutions or use ‘G-SYNC Compatible Mode’ at the same time was not entirely clear. And I have no way of getting clarification at this stage. But I think it must’ve been referring to how the monitor behaves when it uses its own scaling process. I would assume you could use GPU scaling instead and I don’t see why that wouldn’t work. Although I seem to recall it doesn’t always behave as you might expect when using ‘G-SYNC Compatible Mode’ and sometimes likes to fill the entire screen no matter what you select. This was from the early days of the technology, though, I suspect this might’ve been addressed with newer Nvidia drivers. It isn’t something I’ve tested out myself as I’ve had no need to.

    And last but not least, I appreciate you wanting to support the website. 🙂


    So I’ve ordered it for £925. I couldn’t do it via your links sadly, but I have just made a donation for the great work you’re doing – hopefully that’s okay. It was from Germany, but it’s £200 cheaper than the resellers in the UK, and Amazon UK is out of stock; thankfully the Germans are better equipped! Seems nothing else is hitting the UK now for another 3 weeks. Just as a note, G9 stock also seems a few weeks away, but it’s soon. I also ordered a standing desk, which will be great for posture issues (apparently we should only sit for 3hrs a day!).

    I stand corrected – you did say no monitor can do G-SYNC via HDMI :D. Although honestly I see my TV as a monitor first and TV second, simply because it’s better than any monitor I can see in this price bracket. It baffles me how an OLED Dolby Vision (true? 10-12 bit) panel hasn’t been adapted into a computer monitor… The LG C9 cost me £1150 brand new, and weirdly I’m receiving something that feels inferior for not much difference in price. Albeit the 21:9 aspect will offer something new, and I could never work from my TV; it’s nice I’ve got the option of full HDR gaming or the 21:9. I imagine games like Battlefield would be amazing on that kind of aspect ratio and more immersive.

    On the topic of HDR and bit depth, the monitor was summed up nicely by LG here, and you earlier made a similar suggestion regarding chroma:

    3440 x 1440 10-bit (RGB444) at 144Hz,
    3440 x 1440 8-bit (RGB444) at 160Hz (O/C),
    3440 x 1440 10-bit (YUV422) at 160Hz (O/C)

    What works out to be the best picture, or best performing picture, out of these – 8-bit full RGB or 10-bit reduced chroma? Interesting to know about the possibly tiny concern about dithering. I can always test the conditions at the end of the day, and at worst, if it does engage more at 10-bit, well that’s fine too, as it’s not the full working day – so that’s something.
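    Those three modes line up with a back-of-envelope bandwidth check against DisplayPort 1.4 – a rough sketch only, ignoring blanking intervals and link overhead (the ~25.92 Gbit/s payload figure for HBR3 is the standard one; everything else is simple arithmetic):

```python
# Rough data-rate check for the three 34GN850 signal modes quoted above.
# Blanking intervals are ignored, so real requirements sit somewhat higher;
# DisplayPort 1.4 (HBR3) carries roughly 25.92 Gbit/s of payload.

DP14_PAYLOAD_GBPS = 25.92

def gbps(w, h, hz, bits_per_channel, channels=3.0):
    """Raw pixel data rate in Gbit/s (no blanking/overhead)."""
    return w * h * hz * bits_per_channel * channels / 1e9

modes = {
    "10-bit RGB 4:4:4 @ 144Hz": gbps(3440, 1440, 144, 10),       # ~21.4
    "8-bit  RGB 4:4:4 @ 160Hz": gbps(3440, 1440, 160, 8),        # ~19.0
    # 4:2:2 halves the two chroma channels -> ~2 channels on average
    "10-bit YUV 4:2:2 @ 160Hz": gbps(3440, 1440, 160, 10, 2.0),  # ~15.9
    "10-bit RGB 4:4:4 @ 160Hz": gbps(3440, 1440, 160, 10),       # ~23.8
}

for name, g in modes.items():
    print(f"{name}: {g:5.1f} Gbit/s")
```

    The first three fit comfortably; the last (10-bit RGB at the 160Hz overclock) needs ~23.8 Gbit/s raw, and once blanking overhead is added it would push past the link’s payload – presumably why that combination isn’t offered.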

    Just something else I noticed: I read you had tested HDR in a couple of games, one being Shadow of the Tomb Raider (an epic experience imo!). I had to turn HDR on at the desktop as it wasn’t available as an in-game option; when I did this, however, my frames dropped by 50%. I was working at 1440p and it was probably full RGB – I wonder if changing the chroma can increase the frames but still have 10+ bits working? Did you discover anything, and were your frames impacted as badly? When I flip the setting in other games that have in-game HDR as an option, I hardly drop any frames but the image is so much better. My goodness, even the desktop is a thing of beauty when HDR is switched on – SDR totally pales in comparison. Although I’m a noob to all this and haven’t yet calibrated anything (I don’t know how). I don’t even know how to make the most of scaling from the GPU and various things I should know with the tools at my disposal! 😀

    Your input is great – I can see the science brain in all of it. It’s not something I always understand first time, but it’s there and I’ll surely glance over it repeatedly as my comprehension grows and refer back; it’s all quite fascinating really.


    It isn’t enabling HDR in Windows that would drop the frames. But rather, some titles (including Shadow of the Tomb Raider) use additional processing (and a few extra effects and filters) at the same time under HDR.

    Regarding the best signal type to use: as I said in an earlier reply, it makes very little practical difference whether the GPU or monitor is handling the dithering stage. I also linked to a review example where this was explained. Chroma subsampling has little to no visual effect in game titles, except in some with slim text, which can become fringed. There’s also a performance impact related to using HDR and reduced chroma in most titles. Refer to our review of a model which uses various different signal types depending on refresh rate and relies on chroma subsampling at maximum refresh rate. It’s best to just see for yourself how similar all of the signal types are under HDR, visually, and not overthink things.
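    As a rough illustration of why slim coloured text is the main casualty, here’s a toy 4:2:2 sketch (illustrative values, not a real encoder): full-resolution luma is kept per pixel, while each horizontal pair of pixels shares averaged chroma:

```python
# Minimal sketch of 4:2:2 chroma subsampling: luma (Y) is kept per pixel,
# while the chroma pair (Cb, Cr) is averaged across each horizontal pair
# of pixels. Fine colour detail (e.g. slim coloured text) loses sharpness.

def subsample_422(row):
    """row: list of (Y, Cb, Cr) pixels -> reconstructed row after 4:2:2."""
    out = []
    for i in range(0, len(row), 2):
        pair = row[i:i + 2]
        cb = sum(p[1] for p in pair) / len(pair)   # shared chroma
        cr = sum(p[2] for p in pair) / len(pair)
        out.extend((p[0], cb, cr) for p in pair)   # per-pixel luma kept
    return out

# A sharp colour edge: the chroma gets smeared across the pixel pair,
# while the luma values survive untouched.
row = [(100, 90, 240), (100, 240, 110)]
print(subsample_422(row))    # [(100, 165.0, 175.0), (100, 165.0, 175.0)]
```

    Because brightness detail carries most of the perceived sharpness, typical game imagery survives this well – it’s high-contrast colour edges, like thin text, that show the fringing.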

    I appreciate the donation and I hope you enjoy the 34GN850. I look forward to your thoughts when you’ve had a play with it. Just don’t expect the HDR to be anywhere near as spectacular as on your TV.


    Yeah, I’ll have a play. Not sure it’s easy to spot a difference without a side-by-side comparison. Tbh my standards aren’t elite level – I’m blown away by everything, mainly the smoothness, but then I grew up on 8-bit games back in the day! I’ll update when it comes, cheers.


    Interestingly I’ve no ETA from Amazon.de, but in the meantime it looks like the 38GL950G has just dropped in price to £1499 – although it may have been at this price point before.

    I paid £925 for the 34GN850 – do you think the 950G is actually better overall? The TH site rated it very highly, although user reviews are a mixed bag here on Amazon. Reported issues like noisy fans from the active cooling (the TH site states LG advised there are no fans?), G-SYNC creating issues until it warms up, and heat seem the main culprits, and the actual score isn’t great – although it’s just 18 reviews. But it is readily available with the price cut.

    Just interested to hear your thoughts, as the jury still seems to be out on this one. I would link the other site’s glowing review but I’m not sure if that’s permissible – you may have already seen it.


    I’ve already covered this – it entirely depends on your own personal preferences for size and resolution. That, plus marginally weaker contrast on the larger model, plus the G-SYNC module (lower overshoot at low refresh rates). It’s up to you whether you like that sort of experience enough to warrant the extra price. I’m not sure it’s really worth paying ~1.5x the price of the 34GN850, but you might think differently.



    OK, so I decided to get the LG 38GL950G in the end. I just love the fact that it’s 3840 x 1600 / IPS / 160Hz (can overclock to 175Hz) / 24:10 / G-SYNC capable (proper module, not software), and the G9 or the GN is lacking some of these features. Turns out it was a good call, as I’m running the 1600 resolution without any issues frame-wise, so I now have a little bit of future proofing for the extra £ I had to spend. Speaking of which, Amazon sent me the wrong lead and then refunded me £200 to buy a new one, which cost £30! So the end price was £1330….

    I no longer experience the tearing in games I used to get on the LG C9, even though I’m now running at a higher resolution, thanks to the G-SYNC hardware. The fan issue isn’t an issue – I can’t hear it above my laptop fans at all, and I have to put my head right up to the monitor to hear even the faintest noise.

    The control software that handles the RGB lighting requires a ghastly USB A-to-B lead which always has to be plugged in, and it boots on startup and needs setting each time – which is a bit annoying, but not so annoying it’s a problem.

    HDR is weird. Comparing to my LG C9: if I switch HDR on there it really comes to life, and SDR on the desktop is lifeless/washed out. It’s the reverse with this monitor – if I select HDR it actually looks washed out. So that’s a result actually, as after calibrating, the monitor is very sharp and bright as it is. I downloaded a profile to adjust things, although I’m not sure about it.

    I normally use software called Iris, which allows you to turn the monitor brightness to full and then reduce the brightness in the software without lowering the refresh rate, but it’s been causing the monitor to go to sleep oddly, so I’ve lost that. Instead I’m now at 75% brightness and have turned off blue light, which seems to help the eyes a little. I just wanted to ask: with IPS tech, does the refresh lower as you decrease the brightness, like what happens with PWM monitors?

    Also wondering: in games like Street Fighter V etc., I can’t get my full aspect ratio, so I end up with borders. I’ve noticed there are settings in the Nvidia panel for this, but I’ve tried to use them to override the game settings and nothing changed. There are also options for using the monitor or GPU to scale – maybe you have a link to these settings somewhere? It would be nice to experience the full width in these games – amazing, actually.

    My next step is to get a wall mount from Germany and also an under-desk treadmill so I can get 10k steps done a day whilst working. My desk is adjustable, so I don’t plan to sit around all day like a lump – been there and done that for 20 years, resulting in posture issues!


    Excellent – really glad you’re enjoying the LG 38GL950G and I appreciate your feedback. It’s certainly a beastly monitor and it sounds like you’re making very good use out of it! Also glad the fan hasn’t been a problem for you. I think a lot of concerns regarding fans in monitors are overblown (pun intended). Seems some are louder than others and it’s yet another ‘quality control’ issue to look out for. I think people have enough of them to contend with on monitors.

    The thing to be aware of with monitors that have a G-SYNC module is that they only support monitor-side scaling via HDMI. And obviously that bypasses the G-SYNC module and greatly limits the monitor capability, it’s only designed as a secondary use with games consoles etc. That’s why you can only see ‘Perform scaling on: GPU’ listed without ‘Monitor’ being selectable in the drop-down. You’re limited to whatever scaling support the GPU will offer, which can be quite limiting. As far as I know Street Fighter V doesn’t provide ‘UltraWide’ support of any sort natively, so how are you wanting/expecting it to be displayed? You’ll either get black borders or a stretched image depending on settings you’ve selected in Nvidia Control Panel.

    Love the Baby Yoda prop, by the way. 😀


    Wow, never knew that. I did buy a Thunderbolt 3 cable, but I’m having issues with it: even though it registers the monitor, there is no display – almost like the monitor can’t read the laptop even though the laptop reads the monitor. I’d read G-SYNC can pass through this cable, although I can’t find the blog any more… That said, the settings did show G-SYNC etc. The cable was a returned Amazon Warehouse one, so there’s a chance it’s faulty. I bought a Starlink one at half price with a 3-year guarantee, so I might be able to swing a replacement.

    As for SFV and others, I’ve seen it done in forums like this, and after trying MKII, which accommodates ratios amazingly well, I think it’s worth a bit of effort to play this way.

    Could I pick your brain about IPS refresh if you turn the brightness down? The brightness is a bit much for me, but lowering refresh is worse than 100% brightness. I did reduce the red colour, which seemed to help a bit.

    Yoda says thanks 🙂


    Ah yes, sorry I missed that question before. As with all G-SYNC monitors, the 38GL950G has a flicker-free backlight which uses DC dimming. The brightness regulation is not linked to the refresh rate (possible minor deviation aside). You should also use the monitor’s brightness control rather than a digital adjustment, which is what the software does. You lose contrast by making such adjustments, because your black point remains exactly the same whilst your whites (and other light shades) become dimmer. If it works for you in terms of viewing comfort, by all means go for it, but it’s better to use the brightness control of the monitor and try to get used to that. You may find lowering contrast on the monitor helpful if you like the sort of adjustments the software is making – one of which is to reduce contrast, which some would find more comfortable on the eyes. But you need to give your eyes time to adjust to a new monitor, especially one so dramatically different to what you were using before.
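    A quick numeric sketch of that contrast point (the luminance figures here are invented purely for illustration):

```python
# Why digital (software) dimming erodes contrast while the monitor's own
# backlight control doesn't. Figures are made up for illustration:
# ~400 nits white, 0.5 nits black, i.e. an 800:1 static contrast ratio.

white, black = 400.0, 0.5

def digital_dim_ratio(factor):
    # Software dimming scales pixel values: whites drop, but the black
    # point (backlight leakage) stays exactly where it was.
    return (white * factor) / black

def backlight_dim_ratio(factor):
    # Backlight dimming scales whites AND blacks together: ratio preserved.
    return (white * factor) / (black * factor)

print(backlight_dim_ratio(0.5))   # 800.0 -> contrast ratio unchanged
print(digital_dim_ratio(0.5))     # 400.0 -> contrast ratio halved
```

    That’s the whole trade-off: software dimming at 50% halves the contrast ratio, while the monitor’s own brightness control dims the backlight itself, lowering blacks along with whites.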


    Great, I’ll have a go with that, thank you very much :). Might just have to bite the bullet on altering the FOV (not sure if that’s the right way to say it) in fighting games, as it all seems a bit hackish rather than settings-based…… I did just have a rather crazy idea of playing some arcade games on this, though – can you imagine turning the thing 90 degrees and having a full vertical screen for some of those games! 😀

    On another note, I’m really pleased that I can achieve an ultra-settings gaming experience in most games. The best one so far has been Wreckfest, of all games – given it’s the busiest game I’ve probably got, I’m blown away that I get 120fps on ultra settings and super smooth, all at 1600….

    That leads me to my last little thought: is the jump to 1600 the same as the jump from 1080 to 1440? I know people think of this as 1K, 2K, 3K, 4K, so 1600 would be (poorly) labelled 3K. But I was wondering, technically, is it much of a jump from 1440, and is it therefore a lot closer to 4K? It seems to feel better to me, but it’s very hard to analyse.


    The 38GL950G has a 37.5″ panel with a 3840 x 1600 resolution. The ‘K’ (best avoided) assigned to this would technically be ‘4K’, simply because it refers to a slight rounding of the horizontal component of the resolution. It ignores the vertical component and also ignores the screen size, and hence pixel density. To really think about the experience the screen brings, you need to consider all of those things. The pixel density is very important, and that’s 110.93 PPI (Pixels Per Inch) – very similar to a 27″ 2560 x 1440 (WQHD) model @ 108.79 PPI or a 34″ 3440 x 1440 model @ 109.68 PPI. Very different from the distinct ‘4K’ look you’d get from a ~32″ or smaller ‘4K’ UHD model (~138 PPI).

    Now that’s by no means a poor pixel density – far from it. Many people are very happy with this kind of pixel density, which is why 27″ 2560 x 1440 models are so popular, why 34″ 3440 x 1440 are well-loved and why 37.5″ 3840 x 1600 models give such a nice overall experience. But it’s really the physical screen size that rounds off the experience in this case. You’ve got roughly the same height as a 30″ 16:9 monitor and significantly more width (comparison). Compared to a 34″ 21:9 model you’ve got some extra height and some extra width. So having that kind of pixel density spread across such an area (plus the number of pixels for productivity purposes) can certainly deliver some smiles. 😉
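    The pixel-density figures above can be reproduced with basic geometry – the diagonal resolution divided by the diagonal size (the sizes used are the usual nominal panel diagonals):

```python
# Reproducing the pixel-density (PPI) figures quoted above.

import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

monitors = {
    '37.5" 3840 x 1600': ppi(3840, 1600, 37.5),
    '27"   2560 x 1440': ppi(2560, 1440, 27),
    '34"   3440 x 1440': ppi(3440, 1440, 34),
    '32"   3840 x 2160': ppi(3840, 2160, 32),
}

for name, density in monitors.items():
    print(f"{name}: {density:.2f} PPI")
# 37.5" 3840 x 1600: 110.93 PPI
# 27"   2560 x 1440: 108.79 PPI
# 34"   3440 x 1440: 109.68 PPI
# 32"   3840 x 2160: 137.68 PPI
```

    The first three cluster around ~109-111 PPI, which is why they all share a similar desktop ‘look’, while the 32″ UHD model sits in a visibly different class.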


    Cheers, makes sense. I guess with the extra 2-3 inches vertically they had to make it 1600 to maintain the pixel density, in that case. The answer here is probably no, but if I made the image 34 inches could the pixel density increase – say in a game that has borders – or would it just remain the same?
