November 27, 2024 at 6:20 pm #76681
EsaT
Not sure how many have found the "worst case torture testing" done by that channel, but it has now reached the 9-month mark. While this wear certainly isn't a good thing, it's definitely positive that the increase since the 6-month mark is minimal. And that monitor is under high stress at a very high brightness of 200 nits.

I'm not really sure what the actual nits value is, but I'm running my Acer X32 FP below the 25% brightness setting. So it's pretty likely a good amount lower, and in my use the wear would be significantly slower.

One big question, though, is whether the rate of pixel wear scales linearly with brightness, or exponentially/logarithmically. Linear would be the safer assumption, but I could see wear increasing at an exponential rate, at least when pushing brightness in a warmer room.

November 27, 2024 at 6:27 pm #76683
PCM2

Yeah, I've been following this series of testing with interest and I feel the latest results are quite reassuring. I'd also be interested in seeing how things would fare at a lower brightness. Most people won't be running at maximum brightness (roughly 250 nits for a QD-OLED), so they can expect better results because of this – but exactly how much better is a great question.
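For a rough sense of the difference, here's a quick back-of-envelope sketch in Python. Both the 120-nit figure and the exponent are pure placeholders (the X32 FP's actual output below 25% brightness isn't known); it just shows how much the linear vs power-law assumption matters:

TEST_NITS = 200.0  # brightness used in the torture test
MY_NITS = 120.0    # hypothetical output below the 25% brightness setting
N = 2.0            # assumed power-law acceleration exponent

linear_wear = MY_NITS / TEST_NITS        # wear proportional to luminance
power_wear = (MY_NITS / TEST_NITS) ** N  # wear proportional to luminance**N

print(f"linear model:       {linear_wear:.0%} of the test's wear rate")
print(f"power law (n={N:g}): {power_wear:.0%} of the test's wear rate")

Under the linear model that lower brightness cuts wear to 60% of the test's rate; under a power law with n = 2 it drops to 36%.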
November 27, 2024 at 8:54 pm #76685
EsaT

Found one article hinting that wear accelerates faster than linearly with brightness:
How does this factor into brightness? You can’t avoid degradation, but that process is sped up by feeding the diodes more current and increasing their brightness. “You basically make the choice of either going brighter or having a longer lifetime,” says Qiu. “It’s typically a trade-off.”
As an example, Qiu pointed to theoretically increasing the brightness by 20%. “The end consumers want it to be 20% brighter, right? So it could be, instead of it needing to be 20% brighter, it could have 1.5x longer lifetime.”
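Qiu's example actually lets you back out a rough acceleration exponent. Assuming lifetime scales as 1/luminance^n (a common form for OLED lifetime models; the article doesn't state the formula), "20% brighter" trading against "1.5x longer lifetime" means 1.2^n = 1.5:

import math

# Assumed model: lifetime ~ 1 / luminance**n. Qiu's trade-off then
# pins down the exponent: 1.2**n = 1.5  =>  n = ln(1.5) / ln(1.2)
n = math.log(1.5) / math.log(1.2)
print(f"implied acceleration exponent: n ~ {n:.2f}")  # about 2.22

So under that model wear grows noticeably faster than linearly with brightness (n over 2), which fits the suspicion above, though strictly speaking it's a power law rather than a true exponential.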