this post was submitted on 05 Nov 2023
203 points (94.7% liked)

From improvements in the efficiency of OLED materials to software developments and new testing techniques, OLED burn-in risk has been lowered. OLED monitors are generally a more sound investment than ever—at least for the right person.

all 37 comments
[–] ShortFuse@lemmy.world 91 points 1 year ago* (last edited 1 year ago) (2 children)

The TL;DR is that pixels now get tracked for how long they've been lit. The device can then evenly wear down the other pixels so usage stays uniform. The trade-off is that you lose some max brightness in the name of screen uniformity.
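Roughly, the idea looks like this — an illustrative sketch with made-up numbers and names, not any vendor's actual firmware:

```python
# Illustrative sketch of per-pixel wear tracking and wear leveling.
# Shapes, names, and the degradation model are all hypothetical.
import numpy as np

class WearCompensator:
    def __init__(self, height=2160, width=3840):
        # Accumulated wear per pixel: seconds lit, weighted by drive level (0..1).
        self.wear = np.zeros((height, width), dtype=np.float64)

    def accumulate(self, frame, seconds):
        """Track wear for one displayed frame (frame = per-pixel drive levels, 0..1)."""
        self.wear += frame * seconds  # brighter pixels age faster

    def compensation_gain(self):
        """Per-pixel gain that evens the panel out.

        The most-worn pixel sets the ceiling: everything else is dimmed to
        match it, which is exactly why max brightness drops over time.
        """
        degradation = 1.0 - 1e-8 * self.wear  # fraction of original output left (toy model)
        worst = degradation.min()             # dimmest (most worn) pixel
        return worst / degradation            # gain <= 1 everywhere

    def apply(self, frame):
        """Scale a frame so perceived luminance stays uniform despite uneven wear."""
        return frame * self.compensation_gain()
```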

The other point is a shifting of the TFT layer that people think is burn-in but is actually image retention. That one is solved by the TVs' maintenance cycles. Just hope the compensation cycle actually runs, since some panels fail to run them.

Check out this RTings video for a good overview of lots of different TV brands and how they perform.

PS: Burn-in is actually a misnomer from the CRT era. There's nothing burning in; the pixels are burning (wearing) out.

[–] Chetzemoka@startrek.website 7 points 1 year ago* (last edited 1 year ago) (1 children)

Thank you for the summary. My takeaway is: So you're saying I should still get a mini LED TV

[–] ShortFuse@lemmy.world 7 points 1 year ago* (last edited 1 year ago) (1 children)

I have both:

  • an 85" TCL R655 with a bunch of dimming zones that works great in my sunlight-heavy living room for both daytime viewing and family movie night.

  • a 55" LG C1 in my gaming/home-office/theater room with blackout curtains that is great for PC gaming and awesome theater experience.

I would say it depends on your viewing environment. The inability of an OLED to get bright can ruin the experience. But my game room has blackout curtains and it's enclosed.

I just recently moved from 34" Ultrawide to just mounting the 55" onto my desk. It's oversized for my viewing distance, but 4K resolution is 8million pixels so I rarely run apps in or near fullscreen anymore. I think a 42" LG OLED is perfect for PC. (Great out of the box calibration and 120hz G-Sync). Though QD-OLED on Samsung are technically better, I don't trust them to run compensation cycles.

If you're worried about burn-in on PC, just set a screensaver to blank your screen after 2 to 5 minutes. That's why screensavers were invented anyway (CRT era). For regular media consumption it's a non-issue. Rtings displayed a static image for 120 hours on a Sony OLED and it basically went away after one compensation cycle.

[–] amenotef@lemmy.world 2 points 1 year ago* (last edited 1 year ago) (1 children)

I have a budget Samsung 55" NU7400 and I can't see shit while playing a PS5 game with HDR during the day. I need to close the blackout curtains otherwise I see my face reflected.

Next TV I buy I will do some research and spend a bit more money, 120Hz, more nits, VRR, etc.

[–] ShortFuse@lemmy.world 2 points 1 year ago* (last edited 1 year ago) (2 children)

NU7400 has a peak of 337 nits and that's with the poorer contrast ratio of LCD. My LG C1 is 780 nits. I still find it a bit weak with the lights on so I can't imagine 330 on LCD.

Yeah, HDR is meant to be watched in a 5-nit environment, but sometimes that's just not reasonable. While my LG is technically better, bright TV shows like Rings of Power are more enjoyable with the 1500 nits my TCL can output. Once that ABL (Automatic Brightness Limiter) kicks in for the OLED, you absolutely need the blackout curtains.
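Roughly, ABL works like this — an illustrative sketch, with the curve and nit figures made up rather than taken from any real panel:

```python
# Illustrative sketch of an Automatic Brightness Limiter (ABL).
# The linear curve and nit figures are made up; real panels use their own
# power/thermal models.

def abl_peak_nits(average_picture_level: float) -> float:
    """Cap peak brightness based on how much of the frame is lit.

    average_picture_level: 0.0 (black frame) .. 1.0 (full-screen white).
    Small bright highlights get the full peak; a mostly-bright frame
    gets clamped hard, which is when you want the blackout curtains.
    """
    PEAK_NITS = 780        # small-window peak (hypothetical, OLED-ish)
    FULLSCREEN_NITS = 150  # sustained full-screen white (hypothetical)
    return PEAK_NITS - (PEAK_NITS - FULLSCREEN_NITS) * average_picture_level

print(abl_peak_nits(0.05))  # small highlight: ~748 nits
print(abl_peak_nits(0.90))  # bright daytime scene: ~213 nits
```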

[–] amenotef@lemmy.world 1 points 1 year ago

One thing I also noticed is that my monitor (an LG 27GL850-B, around 350 nits I think) is much easier and clearer to see in direct sunlight because of the anti-glare screen.

But I doubt anti-glare/matte displays are a thing you find on TVs.

[–] amenotef@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

Thanks for the hints. So that means that in a bright room, a TV with 1500+ nits is ideal for HDR right?

But even with a 1500 nits TV, HDR will be still much better in a dark room? (Where OLED shines?).

[–] PipedLinkBot@feddit.rocks 5 points 1 year ago

Here is an alternative Piped link(s):

this RTings video

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source; check me out at GitHub.

[–] Wahots@pawb.social 75 points 1 year ago* (last edited 1 year ago)

"If you're a consumer planning to use an OLED monitor for gaming for two to three years, it's a good choice. Beyond that, we don't yet have enough real-world data to make a definitive judgment," Karatsevidis said.

I didn't like the article that much, since it kinda rides on the assumption that people replace monitors every three years, which most won't do.

Most people won't turn on any non-default settings to mitigate wear. They'll roll light mode, won't turn down the brightness, won't turn on savers, and will leave Spotify on while the taskbar is displayed. 5-8 years of use later, that will probably amount to uneven wear on the panel, making it more likely to go to a landfill rather than be sold secondhand for a new lease on life.

[–] Whirling_Cloudburst@lemmy.world 38 points 1 year ago (1 children)

This is a good read.

On a side note: Anyone remember the story of the guy that went on vacation and his buddy watching the place left the gay porn on pause on the plasma screen as a joke?

[–] Kecessa@sh.itjust.works 26 points 1 year ago* (last edited 1 year ago) (1 children)
[–] cheese_greater@lemmy.world 3 points 1 year ago* (last edited 1 year ago) (1 children)

This is gold

Edit: or gOLEd

[–] Buddahriffic@lemmy.world 0 points 1 year ago

Pranks involving permanent damage aren't gold. Unless OP did something to deserve this, fuck that roommate.

[–] vzq@lemmy.blahaj.zone 38 points 1 year ago (2 children)

I’m still using a monitor from 2010 on a daily basis. This consumerist throw away bullshit can go crawl back to the 20th century and die.

[–] mild_deviation@programming.dev 3 points 1 year ago (1 children)

CCFL-lit LCDs are so inefficient compared to modern LED-lit LCDs that you've probably spent enough extra on electricity by now to have paid for a more efficient monitor.

I can't speak to the environmental impact, though. Producing the new monitor emitted some amount of CO2, and powering each monitor takes some amount of CO2 per unit time. At some amount of use, the newer monitor will have lower lifetime CO2 generation than your old monitor.

[–] turmacar@lemmy.world 3 points 1 year ago

Not OP, but my electricity is <$0.10 / kWh because of where I live. It seems like it would take much more than 13 years to hit the break-even point on upgrading the monitor just because of energy efficiency.
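Rough back-of-the-envelope — the wattages, hours, and monitor price below are assumptions for illustration; only the ~$0.10/kWh rate comes from above:

```python
# Back-of-the-envelope break-even estimate for replacing a CCFL-backlit LCD
# with an LED-backlit one. Wattages, hours, and monitor price are assumed.

OLD_CCFL_WATTS = 50     # rough figure for a 2010-era CCFL-backlit LCD (assumed)
NEW_LED_WATTS = 20      # rough figure for a modern LED-backlit LCD (assumed)
HOURS_PER_DAY = 8
PRICE_PER_KWH = 0.10    # dollars, from the comment above
NEW_MONITOR_COST = 150  # dollars (assumed)

kwh_saved_per_year = (OLD_CCFL_WATTS - NEW_LED_WATTS) / 1000 * HOURS_PER_DAY * 365
dollars_saved_per_year = kwh_saved_per_year * PRICE_PER_KWH
years_to_break_even = NEW_MONITOR_COST / dollars_saved_per_year

print(f"~{kwh_saved_per_year:.0f} kWh/year saved, ~${dollars_saved_per_year:.2f}/year")
print(f"Break-even after ~{years_to_break_even:.0f} years")  # ~17 years with these numbers
```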

Even if the newer monitor has less of a lifetime environmental impact, throwing out the old still working one is still wasteful. It's already made and working. Using it longer lessens your environmental impact. If you repair the old one when it eventually breaks, that's still less of an impact than an extra ~20% electricity usage. Especially since electricity generation is getting greener all the time.

[–] cmnybo@discuss.tchncs.de 32 points 1 year ago (3 children)

Burn-in will always be a problem; you can't get rid of it. Sure, there are ways to minimize it and monitors can try to hide it, but eventually you will have a taskbar, window borders, and desktop icons burned into the screen.

[–] tun@lemmy.world 21 points 1 year ago

True.

I still have my 1080p LCD monitor from the 2010s working fine.

According to the article, OLED has about a 5% chance of burn-in after 2 years. The article also mentions Rtings' test simulating 10 years of usage for OLED monitors.

[–] narc0tic_bird@lemm.ee 9 points 1 year ago (1 children)

It's in the "nature" of OLED that it eventually wears down. My understanding is that technically, it's not burning in, but burning out, and what's perceived as burn-in is irregular wear of the different color channels or different brightness of the individual pixels (especially with HDR content).

[–] AProfessional@lemmy.world 5 points 1 year ago (1 children)

It's the nature of all self-emissive displays, even microLEDs as they become more common.

[–] narc0tic_bird@lemm.ee 2 points 1 year ago

Sure, but it's more pronounced on OLED displays. We'll see how microLED ages once we get some mainstream panels, but since most other display technologies are evenly backlit over the whole display area instead of having pixels emit their own light, the backlight wears evenly and the subpixels/color channels don't wear at different rates.

[–] EncryptKeeper@lemmy.world 2 points 1 year ago

That’s true, but at the same time LED TVs have a huge problem with bloom issues that are essentially a lottery because most manufacturers don’t consider it an actual defect and won’t replace it.

[–] Cossty@lemmy.world 21 points 1 year ago* (last edited 1 year ago) (1 children)

I will not use OLED monitors with a desktop PC. With my usage, I would have burn-in within 2 years, if not sooner. Even aside from that, I would still have that thought in my mind that if I use it more, I'll get burn-in, and that any time I leave the PC, even for a minute, I should turn it off. I like good contrast and blacks, so my next monitor will probably be a good VA with local dimming.

[–] hedgehog@ttrpg.network 2 points 1 year ago

On the upside, if you burn in an LG OLED in under two years, it’s covered under the two year warranty. (I didn’t check other manufacturers’ policies; some might do the same thing.)

I have a laptop with an OLED screen and I have that same thought every time I use it undocked. The screen’s very pretty, though.

[–] Send_me_nude_girls@feddit.de 20 points 1 year ago* (last edited 1 year ago) (2 children)

I'm not going to change my habits for a monitor. Hiding the taskbar is annoying, as Windows has a habit of randomly not showing it.

Also, there will be static elements on it for 16+ hours at least on weekends, and 8 to 13 during the week. Some buttons are bright, some orange.

Brightness can't be lowered much, as I don't have many options to mitigate the sun unless I fully cover the window (bright reflections off neighboring houses at different times of day + direct sun + mirrors on the walls, etc.).

What if I do a 48h gaming session? Can I throw it in the trash afterwards?

[–] deur@feddit.nl 2 points 1 year ago (1 children)

Could try to adapt your gaming sessions to include short breaks to help prevent injury, and grab a snack maybe. 10 minute breaks every hour (or few hours :) ) where you turn the monitor off may help?

[–] EncryptKeeper@lemmy.world 2 points 1 year ago

That’s just far more thought than anybody should be putting into monitor usage.

[–] milkjug@lemmy.wildfyre.dev 2 points 1 year ago

Same, it's the biggest annoyance that's putting me off an OLED at the moment. I don't like the idea of having to baby my things and fret over small, meaningless details with kid gloves.

That and also because DP 2.1 still isn’t a thing in 2023 and only God knows why.

[–] revoopy@programming.dev 13 points 1 year ago (1 children)

I only read the headline, but isn't part of it WOLED? Using dedicated white subpixels reduces the workload of the other subpixels.

[–] tun@lemmy.world 8 points 1 year ago

According to the article, QD-OLED can also mitigate burn-in using firmware and algorithms.

Counting hours of use, wear-leveling against the most-worn pixels, reducing luminosity, and reserving extra luminosity that pixels can draw on once burn-in appears.
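A rough sketch of that reserved-luminosity idea (purely illustrative; the numbers are made up):

```python
# Illustrative sketch of reserved brightness headroom. The panel never drives
# pixels at 100% out of the box, so worn pixels can be pushed harder later to
# hold their original brightness. All numbers are hypothetical.

HEADROOM = 0.10            # 10% of the drive range held in reserve (assumed)
BASE_DRIVE = 1.0 - HEADROOM

def drive_level(target_luminance: float, wear_loss: float) -> float:
    """Drive level needed to hit the target output despite wear.

    target_luminance: 0..1 fraction of the panel's advertised peak.
    wear_loss: fraction of output the pixel has lost (0.0 = new, 0.08 = 8% dimmer).
    """
    needed = BASE_DRIVE * target_luminance / (1.0 - wear_loss)
    return min(needed, 1.0)  # once the headroom runs out, burn-in becomes visible

print(drive_level(1.0, 0.00))  # new pixel: 0.90
print(drive_level(1.0, 0.08))  # worn pixel: ~0.98, still fully compensated
print(drive_level(1.0, 0.15))  # too worn: clamped at 1.0, visibly dimmer now
```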

[–] autotldr@lemmings.world 7 points 1 year ago

This is the best summary I could come up with:


People tend to display static images on computer monitors more frequently than on TVs—things like icons, taskbars, and browser address bars—making burn-in risk a concern.

"Industry chatter," Dough co-founder Konstantinos Karatsevidis told me, showed that burn-in affected "around 5 percent of users" after two years.

The latest models have improved materials and firmware that make them significantly more resistant to burn-in than they were years ago.

Roland Wooster, chair of VESA’s Display Performance Metrics Task Group, told me that physical design changes have also helped.

By counting how long each subpixel is displayed and at what brightness, a "wear level" can be determined for each pixel; using an algorithm to estimate the luminance degradation, this can then be compensated for.

The companies that make monitors can implement a range of firmware, software, and hardware techniques to help fight burn-in.


The original article contains 656 words, the summary contains 138 words. Saved 79%. I'm a bot and I'm open source!

[–] LoafyLemon@kbin.social 2 points 1 year ago

My 2009 LCD panel still works perfectly and has been repurposed as a dining room TV. While it may not excel in reproducing black levels, it continues to function just as it did when I first purchased it. I am not going to bother with OLED if it means having to replace the screen every 2-3 years.