the__alchemist 3 days ago

So, HN, are HDR monitors worth it? I remember delaying my monitor purchase ~10 years ago for the HDR model that was supposedly right around the corner but never (at least within my purchasing scope) became available. Time for another look?

The utility of HDR (as described in the article) is without question. It's striking to look at an outdoor scene (or an indoor one with windows) with your Mk-1 eyeballs, then take a photo and look at it on a phone or PC screen: the picture fails to capture the lighting range your eyes see.

kllrnohj 3 days ago

HDR gaming: Yes.

HDR full screen content: Yes.

HDR general desktop usage: No. In fact, you'll probably actively dislike it to the point of just turning it off entirely. The ecosystem just isn't ready for this yet, although with things like the "constrained-high" concepts ( https://www.w3.org/TR/css-color-hdr-1/#the-dynamic-range-lim... ) this might, and hopefully will, change and improve to a more pleasing result.

Also, this is assuming an HDR monitor that's also a good match for your ambient environment. The big thing nobody really talks about with HDR is that it's really dominated by how dark you're able to get your surrounding environment, such that you can push your display "brightness" (read: SDR whitepoint) lower and lower. OLED HDR monitors, for example, look fantastic in SDR and fantastic in HDR in a dark room, but if you have typical office lighting and so you want an SDR whitepoint of around 200-300 nits? Yeah, they basically don't do HDR at all anymore at that point.
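To put rough numbers on that, here's a back-of-the-envelope sketch in Python. The 1000-nit peak is an assumed, illustrative figure rather than any particular monitor's spec; the point is just that the highlight headroom above the SDR whitepoint shrinks quickly as you raise the whitepoint for a bright room.

    # Rough sketch: HDR headroom (in stops) left above the SDR whitepoint.
    # The 1000-nit peak is an assumed, illustrative number.
    import math

    def headroom_stops(peak_nits: float, sdr_white_nits: float) -> float:
        """Stops of highlight range available above SDR white."""
        return math.log2(peak_nits / sdr_white_nits)

    peak = 1000.0
    for sdr_white in (100.0, 200.0, 300.0):
        print(f"SDR white {sdr_white:.0f} nits -> "
              f"{headroom_stops(peak, sdr_white):.2f} stops of HDR headroom")

At a 100-nit whitepoint that's about 3.3 stops of headroom for highlights; at 300 nits it's down to about 1.7, which is why the HDR effect mostly evaporates in a bright office.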

Sohcahtoa82 3 days ago

> HDR gaming: Yes.

The difference is absolutely stunning in some games.

In MS Flight Simulator 2024, switching from SDR to HDR takes it from looking like the computer game it is to looking life-like. Deeper shadows with brighter highlights make the scene pop in ways that SDR just can't.

EDIT: You'll almost certainly need an OLED monitor to really appreciate it, though. Local dimming isn't good enough.

wirybeige 3 days ago

I use HDR for general usage. Windows ruins non-HDR content when HDR is enabled due to its choice of the sRGB transfer function. Luckily, every Linux DE has chosen the gamma 2.2 transfer function instead, which looks fine for general usage.

I use a mini-LED monitor, and it's quite decent (except for starfields). That makes it very usable even in bright conditions, and HDR video still looks better in bright conditions than the equivalent SDR video.

https://github.com/dylanraga/win11hdr-srgb-to-gamma2.2-icm
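For anyone curious what that transfer-function mismatch (and the linked profile's fix) actually amounts to, here's a minimal sketch in Python comparing the sRGB piecewise curve with a pure 2.2 gamma curve near black. The formulas are the standard ones; the takeaway is that the sRGB toe emits several times more light at low code values, which is where the raised, grayish blacks come from.

    # Minimal sketch: sRGB piecewise EOTF vs. a pure 2.2 gamma curve near black.
    # Outputs are relative luminance (0..1) for a handful of 8-bit code values.

    def srgb_eotf(v: float) -> float:
        """Standard sRGB transfer function: linear toe + 2.4-power segment."""
        return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

    def gamma22_eotf(v: float) -> float:
        """Pure power-law gamma 2.2, what most desktop panels are tuned to."""
        return v ** 2.2

    for code in (2, 4, 8, 16, 32):
        v = code / 255
        print(f"8-bit code {code:2d}: sRGB {srgb_eotf(v):.6f}  gamma 2.2 {gamma22_eotf(v):.6f}")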

hbn 2 days ago

Windows HDR implementation is janky as hell. For months after I got my monitor I couldn't take screenshots because they'd all appear completely blown out, like you cranked the brightness to 300%.

Eventually I did some digging and found there's a setting in Snipping Tool that just... makes screenshots work on HDR displays.

It also seems to add another layer of Your Desktop Trying To Sort Its Shit Out when launching a full-screen game. Sometimes it's fine, but some games, like Balatro, will look fine at first, and then when you quit back to the desktop everything is washed out. Putting my PC to sleep and waking it back up seems to resolve this.

I recently played through Armored Core VI, which supports HDR, but whenever I adjust my volume the screen washes out while the volume slider is displayed. Screenshots and recordings also come out washed out in the resulting files.

qingcharles 2 days ago

Agreed. Wide gamut and HDR are janky as hell on Windows. I have a multi-monitor setup with one SDR and one HDR display, and that plays havoc with things. Even Microsoft's own apps aren't updated: I'm pretty certain Explorer still doesn't support HDR or wide gamut for thumbnails, so everything looks either under- or oversaturated in the previews. And if you open something in the default Photos app, there's a "flash of content" where it displays in the wrong profile before mapping to the correct one.

wirybeige 2 days ago

I always thought the "Your Desktop Trying To Sort Its Shit Out" part was a necessary evil, but other platforms don't suffer from this (at least from what I can tell). The state of HDR on Windows is very disappointing; even just adjusting the transfer function to gamma 2.2 would make it substantially better. Watching all your non-HDR content's blacks become gray is terrible. I assume the washed-out appearance comes from it giving up on doing the SDR->HDR mapping for the desktop.

My brother got an OLED monitor and was telling me how bad his experience was on Windows. He recently switched to Linux and no longer has the issues he was complaining about before. Of course, there are still downsides to HDR on Linux at the moment (no HDR in Chromium, HDR in Firefox is unfinished), but the foundation seems better set for it.

98codes 3 days ago

Every few years I turn on that HDR toggle for my desktop PC, and it never lasts longer than a day or two.

Top tip: if you have HDR turned on for your display in Windows (macOS not tested) and then share your screen in Teams, your screen will look weirdly dimmed to everyone not using HDR on their display, which is everyone.

arduinomancer 3 days ago

HDR on the desktop in Windows looks straight up broken on some HDR monitors I've tried

Like totally washed out

nfriedly 3 days ago

A lot of monitors that advertise HDR support really shouldn't. Many of them can decode the signal but don't have the hardware to accurately reproduce it, so you just end up with a washed-out, muddy-looking mess where you're better off disabling HDR entirely.

As others here have said, OLED monitors are generally excellent at reproducing an HDR signal, especially in a darker space. But they're terrible for productivity work because images that don't change much will burn in. They're fantastic for movies and gaming, though.

There are a few good non-OLED HDR monitors, but not many. I have an AOC Q27G3XMN; it's a 27" 1440p 180 Hz monitor that is good for entry-level HDR, especially in brighter rooms. It has over 1000 nits of brightness and no major flaws. It only has 336 backlight zones, though, so you might notice some blooming around subtitles or other fine details where dark and light content sit close together. (VA panels are better than IPS at suppressing that, though.) It's also around half the price of a comparable OLED.

Most of the other non-OLED monitors with good HDR support have some other deal-breaking flaw or at least major annoyances, like latency, screwing up SDR content, buggy controls, etc. The Monitors Unboxed channel on YouTube and rtings.com are both good places to check.

Sohcahtoa82 3 days ago

I think an OLED is basically an absolute necessity for HDR content.

My current monitor is an OLED, and HDR in games looks absolutely amazing. My previous one was an IPS that supported HDR, but turning it on caused the backlight to crank to the max, destroying black levels and basically defeating the entire purpose of HDR. Local dimming only goes so far.

vlmutolo 2 days ago

Modern mini-LED monitors are very good. The "local" dimming is so local that there isn't much light bleed even in worst-case situations (a cursor over a black background is where it's most apparent).

The advantage of LEDs is they’re brighter. For example, compare two modern Asus ProArt displays: their mini-LED (PA32UCXR) at 1600 nits and their OLED (PA32DC) at 300ish nits. The OLED is 20% more expensive. These two monitors have otherwise comparable specs. Brightness matters a lot for HDR because if you’re in a bright room, the monitor’s peak brightness needs to overpower the room.

Plus, for color-managed work, I think LED monitors are supposed to retain their calibration well. OLEDs have to be recalibrated frequently.

And so-called micro-LEDs are coming soon, which promise to make “local” so small that it’s imperceptible. I think the near-term future of displays is really good LEDs.

nfriedly 3 days ago

> My previous was an IPS that supported HDR, but turning it on caused the backlight to crank to the max, destroying black levels and basically defeating the entire purpose of HDR

Yeah, that's kind of what I meant when I said that most monitors that advertise HDR shouldn't.

The AOC monitor is the third or fourth one I've owned that advertised HDR, but the first one that doesn't look like garbage when it's enabled.

I haven't gone OLED yet because of both the cost and the risk of burn-in in my use case (lots of coding and other productivity work, occasional gaming).

mxfh 2 days ago

Avoid cranking the OLED brightness above 70% for static content, and absolutely never drive SDR reds into the HDR range with fake HDR modes while the brightness is high.

I have a 2018 LG OLED with some burnt-in Minecraft hearts because of that. Not from Minecraft itself, but from just a few hours of Minecraft YouTube videos at those settings in the built-in YouTube client. Otherwise there are virtually no detectable issues after years of heavy use with static content.

You only see them against fairly uniform background colors, where color banding would usually be my bigger complaint.

So burn-in definitely happens, but it's far from a deal breaker given the obvious benefits over other types of displays.

And running everything you can in dark mode (white text on a dark background) on these displays is the logical thing to do anyway. Then you don't need much max brightness and you even save some energy.

eschatology 3 days ago

Yes, but with asterisks. The best way I can describe it:

You know the 0-10 brightness slider you have to pick at the start of a game? Imagine setting it to 0 and still being able to spot the faint dark spot. The dynamic range of things you can see is so much expanded.

Early HDR screens were very limited (few dimming zones, buggy implementations), but if you get one from 2024 or later (especially the OLED ones) they are quite decent. However, HDR needs to be supported at many layers: not just the monitor, but also the operating system and the content. There are not many games with a proper HDR implementation, and even when there is one, it may be bad and look worse. The OS can also hijack the rendering pipeline and provide an HDR mapping for you (Nvidia RTX HDR), which is a gamble: it may look bleh, but sometimes it's also better than the game's native HDR implementation.

But when everything works properly, wow it looks amazing.

kllrnohj 3 days ago

> You know the 0-10 brightness slider you have to pick at the start of a game? Imagine setting it to 0 and still being able to spot the faint dark spot. The dynamic range of things you can see is so much expanded.

Note that HDR only actually changes how bright things can get. There's zero difference in the dark regions. This is made confusing because HDR video marketing often claims otherwise, but it isn't so. HDR monitors do not, in general, have any advantage over SDR monitors in terms of the darks. Local dimming zones improve dark contrast. OLED improves dark contrast. Dynamic contrast improves dark contrast. But HDR doesn't.

eschatology 3 days ago

My understanding is that in darker scenes (say, 0 to 5 in the brightness slider example), there is a difference in luminance values with HDR but not with SDR, so there is increased contrast and detail.

This matches my experience; 0 to 5 look identically black if I turn off HDR

kllrnohj 3 days ago

You may have a monitor that only enables local dimming zones when fed an HDR signal but not when fed an SDR one, but that would be unusual and certainly not required. And it's likely something you could change in your monitor's controls. On something like an OLED, though, there's no difference in the darks. You might see a difference between 8-bit and 10-bit, depending on what "0 to 5" means, but 10-bit SDR is absolutely a thing (it even predates HDR).

But if you can't see a difference between 0 and 5 in a test pattern like this https://images.app.goo.gl/WY3FhCB1okaRANc28 in SDR but you can in HDR, then that just means your SDR factory calibration is bad, or you've fiddled with settings that broke it.
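If anyone wants to generate that kind of near-black pattern themselves rather than hunt one down, here's a minimal sketch in Python (assuming numpy and Pillow are installed; the bar sizes and levels are arbitrary):

    # Minimal sketch: write an 8-bit grayscale strip of near-black steps (codes 0-5)
    # to check whether a display/calibration can actually separate them.
    import numpy as np
    from PIL import Image

    levels = [0, 1, 2, 3, 4, 5]      # 8-bit code values near black
    bar_w, bar_h = 160, 400          # pixel size of each vertical bar
    bars = [np.full((bar_h, bar_w), lvl, dtype=np.uint8) for lvl in levels]
    Image.fromarray(np.concatenate(bars, axis=1), mode="L").save("near_black_steps.png")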

SomeoneOnTheWeb 3 days ago

If you have a display that can hit roughly 1000 nits, then for movies and games, yes, definitely: the difference from SDR is pretty huge.

If you have, say, a 400-nit display, HDR may actually look worse than SDR. So it really depends on your screen.

simoncion 3 days ago

Honestly, I find the extended brightness FAR less important than the extended color gamut. I have a ~300 nit VA monitor that I'm generally quite happy with and that looks fantastic with well-built HDR renderers.

Given that monitors report information about their HDR minimum and maximum panel brightness capabilities to the machine they are connected to, any competently-built HDR renderer (whether that be for games or movies or whatever) will be able to take that information and adjust the picture appropriately.
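As a concrete illustration of "take that information and adjust the picture", here's a minimal sketch in Python of a brightness-aware tone mapper. The Reinhard-style rolloff and the panel numbers are illustrative stand-ins (real renderers typically use fancier curves such as BT.2390); the point is just that scene luminance gets mapped into whatever range the display reports.

    # Minimal sketch: map scene-referred luminance into the display's reported range.
    # display_min/display_max would come from the monitor's metadata; the
    # Reinhard-style rolloff is an illustrative stand-in for a real curve.

    def tonemap_nits(scene_nits: float, display_min: float, display_max: float) -> float:
        x = scene_nits / display_max
        rolled = x / (1.0 + x)           # smooth highlight rolloff toward 1.0
        return display_min + (display_max - display_min) * rolled

    # e.g. a ~300-nit panel like the one mentioned above (black level assumed at 0.05 nits)
    for nits in (100, 300, 1000, 4000):
        print(f"scene {nits:5d} nits -> display {tonemap_nits(nits, 0.05, 300.0):7.2f} nits")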

Jhsto 3 days ago

I've been thinking of moving out of the Apple ecosystem, but after seeing Severance on my iPhone Pro screen, I feel like I want to keep the option of having the same HDR experience for movies specifically. With HDR support landing in Linux just a month ago, I'm inclined to spend on a good monitor. However, I have an IPS HDR 600 monitor, and I never felt that its screen was as glorious as the iPhone's.

I'd also be interested in hearing whether it makes sense to look into OLED HDR 400 screens (Samsung, LG), or whether it's really necessary to get an Asus ProArt that can push the same 1000-nit average as the Apple XDR display (which, mind you, is IPS).

esperent 3 days ago

I think it depends on the screen and also what you use it for. My OLED is unusable for normal work in HDR because it's designed around only a small portion of the screen being at max brightness. That's reasonable for a game or movie, but the result is that a small window with a white background looks really bright, while if I maximize it, it looks washed out: grey, not white.

Also, the maximum brightness isn't even that bright at 800 nits, so HDR content doesn't really look that different. I think newer OLEDs are brighter, though. I'm still happy with the screen in general; even in SDR the OLED really shines. But it made me aware that not all HDR screens are equal.

Also, in my very short experiment using HDR for daily work I ran into several problems, the most serious of which was the discovery that you can no longer just screenshot something and expect it to look the same on someone else's computer.

simoncion 3 days ago

> ...the most serious of which was the discovery that you can no longer just screenshot something and expect it to look the same on someone else's computer.

To be pedantic, this has always been the case... Who the hell knows what bonkers "color enhancement" your recipient has going on on their end?

But (more seriously) it's very, very stupid that most systems out there will ignore color profile data embedded in pictures (and many video players ignore the same in videos [0]). It's quite possible to tone-map HDR stuff so it looks reasonable on SDR displays, but color management is like accessibility in that nearly no one who's in charge of paying for software development appears to give any shits about it.

[0] A notable exception to this is MPV. I can't recommend this video player highly enough.
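On the embedded-profile point: it's at least easy to check whether an image carries a profile at all. A minimal sketch with Pillow ("screenshot.png" is just a placeholder path):

    # Minimal sketch: check whether an image file carries an embedded ICC profile.
    from PIL import Image

    with Image.open("screenshot.png") as im:
        icc = im.info.get("icc_profile")
        if icc:
            print(f"embedded ICC profile present ({len(icc)} bytes)")
        else:
            print("no embedded ICC profile; viewers will just assume something (usually sRGB)")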

pornel 2 days ago

HDR when it works properly is nice, but nearly all HDR LCD monitors are so bad, they're basically a scam.

The high-end LCD monitors (with full-array local dimming) barely make any difference, while you'll get a lot of downsides from bad HDR software implementations that struggle to get the correct brightness/gamma and saturation.

IMHO HDR is only worth viewing on OLED screens, and it requires a dimly lit environment. Otherwise either the hardware is not capable enough, or the content is mastered for the wrong brightness levels, and the software trying to fix that makes it look even worse.

SebastianKra 3 days ago

Apple's displays, yes. But I got a Philips 4K OLED recently, and I'm already regretting that decision. I need to turn it off every 4 hours to refresh the pixels. Sometimes an entire line of pixels is brighter than the rest. I wiped it with a cloth while the pixel refresh was running, and then saw burned-in streaks in the direction of the wipe.

And that's now, while all the LEDs are still fresh. I can't imagine how bad it will be in a few years.

Also, a lot of software doesn't expect the subpixel arrangement, so text will often look terrible.

aethrum 3 days ago

For gaming, definitely. An HDR OLED monitor is so immersive.

baq 3 days ago

I've had an OLED TV since 2017 and the answer is a resounding yes... if you get an OLED and use it for movies or full screen gaming. Anything else is basically pointless.

For desktop work, don't bother unless your work involves HDR content.

whywhywhywhy 3 days ago

Dunno if it's just my screen or setup, but on Windows I have a Dell U4025QW and HDR on the desktop just looks strange, overly dull. It looks good in games, but I have to manually turn it on and off on the monitor each time.

On my MacBook Pro, HDR only activates when it needs to, but honestly I've only seen one video [1] that impressed me with it; the rest was completely meh. Not sure if that's because it's mostly iPhone photography you see in HDR, which is overall pretty meh-looking anyway.

[1] https://www.youtube.com/watch?v=UwCFY6pmaYY - I understand this isn't a true HDR process but someone messing with it in post, but it's the only video I've seen that noticeably shows colors you can't otherwise see on a screen.

qingcharles 2 days ago

For the love of the gods, read the reviews, though. Most HDR displays will make the picture worse, not better, because they've only implemented enough to put a sticker on the box.

You have to spend really good money to get a display which does HDR properly.

EasyMark 2 days ago

For movies, yes. For vim/vscode, nope.