HDR on displays is actually largely uncomfortable for me. They should reserve the brightest HDR whites for things like the sun itself and caustics, not white walls in indoor photos.
As for tone mapping, I think the examples they show lean way too far toward a flat, low-local-contrast look for my taste.
Most "HDR" monitors are junk that can't display HDR. The HDR formats/signals are designed for brightness levels and viewing conditions that nobody uses.
The end result is complete chaos: every piece of the pipeline does something wrong, and then the software tries to compensate by emitting doubly wrong data, without even having reliable information about what it needs to compensate for.
https://docs.google.com/document/d/1A__vvTDKXt4qcuCcSN-vLzcQ...
What we really need is a standard that everybody follows. The reason normal displays work so well is that everyone settled on sRGB, and as long as a display gets close to that, say 95% sRGB coverage, everyone except maybe a few graphic designers will have an equivalent experience.
But HDR is a minefield of different display qualities, color spaces, and standards. It's no wonder that nobody gets it right and everyone feels confused.
HDR on a display with a peak brightness of 2000 nits will look completely different than on a display with 800 nits, and they both get to claim they're HDR.
We should have a standard equivalent to color spaces. Set, say, 2000 nits as 100% of HDR. Then a 2000-nit display gets to claim it's 100% HDR, an 800-nit display gets to claim 40% HDR, and so on. A 2500-nit display could even use 125% HDR in its marketing.
It's still not perfect - some displays (OLED) can only show peak brightness over a portion of the screen. But it would be an improvement.
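Concretely, the labeling proposed above is just a ratio against a chosen reference peak. A tiny Python sketch (the 2000-nit reference is the hypothetical number from the proposal, not any existing standard):

    # Hypothetical "% HDR" rating, using the 2000-nit reference proposed above.
    REFERENCE_PEAK_NITS = 2000.0

    def hdr_rating(peak_nits: float) -> str:
        """Express a display's peak brightness as a percentage of the reference."""
        return f"{peak_nits / REFERENCE_PEAK_NITS * 100:.0f}% HDR"

    print(hdr_rating(2000))  # 100% HDR
    print(hdr_rating(800))   # 40% HDR
    print(hdr_rating(2500))  # 125% HDR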
The DisplayHDR standard is supposed to be exactly that, but they've ruined its reputation by allowing HDR400 to exist when HDR1000 should have been the minimum.
Besides, HDR quality is more complex than just max nits, because it depends on viewing conditions and black levels (and everyone cheats with their contrast metrics).
OLEDs can peak at 600 nits and look awesome in a pitch-black room; LCD monitors can boost to 2000 nits and still only manage white on grey.
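To see why peak nits alone is misleading, a rough back-of-the-envelope contrast calculation helps. The black levels below are illustrative assumptions (OLED pixels switch essentially all the way off, a typical non-local-dimming LCD sits around 1000:1 native contrast), not measurements of any particular display:

    # Rough illustration: contrast ratio = peak luminance / black level (both in nits).
    # The black levels are assumptions for the sake of the example, not measured data.
    def contrast_ratio(peak_nits: float, black_nits: float) -> float:
        return peak_nits / max(black_nits, 1e-4)  # clamp so a "perfect" black doesn't divide by zero

    oled = contrast_ratio(600, 0.0005)   # OLED: pixels switch off almost completely
    lcd = contrast_ratio(2000, 2.0)      # ~1000:1 native-contrast LCD boosted to 2000 nits
    print(f"OLED ~{oled:,.0f}:1")        # ~1,200,000:1 -> looks awesome in a dark room
    print(f"LCD  ~{lcd:,.0f}:1")         # ~1,000:1     -> bright whites on greyish blacks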
We have sRGB kinda working for color primaries and gamma, but it's not the real sRGB at 80 nits. It ended up being relative instead of absolute.
A lot of the mess is caused by the need to adapt content mastered for a pitch-black cinema at 2000 nits down to 800-1000 nits in daylight. That needs very careful processing to preserve highlights and saturation, but software can't rely on the display doing it properly, and doing it in software sends a false signal and risks the display correcting it a second time.
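As a sketch of what that careful processing involves, here's a toy highlight rolloff, assuming luminance in nits and a simple knee plus a Reinhard-style curve. It is not the BT.2390 EETF or any shipping tone mapper, and it ignores the saturation-preservation part (real pipelines work per channel or in something like ICtCp):

    # Toy highlight rolloff: map master luminance (nits) onto a dimmer display.
    # Values below the knee pass through untouched; the range above the knee is
    # compressed smoothly so the master's peak lands exactly at the display's peak.
    def compress_highlights(lum: float,
                            master_peak: float = 2000.0,
                            display_peak: float = 1000.0,
                            knee: float = 0.75) -> float:
        k = knee * display_peak              # start of the compressed region
        if lum <= k:
            return lum                       # shadows and midtones are preserved
        x = (lum - k) / (master_peak - k)    # 0..1 across the range being squeezed
        rolloff = 2.0 * x / (1.0 + x)        # simple Reinhard-style curve, reaches 1 at x = 1
        return k + rolloff * (display_peak - k)

    for nits in (100, 500, 750, 1500, 2000):
        print(nits, "->", round(compress_highlights(nits)))  # 2000 lands exactly at 1000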
HDR is apparently really hard to get right, and it seems to get even worse in video games.
I'm a huge fan of Helldivers 2, but playing the game in HDR gives me a headache: the muzzle flash of high-RPM weapons on a screen that goes to 240 Hz is basically a continuous flashbang for my eyes.
For a while, No Man's Sky in HDR mode was basically the color saturation of every planet dialed up to 11.
The only game I've enjoyed in HDR was a console port, Returnal. Its use of HDR brights was minimal and tasteful, often reserved for certain particle effects.
For a year or two after it launched, The Division 2 was a really, really good example of HDR done right. The game had (has?) a day/night cycle, and it kept really good control of brightness throughout the day. More importantly, it made very good use of the wide color gamut available to it.
I stopped playing that game for several years, and when I went back to it, the color and brightness had been wrecked to all hell. I have heard that it's received wisdom that gamers complain that HDR modes are "too dark", so perhaps that's part of why they ruined their game's renderer.
Some games that I think currently have good HDR:
* Lies of P
* Hunt: Showdown 1896
* Monster Hunter: World (if you increase the game's color saturation a bit from its default settings)
Some games that had decent-to-good HDR the last time I played them, a few years ago:
* Battlefield 1
* Battlefield V
* Battlefield 2042 (If you're looking for a fun game, I do NOT recommend this one. Also, the previous two are probably chock-full of cheaters these days.)
I found Helldivers 2's HDR mode to have blacks that were WAY too bright. In SDR mode, nighttime in forest areas was dark. In HDR mode? It was as if you were standing in the middle of a field during a full moon.
> I have heard that it's received wisdom that gamers complain that HDR modes are "too dark", so perhaps that's part of why they ruined their game's renderer.
A lot of people have cheap panels that claim HDR support (read: can accept an HDR signal) but have garbage color space coverage, no local dimming, etc., and to them HDR ends up looking muted.
A lot of this is poor QA. When you start to do clever things like HDR, you have to test on a bunch of properly calibrated devices from different vendors, etc. And if you're targeting Windows, you have to accept that HDR is a mess for consumers: even if their display supports it and their GPU supports it, they might still have their drivers and color profiles misconfigured (and many apps do it wrong or weirdly, even when they say they support it).
Also, (mostly) on Windows or with video for your TV: a lot of cheap displays that say they're HDR are varying degrees of hot garbage.
There's a pretty good video on YouTube (more than one, actually) that explains how careless use of HDR in modern cinema is destroying the look and feel of cinema we used to like.
Everything is flattened, contrast is eliminated, lights that should be "burned white" for a cinematic feel are brought back to "reasonable" brightness with HDR, and really deep blacks are turned into flat greys. The end result is the flat, washed-out look of movies like Wicked. It's often correlated with CGI-heavy movies, but in reality it's starting to affect every movie.
The washed out grey thing was an error that became a style!
Because HDR wasn’t natively supported on most displays and software, for a long time it was just “hacked in there” by squashing the larger dynamic range into a smaller one using a mathematical transform, usually a log function. When viewed without the inverse transform this looks horribly grey and unsaturated.
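A minimal sketch of that effect, assuming a generic log2-based curve (not any real camera vendor's log format): the encode bunches a wide linear range toward the middle of [0, 1], and viewing it without the matching decode is exactly that grey, low-contrast look:

    import numpy as np

    # Generic log2-style encoding, assumed for illustration only.
    # Scene-linear values are normalized so 0.18 is mid grey and 1.0 is diffuse white.
    def log_encode(linear, stops_down=6.0, stops_up=6.0, mid_grey=0.18):
        """Squash a wide linear range into [0, 1] along a log curve."""
        stops = np.log2(np.maximum(linear, 1e-6) / mid_grey)
        return np.clip((stops + stops_down) / (stops_down + stops_up), 0.0, 1.0)

    def log_decode(encoded, stops_down=6.0, stops_up=6.0, mid_grey=0.18):
        """The inverse transform that a proper viewing pipeline would apply."""
        return mid_grey * 2.0 ** (encoded * (stops_down + stops_up) - stops_down)

    linear = np.array([0.02, 0.18, 1.0, 4.0])   # deep shadow, mid grey, white, bright highlight
    encoded = log_encode(linear)
    print(encoded)              # ~[0.24, 0.50, 0.71, 0.87]: everything bunched up, grey and flat
    print(log_decode(encoded))  # ~[0.02, 0.18, 1.0, 4.0]: the inverse restores the contrast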
Directors and editors would see this aesthetic day in, day out, with the final color grade applied only after a long review process.
Some of them got used to it and even came to like it, and now here we are: horribly washed-out movies made to look like that on purpose.
The transition from Avengers to the later movies is very noticeable, and it's one of the worst offenders, since the source material really speaks against the choice.
Like you said, it's definitely become a style. But also, a lot of these movies that look like ass on Joe Public's OOGLAMG $130 85" Black Friday TV in his brightly lit living room actually look awesome if your entire setup is proper, real HDR hardware and software, your screen is OLED or has proper local dimming and is calibrated to within an inch of its life, and you view them in a dark home theater.
True! But I think movies adapted for TV need to be made for the average "good quality" screen, not the state-of-the-art setup that almost nobody owns. At the very least, they should look decent on a good (but not top-notch) setup.
Also, the YouTube video I'm thinking of singles out Wicked as seen in movie theaters: the image "as intended" looks washed out and lacking in contrast.
I've found this to be especially a problem with those AI systems that try to add HDR to existing images/videos. The worst instance I've seen was in one of the recent SpongeBob platformer games: his eyes glowed like giant suns on the menu screen. I have a TV capable of a fairly high maximum brightness, and it dimmed the rest of the image just to make sure SpongeBob's eyes lit up my living room like it was midday.
It feels like, to some photographers/cinematographers/game designers, HDR is a gimmick to make something look more splashy and eye-catching. The article touches on this a bit with some of the 2000s HDR examples in photography. With the rise of HDR TVs, it feels like that trend is just happening again.