Maybe my response was part of the broader HDR symptom—that the acronym is overloaded with different meanings depending on where you're coming from.
On the HN frontpage, people are likely thinking of one of at least three things:
HDR as display tech (hardware)
HDR as wide gamut data format (content)
HDR as tone mapping (processing)
...
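To make that third meaning concrete, here's a minimal sketch of a global Reinhard-style tone mapping curve (my own illustrative example, not anything from the post): it compresses unbounded scene luminance into a displayable [0, 1) range.

```python
def reinhard(luminance: float) -> float:
    """Map a non-negative HDR luminance value into [0, 1).

    Global Reinhard operator: L / (1 + L). Dark values pass
    through almost unchanged; very bright values compress hard.
    """
    return luminance / (1.0 + luminance)

# Mid-range input stays mid-range; a 100x highlight still fits on screen.
print(reinhard(0.5))    # ~0.33
print(reinhard(100.0))  # ~0.99, just under 1.0
```

Real tone mappers add local contrast, color handling, and exposure control on top, but the core idea is this kind of compressive curve.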
So when the first paragraph says "we finally explain what HDR actually means," it set me off on the wrong foot—it comes across pretty strongly for a term that's notoriously context-dependent, especially in a blog post that reads like a general explainer rather than a direct Q&A response for readers who aren't coming through your app's channels.
Then the follow-up, "The first HDR is the 'HDR mode' introduced to the iPhone camera in 2010," is what caused me to write the comment.
For people over 35 with even the faintest interest in photography, the first exposure to the HDR acronym probably didn't arrive with the iPhone in 2010; HDR was synonymous with Photomatix-style tone mapping starting around 2005, as the post even mentions later. The ambiguity of the term is a given now. I think it's futile to insist on or police one meaning over the other in non-scientific, informal communication; just use more specific terminology.
So the correlation between what HDR means to people, or what sentiment it evokes, and their age group and self-assessed photography skill might be something worthwhile to explore.
The post gets a lot better after that. That said, I really did enjoy the depth: the dive into classic dodge and burn, and the linked YouTube piece. One explainer at a time makes sense, and tone mapping is a good place to start. Even tone mapping is fine in moderation :)
I took the post about the same way and thought it excellent because of its depth.
Often we don't get that, and given this topic plus my relative ignorance of it, I welcomed the post as written.
Just out of curiosity, since your profile suggests you're from an older cohort: do you actively remember the Photomatix tone mapping era? Were you already old enough to see it as a passing fad, or was it a more niche thing than I remember?
Now I even remember that the 2005 HDR HL2 Lost Coast demo was a thing 20 years ago: https://bit-tech.net/previews/gaming/pc/hl2_hdr_overview/1/
I was old enough to see it as the passing fad it was.
Niche, style points first kind of thing for sure.
Meta: old enough that getting either an unintended new color, or an additional one visible on screen, while the machine remained able to perform, was a big deal.
I missed the MDA/EGA/CGA/Hercules era and jumped right into glorious VGA. Only the start options of some DOS games in the mid-90s informed me about that drama; otherwise I had no idea what any of it meant.
It is a fun era NOW. I love the pre-VGA PC and earlier systems, like the Apple II and Atari. I'm building a CGA system with a K2 CPU to go hacking on, to see what was possible. I have, as do many, unfinished business :)
Back then, it was fun at times, but was also limiting in ways sometimes hard to fathom.
Things are crazy good now, BTW. Almost anything is a few clicks away. The CRT is old, and panels are so damn good...
> "The first HDR is the "HDR mode" introduced to the iPhone camera in 2010."
Yeah, I had a full halt-and-process exception on that line too. I guess all the research, technical papers, and standards development work done by SMPTE, Kodak, et al. in the 1990s and early 2000s just didn't happen? Turns out Apple invented it all in 2010 (pack up those Oscars and Emmys awarded for technical achievement and send 'em back, boys!)
This post is written for people who have heard "HDR" and feel confused. That introduction lists two types of HDR people might think about. "The first" means "the first of two types we're going to explain," not "the first research in the chronological history of HDR."