Correct. I didn’t say that sentence was the source of the conflation; I said it was the source of the Ansel Adams problem. There are other parts that mix together capture, formats, and display.
Edit: and btw, I am objecting to calling film capture “HDR”; I don’t think that helps define HDR, nor does it accurately reflect the history of the term.
That’s a strange claim because the first digital HDR capture devices were film scanners (for example the Cineon equipment used by the motion picture industry in the 1990s).
Film provided a higher dynamic range than digital sensors, and professionals wanted to capture that for image editing.
Sure, it wasn’t terribly deep HDR by today’s standards. Cineon used 10 bits per channel with the white point at code value 685 (and a log color space). That’s still a lot more range and superwhite latitude than you got with standard 8-bpc YUV video.
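To make the superwhite latitude concrete, here’s a rough sketch of the commonly published Cineon log-to-linear conversion (black at code 95, white at code 685, 0.002 printing density per code value, 0.6 negative gamma). Treat it as an illustration of the idea rather than the one true transform; the headroom you end up with depends on which gamma/softclip interpretation you apply, which is part of why people disagree about how “HDR” Cineon really was.

```python
# Sketch of the commonly published Cineon 10-bit log -> linear mapping.
# Assumed parameters: black = code 95, white = code 685, 0.002 printing
# density per code value, 0.6 negative gamma. Other interpretations
# (different gamma, shoulder softclip) give different headroom numbers.

def cineon_to_linear(code, black=95, white=685, density_per_cv=0.002, neg_gamma=0.6):
    """Map a 10-bit Cineon code value to scene-linear, with reference white at 1.0."""
    gain = 10.0 ** ((code - white) * density_per_cv / neg_gamma)
    offset = 10.0 ** ((black - white) * density_per_cv / neg_gamma)
    return (gain - offset) / (1.0 - offset)

for cv in (95, 470, 685, 800, 1023):
    print(cv, round(cineon_to_linear(cv), 3))
# Code 95 maps to 0.0, code 685 to 1.0, and code 1023 lands around 13.5x
# reference white under this interpretation -- everything above 685 is
# superwhite headroom.
```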
They didn’t call that “HDR” at the time, and it wasn’t based on the idea of recording radiance or other absolute physical units.
I’m certain physicists had high range digital cameras before Cineon, and they were working in absolute physical metrics. That would be a stronger example.
You bring up an important point that is completely lost in the HDR discussion: this is about color resolution at least as much as it’s about range, if not more so. I can use 10 bits for a [0..1] range just as easily as I can use 4 bits to represent quantized values from 0 to 10^9. Talking only about the range of the captured scene leaves out most of the story, and all of the important parts. We’ve had outdoor photography, high quality films, and the ability to control exposure for a long time, and that doesn’t explain what “HDR” is.
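To put that bits-versus-range point in code form, a trivial sketch: bit depth only sets how many quantization steps you get; the range those steps span is a separate, independent choice.

```python
# Bit depth sets the number of quantization steps; the range they cover is a
# separate choice. Both encodings below are equally "legal": one is
# fine-grained over [0, 1], the other is coarse over [0, 1e9].

def quantize(x, bits, lo, hi):
    """Map x in [lo, hi] to an integer code with 2**bits levels."""
    levels = (1 << bits) - 1
    return round((x - lo) / (hi - lo) * levels)

def dequantize(code, bits, lo, hi):
    levels = (1 << bits) - 1
    return lo + code / levels * (hi - lo)

# 10 bits over [0, 1]: 1024 levels, steps ~0.001 wide.
print(dequantize(quantize(0.5, 10, 0.0, 1.0), 10, 0.0, 1.0))

# 4 bits over [0, 1e9]: only 16 levels, steps ~67 million wide.
print(dequantize(quantize(0.5e9, 4, 0.0, 1e9), 4, 0.0, 1e9))
```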
It was called "extended dynamic range" by ILM when they published the OpenEXR spec (2003):
> OpenEXR (www.openexr.net), its previously proprietary extended dynamic range image file format, to the open source community
https://web.archive.org/web/20170721234341/http://www.openex...
And "larger dynamic range" by Rea & Jeffrey (1990):
> With γ = 1 there is equal brightness resolution over the entire unsaturated image at the expense of a larger dynamic range within a given image. Finally, the automatic gain control, AGC, was disabled so that the input/output relation would be constant over the full range of scene luminances.
https://doi.org/10.1080/00994480.1990.10747942
I'm not sure when everyone settled on "high" rather than "large" or "extended", but certainly 'adjective dynamic range' is near-universal.
As I remember it, Paul Debevec had borrowed Greg Ward’s RGBE file format at some point in the late 90s and rebranded it “.hdr” for his image viewer tool (hdrView) and his code to convert a stack of LDR exposures into HDR. I can see presentations online from Greg Ward in 2001 that have slides with “HDR” and “HDRI” all over the place. So yeah, the term definitely must have started in the late 90s if not earlier. I’m not sure it was around in the early 90s, though.
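For anyone who hasn’t poked at it: Ward’s RGBE trick is 8-bit mantissas for R, G, B plus a shared 8-bit exponent, which is what lets a 32-bit pixel cover a huge range. A rough sketch of the idea (simplified; the real Radiance code also does run-length encoding and centers the mantissas on decode):

```python
import math

def float_to_rgbe(r, g, b):
    """Pack linear RGB into Ward-style RGBE: 8-bit mantissas plus a shared exponent."""
    v = max(r, g, b)
    if v < 1e-32:
        return (0, 0, 0, 0)
    m, e = math.frexp(v)          # v = m * 2**e, with m in [0.5, 1)
    scale = m * 256.0 / v         # == 256 / 2**e
    return (int(r * scale), int(g * scale), int(b * scale), e + 128)

def rgbe_to_float(rm, gm, bm, e):
    """Unpack RGBE back to linear RGB (without the +0.5 mantissa centering Ward uses)."""
    if e == 0:
        return (0.0, 0.0, 0.0)
    f = math.ldexp(1.0, e - (128 + 8))
    return (rm * f, gm * f, bm * f)

print(rgbe_to_float(*float_to_rgbe(100.0, 500.0, 1000.0)))
# -> (100.0, 500.0, 1000.0): values far above 1.0 survive in 32 bits per pixel.
```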
Oo, interesting! That led me to this pair of sentences:
"Making global illumination user-friendly" (Ward, 1995) https://radsite.lbl.gov/radiance/papers/erw95.1/paper.html
> Variability is a qualitative setting that indicates how much light levels vary in the zone, i.e. the dynamic range of light landing on surfaces.
> By the nature of the situation being modeled, the user knows whether to expect a high degree of variability in the lighting or a low one.
Given those two phrases, 'a high or low degree of variability in the lighting' translates to 'a high or low dynamic range', or would be likely to, given human abbreviation tendencies, in successive works and conversations.
It certainly was called HDR when those Cineon files were processed in a linear light workflow. And film was the only capture source available that could provide sufficient dynamic range, so IMO that makes it “HDR”.
But I agree that the term is such a wide umbrella that almost anything qualifies. Fifteen years ago you could add a bit of superwhite glow and tone mapping to 8-bpc footage and people called that look HDR.
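For what it’s worth, most of that “look” was just a global tone-mapping curve plus a glow pass. Even the classic Reinhard L/(1+L) curve reads as “HDR” to a lot of people; a minimal sketch of that generic operator (not any specific tool’s pipeline):

```python
def reinhard_tonemap(luminance):
    """Classic global Reinhard operator: compresses [0, inf) into [0, 1)."""
    return luminance / (1.0 + luminance)

# Superwhite scene values get squeezed under 1.0 instead of clipping:
for L in (0.25, 1.0, 4.0, 16.0):
    print(L, round(reinhard_tonemap(L), 3))
# 0.25 -> 0.2, 1.0 -> 0.5, 4.0 -> 0.8, 16.0 -> 0.941
```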
Do you have any links from 1990ish that show use of “HDR”? I am interested in when “HDR” became a phrase people used. I believe I remember hearing it first around 1996 or 97, but it may have started earlier. It was certainly common by 2001. I don’t see it used as a term or an acronym in the Cineon docs from 1995, but they do talk about log and linear spaces and about limiting the dynamic range when converting. The Cineon scanner predates sRGB, and used gamma 1.7. https://dotcsw.com/doc/cineon1.pdf
This 10-bit scanner gave you headroom of something like 30% above white. So yeah, it qualifies as a type of high dynamic range when compared to 8-bit-per-channel RGB, but on the other hand, a range of [0 .. 1.3] isn’t exactly in the spirit of what “HDR” stands for. The term implicitly means a lot more than 1.0, not just a little. And again, the people developing HDR like Greg Ward and Paul Debevec were arguing for absolute units such as luminance, which the Cineon scanner does not provide.