munificent 3 days ago

> The claim that Ansel Adams used HDR is super likely to cause confusion

That isn't what the article claims. It says:

"Ansel Adams, one of the most revered photographers of the 20th century, was a master at capturing dramatic, high dynamic range scenes."

"Use HDR" (your term) is vague to the point of not meaning much of anything, but the article is clear that Adams was capturing scenes that had a high dynamic range, which is objectively true.

PaulHoule 2 days ago

I think about the Ansel Adams zone system

https://www.kimhildebrand.com/how-to-use-the-zone-system/

where my interpretation is colored by the experience of making high quality prints and viewing them under different conditions, particularly poor illumination quality, though you could also count "small handheld game console" or "halftone screened and printed on newsprint" as other degraded conditions. In those cases you might imagine that the eye can only differentiate between 11 tones, so even if an image has finer detail it ought to connect well with people if its colors were quantized. (I think about concept art from Pokémon Sun and Moon, which looked great printed with a thermal printer because it was designed to look great on a cheap screen.)

In my mind, the ideal image would look good quantized to 11 zones but would also have interesting textural detail in 9 of the zones (extreme white and black don't show texture). That's a bit of an oversimplification (maybe a shot outdoors in the snow is going to trend really bright, maybe for artistic reasons you want things to be really dark, ...) but Ansel Adams manually "tone mapped" his images using dodging, burning, and similar techniques to make it so.
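The quantization idea can be sketched in a few lines of Python. This is a crude, hypothetical posterization into 11 evenly spaced tones, not Adams's actual procedure (real zone placement happens at exposure and development time, not in post):

```python
def quantize_to_zones(luminance, zones=11):
    """Posterize [0, 1] luminance values into a fixed number of tones.

    A rough stand-in for the zone system's 11 zones (0 through X).
    """
    out = []
    for v in luminance:
        v = min(max(v, 0.0), 1.0)
        # Snap to the nearest of `zones` evenly spaced tones.
        out.append(round(v * (zones - 1)) / (zones - 1))
    return out

print(quantize_to_zones([0.03, 0.48, 0.97]))  # -> [0.0, 0.5, 1.0]
```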

dahart 3 days ago

Literally the sentence preceding the one you quoted is “What if I told you that analog photographers captured HDR as far back as 1857?”.

zymhan 3 days ago

And that quote specifically does not "lump HDR capture, HDR formats and HDR display together".

It is directly addressing capture.

dahart 3 days ago

Correct. I didn’t say that sentence was the source of the conflation, I said it was the source of the Ansel Adams problem. There are other parts that mix together capture, formats, and display.

Edit: and btw I am objecting to calling film capture “HDR”, I don’t think that helps define HDR nor reflects accurately on the history of the term.

pavlov 2 days ago

That’s a strange claim because the first digital HDR capture devices were film scanners (for example the Cineon equipment used by the motion picture industry in the 1990s).

Film provided a higher dynamic range than digital sensors, and professionals wanted to capture that for image editing.

Sure, it wasn’t terribly deep HDR by today’s standards. Cineon used 10 bits per channel with the white point at coding value 685 (and a log color space). That’s still a lot more range and superwhite latitude than you got with standard 8-bpc YUV video.
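For a sense of the numbers, here's a rough sketch of the commonly used Cineon log-to-linear conversion (reference white at code 685, reference black at 95, 0.002 density per code, 0.6 negative gamma; the exact constants and soft-clip handling vary by implementation):

```python
def cineon_to_linear(code, ref_white=685, ref_black=95,
                     density_per_code=0.002, neg_gamma=0.6):
    """Convert a 10-bit Cineon code value to scene-linear light.

    Reference white (code 685) decodes to 1.0; codes above it land in
    the superwhite range. Constants follow the common Kodak convention.
    """
    # Offset so that reference black decodes to 0.0 and white to 1.0.
    black = 10 ** ((ref_black - ref_white) * density_per_code / neg_gamma)
    value = 10 ** ((code - ref_white) * density_per_code / neg_gamma)
    return (value - black) / (1.0 - black)

print(cineon_to_linear(95))    # reference black -> 0.0
print(cineon_to_linear(685))   # reference white -> 1.0
print(cineon_to_linear(1023))  # top code, well above reference white
```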

dahart 2 days ago

They didn’t call that “HDR” at the time, and it wasn’t based on the idea of recording radiance or other absolute physical units.

I’m certain physicists had high range digital cameras before Cineon, and they were working in absolute physical metrics. That would be a stronger example.

You bring up an important point that is completely lost in the HDR discussion: this is about color resolution at least as much as it’s about range, if not more so. I can use 10 bits for a [0..1] range just as easily as I can use 4 bits to represent quantized values from 0 to 10^9. Talking about the range of a scene captured leaves out most of the story, and all of the important parts. We’ve had outdoor photography, high quality films, and the ability to control exposure for a long time, and that doesn’t explain what “HDR” is.
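The independence of bit depth and range is easy to make concrete with a toy uniform quantizer (hypothetical numbers, purely for illustration):

```python
def quantize(value, lo, hi, bits):
    """Quantize `value` in [lo, hi] to an integer code with `bits` bits."""
    steps = (1 << bits) - 1
    code = round((value - lo) / (hi - lo) * steps)
    return min(max(code, 0), steps)

# 10 bits spent on a [0, 1] range: small range, fine resolution.
print(quantize(0.5, 0.0, 1.0, bits=10))   # -> 512 of 1024 levels

# 4 bits spent on a [0, 1e9] range: huge range, coarse resolution.
print(quantize(5e8, 0.0, 1e9, bits=4))    # -> 8 of only 16 levels
```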

altairprime 2 days ago

It was called "extended dynamic range" by ILM when they published the OpenEXR spec (2003):

> OpenEXR (www.openexr.net), its previously proprietary extended dynamic range image file format, to the open source community

https://web.archive.org/web/20170721234341/http://www.openex...

And "larger dynamic range" by Rea & Jeffrey (1990):

> With γ = 1 there is equal brightness resolution over the entire unsaturated image at the expense of a larger dynamic range within a given image. Finally, the automatic gain control, AGC, was disabled so that the input/output relation would be constant over the full range of scene luminances.

https://doi.org/10.1080/00994480.1990.10747942

I'm not sure when everyone settled on "high" rather than "large" or "extended", but certainly 'adjective dynamic range' is near-universal.

dahart 2 days ago

As I remember it, Paul Debevec had borrowed Greg Ward’s RGBE file format at some point in the late 90s and rebranded it “.hdr” for his image viewer tool (hdrView) and his code to convert a stack of LDR exposures into HDR. I can see presentations online from Greg Ward in 2001 that have slides with “HDR” and “HDRI” all over the place. So yeah, the term definitely must have started in the late 90s if not earlier. I’m not sure it was there in the early 90s, though.
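For reference, Ward's RGBE encoding (the format behind “.hdr”) packs three 8-bit mantissas with a shared 8-bit exponent. A minimal sketch in the spirit of the widely circulated encode/decode routines (details like per-component rounding are simplified here):

```python
import math

def float_to_rgbe(r, g, b):
    """Encode linear RGB as RGBE: three 8-bit mantissas + shared exponent."""
    v = max(r, g, b)
    if v < 1e-32:
        return (0, 0, 0, 0)
    mantissa, exponent = math.frexp(v)        # v = mantissa * 2**exponent
    scale = mantissa * 256.0 / v
    return (int(r * scale), int(g * scale), int(b * scale), exponent + 128)

def rgbe_to_float(r, g, b, e):
    """Decode RGBE back to (approximate) linear RGB."""
    if e == 0:
        return (0.0, 0.0, 0.0)
    f = math.ldexp(1.0, e - (128 + 8))
    return (r * f, g * f, b * f)

print(float_to_rgbe(1.0, 0.5, 0.25))   # -> (128, 64, 32, 129)
```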

altairprime 2 days ago

Oo, interesting! That led me to this pair of sentences:

"Making global illumination user-friendly" (Ward, 1995) https://radsite.lbl.gov/radiance/papers/erw95.1/paper.html

> Variability is a qualitative setting that indicates how much light levels vary in the zone, i.e. the dynamic range of light landing on surfaces.

> By the nature of the situation being modeled, the user knows whether to expect a high degree of variability in the lighting or a low one.

Given those two phrases, 'a high or low degree of variability in the lighting' translates as 'a high or low degree of dynamic range' — or would be likely to, given human abbreviation tendencies, in successive works and conversations.

pavlov 2 days ago

It certainly was called HDR when those Cineon files were processed in a linear light workflow. And film was the only capture source available that could provide sufficient dynamic range, so IMO that makes it “HDR”.

But I agree that the term is such a wide umbrella that almost anything qualifies. Fifteen years ago you could do a bit of superwhite glows and tone mapping on 8-bpc and people called that look HDR.

dahart 2 days ago

Do you have any links from 1990ish that show use of “HDR”? I am interested in when “HDR” became a phrase people used. I believe I remember hearing it first around 1996 or 97, but it may have started earlier. It was certainly common by 2001. I don’t see that used as a term nor an acronym in the Cineon docs from 1995, but it does talk about log and linear spaces and limiting the dynamic range when converting. The Cineon scanner predates sRGB, and used gamma 1.7. https://dotcsw.com/doc/cineon1.pdf

This 10 bit scanner gave you headroom of like 30% above white. So yeah it qualifies as a type of high dynamic range when compared to 8 bit/channel RGB, but on the other hand, a range of [0 .. 1.3] isn’t exactly in the spirit of what “HDR” stands for. The term implicitly means a lot more than 1.0, not just a little. And again people developing HDR like Greg Ward and Paul Debevec were arguing for absolute units such as luminance, which the Cineon scanner does not do.

munificent 2 days ago

Yes, Ansel Adams was using a camera to capture a scene that had high dynamic range.

I don't see the confusion here.

dahart 2 days ago

HDR is not referring to the scene’s range, and it doesn’t apply to film. It’s referring superficially but specifically to a digital process that improves on 8 bits/channel RGB images. And one of the original intents behind HDR was to capture pixels in absolute physical measurements like radiance, to enable a variety of post-processing workflows that are not available to film.

munificent 2 days ago

"High dynamic range" is a phrase that is much older than tone mapping. I see uses of "dynamic range" going back to the 1920s and "high dynamic range" to the 1940s:

https://books.google.com/ngrams/graph?content=dynamic+range%...

You might argue that "HDR" the abbreviation refers to using tone mapping to approximate rendering high dynamic range imagery on lower dynamic range displays. But even then, the sentence in question doesn't use the abbreviation. It is specifically talking about a dynamic range that is high.

Dynamic range is a property of any signal or quantifiable input, including, say sound pressure hitting our ears or photons hitting an eyeball, film, or sensor.

dahart 2 days ago

> But even then, the sentence in question doesn’t use the abbreviation

Yes it does. Why are you still looking at a different sentence than the one I quoted??

HDR in this context isn’t referring to just any dynamic range. If it was, then it would be so vague as to be meaningless.

Tone mapping is closely related to HDR and very often used, but it is not necessary and does not define HDR. To me it seems like your argument is a straw man. Photographers never broadly used the term “high dynamic range” as a phrase, nor the acronym “HDR”, before it showed up in computer apps like hdrView, Photoshop, and the iPhone camera.

munificent 2 days ago

Oh, sorry, you're right. Mentioning the abbreviation is a red herring. The full quote is:

"But what if we don't need that tradeoff? What if I told you that analog photographers captured HDR as far back as 1857? Ansel Adams, one of the most revered photographers of the 20th century, was a master at capturing dramatic, high dynamic range scenes. It's even more incredible that this was done on paper, which has even less dynamic range than computer screens!"

It seems pretty clear to me that in this context the author is referring to the high dynamic range of the scenes that Adams pointed his camera at. That's why he says "captured HDR" and "high dynamic range scenes".

dahart 2 days ago

> It seems pretty clear to me that in this context the author is referring to the high dynamic range of the scenes that Adams pointed his camera at.

Yes, this is the problem I have with the article. “HDR” is not characterized solely by the range of the scene, and never was. It’s a term of art that refers to an increased range (and resolution) on the capture and storage side, and to a workflow that involves/enables deferring exposure until display time. The author’s claim here makes the term “HDR” harder to understand, not easier, and it leaves out some of the most important conceptual aspects. There are some important parallels between film and digital HDR, and there are some important differences. The differences are what make claiming that nineteenth-century photographers were capturing HDR problematic and inaccurate.

samplatt 2 days ago

To further complicate the issue, "high dynamic range" is a phrase that will come up across a few different disciplines, not just related to the capture & reproduction of visual data.

altairprime 2 days ago

The digital process of tonemapping, aka. 'what Apple calls Smart HDR processing of SDR photos to increase perceptual dynamic range', can be applied to images of any number of channels of any bit depth — though, if you want to tonemap a HyperCard dithered black-and-white image, you'll probably have to decompile the dithering as part of creating the gradient map. Neither RGB nor 8-bit are necessary to make tonemapping a valuable step in image processing.

dahart 2 days ago

That’s true, and it’s why tonemapping is distinct from HDR. If you follow the link from @xeonmc’s comment and read the comments, the discussion centers on the conflation of tonemapping and HDR.

https://news.ycombinator.com/item?id=43987923

That said, the entire reason that tonemapping is a thing, and the primary focus of the tonemapping literature, is to solve the problem of squeezing images with very wide ranges into narrow display ranges like print and non-HDR displays, and to achieve a natural look that mirrors human perception of wide ranges. Tonemapping might be technically independent of HDR, but they did co-evolve, and that’s part of the history.
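As a concrete example of that squeeze, the simplest global operator in the tone mapping literature, Reinhard's L/(1+L), compresses an unbounded scene range into [0, 1): shadows pass through nearly unchanged while highlights roll off softly.

```python
def reinhard(luminance):
    """Reinhard's simple global tone mapping operator.

    Maps scene luminance in [0, inf) into display range [0, 1).
    """
    return luminance / (1.0 + luminance)

for L in (0.1, 1.0, 10.0, 1000.0):
    print(L, "->", round(reinhard(L), 4))
# 0.1 -> 0.0909, 1.0 -> 0.5, 10.0 -> 0.9091, 1000.0 -> 0.999
```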