dahart 3 days ago

It seems like a mistake to lump HDR capture, HDR formats and HDR display together, these are very different things. The claim that Ansel Adams used HDR is super likely to cause confusion, and isn’t particularly accurate.

We’ve had HDR formats and HDR capture and edit workflows since long before HDR displays. The big benefit of HDR capture & formats is that your “negative” doesn’t clip super bright colors and doesn’t lose color resolution in super dark colors. As a photographer, with HDR you can re-expose the image when you display/print it, where previously that wasn’t possible. Previously when you took a photo, if you over-exposed it or under-exposed it, you were stuck with what you got. Capturing HDR gives the photographer one degree of extra freedom, allowing them to adjust exposure after the fact. Ansel Adams wasn’t using HDR in the same sense we’re talking about; he was just really good at capturing the right exposure for his medium without needing to adjust it later. There is a very valid argument to be made for doing the work up-front to capture what you’re after, but ignoring that for a moment, it is simply not possible to re-expose Adams’ negatives to reveal color detail he didn’t capture. That’s why he’s not using HDR, and why saying he is will only further muddy the water.

arghwhat 2 days ago

Arguably, even considering HDR a distinct thing is itself weird and inaccurate.

All media have a range, and they've never all matched. Sometimes we've tried to calibrate things to match, but anyone watching SDR content for the past many years probably didn't do so on a color-calibrated and brightness-calibrated screen - that wouldn't allow you to have a brightness slider.

HDR on monitors is about communicating content brightness and monitor capabilities, but then you have the question of whether to clip the highlights or remap the whole range when the content is mastered for 4000 nits but your monitor only manages 1000-1500, and only in a small window.
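
To make that tradeoff concrete, here's a rough sketch of the two options (illustrative numbers only, not any particular standard's transfer curve):

    # Two naive ways to show 4000-nit mastered content on a ~1000-nit panel.
    MASTER_PEAK = 4000.0

    def clip(nits, display_peak=1000.0):
        # Hard clip: everything above the panel's peak is simply lost.
        return min(nits, display_peak)

    def rolloff(nits, display_peak=1000.0, knee=600.0):
        # Soft shoulder: pass values below the knee through, compress the rest
        # so the mastered peak just lands on the display peak.
        if nits <= knee:
            return nits
        t = (nits - knee) / (MASTER_PEAK - knee)
        return knee + (display_peak - knee) * t

    for v in [100, 800, 1500, 4000]:
        print(v, clip(v), round(rolloff(v), 1))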

dahart 2 days ago

This! Yes I think you’re absolutely right. The term “HDR” is in part kind of an artifact of how digital image formats evolved, and it kind of only makes sense relative to a time when the most popular image formats and most common displays were not very sophisticated about colors.

That said, there is one important part that is often lost. One of the ideas behind HDR, sometimes, is to capture absolute values in physical units, rather than relative brightness. This is the distinguishing factor that film and paper and TVs don’t have. Some new displays are getting absolute brightness features, but historically most media display relative color values.

arghwhat 2 days ago

Absolute is also a funny one. From the perspective of human visual perception, an absolute brightness only matters if the entire viewing environment is also controlled to the same absolute values. Visual perception is highly contextual, and we are not only seeing the screen.

It's not fun being unable to watch dark scenes during the day or evening in a living room, nor is it fun to have your retinas vaporized because the ambient environment went dark in the meantime. People want a good viewing experience in the available environment, one that is logically similar to what the content intended, but that is not always the same as reproducing the exact same photons the director's mastering monitor sent towards their eyeballs at the time of production.

tshaddox 2 days ago

Indeed. For a movie scene depicting the sky including the Sun, you probably wouldn't want your TV to achieve the same brightness as the Sun. You might want your TV to become significantly brighter than it is for the rest of the scenes, to achieve an effect something like the Sun catching your eye.

Of course, the same thing goes for audio in movies. You probably want a gunshot or explosion to sound loud and even be slightly shocking, but you probably don't want it to be as loud as a real gunshot or explosion would be from the depicted distance.

The difference is that for 3+ decades the dynamic range of ubiquitous audio formats (like 16 bit PCM in audio CDs and DVDs) has provided far more dynamic range than is comfortably usable in normal listening environments. So we're very familiar with audio being mastered with a much smaller dynamic range than the medium supports.
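
Rough numbers, using the usual ~6 dB per bit rule of thumb (the living room figures are just ballpark guesses):

    import math

    bits = 16
    theoretical_range_db = 20 * math.log10(2 ** bits)  # ~96 dB for 16-bit PCM
    print(round(theoretical_range_db, 1))  # 96.3

    # A quiet living room noise floor (~35 dB SPL) and comfortable peaks
    # (~95 dB SPL) leave maybe 60 dB of actually usable range, so content
    # gets mastered well inside what the format can represent.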

dahart 2 days ago

Yep, absolutely! ;)

This brings up a bunch of good points, and it tracks with what I was trying to say about conflating HDR processing with HDR display. But do keep in mind that even when you have absolute value images, that doesn’t imply anything about how you display them. You can experience large benefits with an HDR workflow, even when your output or display is low dynamic range. Assume that there will be some tone mapping process happening and that the way you map tones depends on the display medium and its capabilities, and on the context and environment of the display. Using the term “HDR” shouldn’t imply any mismatch or disconnect in the viewing environment. It only did so in the article because it wasn’t very careful about its terms and definitions.

tshaddox 2 days ago

The term "HDR" arguably makes more sense for the effect achieved by tone mapping multiple exposures of the same subject onto a "normal" (e.g. SRGB) display. In this case, the "high" in "HDR" just means "from a source with higher dynamic range than the display."

BlueTemplar 2 days ago

Remember "wide gamut" screens ?

This is part of 'HDR' standards too...

And it's quite annoying that 'HDR' (and which specific one ?) is treated as just being 'on' or 'off' even for power users...

theshackleford 2 days ago

> but your monitor manages 1000-1500 and only in a small window.

Owning a display that can do 1300+ nits sustained across a 100% window has been the biggest display upgrade I think I have ever had. It's given me a tolerance for LCD, a technology I've hated since the death of CRTs, and it has turned me away from OLED.

There was a time I would have said I'd never own a non-OLED display again. But a capable HDR display changed that logic in a big way.

Too bad the motion resolution on it, especially compared to OLED, is meh. Again, at one point motion was the most important aspect to me (it's why I still own CRTs), but this level of HDR is... transformative, for lack of a better word.

arghwhat 2 days ago

Motion resolution? Do you mean the pixel response time?

CRTs technically have quite a few artifacts in this area, but as content displayed on CRTs tends to be built for CRTs, this is less of an issue, and in many cases it's even required. The input is expecting specific distortions and effects from scanlines and phosphor, which a "perfect" display wouldn't exhibit...

The aggressive OLED ABL is simply a thermal issue. It can be mitigated with thermal design in smaller devices, and anything that increases efficiency (be it micro lens arrays, stacked "tandem" panels, quantum dots, alternative emitter technology) will lower the thermal load and increase the max full panel brightness.

(LCD with zone dimming would also be able to pull this trick to get even brighter zones, but because the base brightness is high enough it doesn't bother.)

theshackleford 1 day ago

> Motion resolution? Do you mean the pixel response time?

I indeed meant motion resolution, which pixel response time only partially affects. It’s about how clearly a display shows motion, unlike static resolution, which realistically only describes a still image. Even with fast pixels, sample-and-hold displays blur motion unless framerate and refresh rate are high, or BFI/strobing is used. This blur immediately lowers perceived resolution the moment anything moves on screen.
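
The back-of-the-envelope version (standard persistence math; the speeds are just examples):

    def motion_blur_px(speed_px_per_s, refresh_hz, persistence=1.0):
        # On a sample-and-hold display, perceived smear while eye-tracking is
        # roughly tracking speed multiplied by how long each frame stays lit.
        return speed_px_per_s * (1.0 / refresh_hz) * persistence

    print(motion_blur_px(960, 60))        # ~16 px of smear at 60 Hz, full persistence
    print(motion_blur_px(960, 240))       # ~4 px at 240 Hz
    print(motion_blur_px(960, 60, 0.2))   # ~3.2 px at 60 Hz with 20% persistence (BFI/strobing)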

> The input is expecting specific distortions and effects from scanlines and phosphor, which a "perfect" display wouldn't exhibit...

That's true for many CRT purists, but is not a huge deal for me personally. My focus is motion performance. If LCD/OLED matched CRT motion at the same refresh rate, I’d drop CRT in a heartbeat, slap on a CRT shader, and call it a day. Heresy to many CRT enthusiasts.

Ironically, this is an area in which I feel we are getting CLOSE enough with the new higher-refresh OLEDs for non-HDR retro content in combination with: https://blurbusters.com/crt-simulation-in-a-gpu-shader-looks... (which hopefully will continue to be improved.)

> The aggressive OLED ABL is simply a thermal issue.

Theoretically, yes and there’s been progress, but it’s still unsolved in practice. If someone shipped an OLED twice as thick and full of fans and heatsinks, I’d buy it tomorrow. But that’s not what the market wants, so obviously it's not what they make.

> It can be mitigated with thermal design in smaller devices, and anything that increases efficiency (be it micro lens arrays, stacked "tandem" panels, quantum dots, alternative emitter technology) will lower the thermal load and increase the max full panel brightness.

Sure, in theory. But so far the improvements (like QD-OLED or MLA) haven’t gone far enough. I already own panels using these. Beyond that, much of the tech isn’t in the display types I care about, or isn’t ready yet. Which is a pity, because the tandem-based displays I have seen in use are really decent.

That said, the latest G5 WOLEDs are the first I’d call acceptable for HDR at high APL for my preferences, with very decent real-scene brightness, at least in film. Sadly, I doubt we’ll see comparable performance in PC monitors until many years down the track, and monitors are my preference.

lotrjohn 2 days ago

Hello fellow CRT owner. What is your use case? Retro video games? PC games? Movies?

theshackleford 2 days ago

Hello indeed!

> What is your use case? Retro video games? PC games? Movies?

All of the above! The majority of my interest stems from the fact that, for whatever reason, I am INCREDIBLY sensitive to sample-and-hold motion blur. Whilst I tolerate it for modern gaming because I largely have no choice, CRTs mean I don't have to for my retro gaming, which I very much enjoy. (I was very poor growing up, so most of it for me is not even nostalgia; most of these games are new to me.)

Outside of that, we have a "retro" corner in our home with a 32" Trinitron. I collect laserdisc/VHS, and we have "retro video" nights where, for whatever reason, we watch the worst possible quality copies of movies we could get in significantly higher definition. Much the same as with videogames, I was not exposed to a lot of media growing up, and my wife has also not seen many things because she was in Russia back then, so there is a ton for us to catch up on very slowly, and it just makes for a fun little date night every now and again.

Sadly though, as I get ready to take on a mortgage, it's likely most of my CRTs will be sold, or at least the broadcast monitors. I do not look forward to it haha.

lotrjohn 2 days ago

> Outside of that, we have a "retro" corner in our home with a 32" trinitron.

A 32” Trinny. Nice. I have the 32” JVC D-series which I consider my crown jewel. It’s for retro gaming and I have a laserdisc player but a very limited selection of movies. Analog baby.

> Sadly though, as I get ready to take on a mortgage, it's likely most of my CRT's will be sold

Mortgage = space. You won’t believe the nooks and crannies you can fit CRTs into. Attic. Shed. Crawl space. Space under basement stairs. Heck, even the neighbor's house. I have no fewer than 14 CRTs ferreted away in the house. Wife thinks I have only 5. Get creative. Don’t worry about the elements, these puppies were built to survive nuclear blasts. Do I have a sickness? Probably. But analog!!!

hypercube33 2 days ago

Speaking of laserdisc, it's wild how vivid colors are on that platform. My main example movie is Star Trek: First Contact, and everything is very colorful. DVD is muddy. Even a Blu-ray copy kinda looks like crap. As a total side note, the surround sound for that movie is absolutely awesome, especially the cube battle scene.

theshackleford 1 day ago

> I have the 32” JVC D-series which

I would love one of these, however I have never seen one in my country. Super jealous haha! The tubes they use were apparently American-made, with most of the JVCs that were released in my country using different tubes than those released in the US market.

That being said, I do own two JVC "broadcast" monitors that I love, a 17" and a 19". They're no D-series real "TV", but still.

QuantumGood 3 days ago

Adams adjusted heavily with dodging and burning, even working to invent a new chemical process to provide more control when developing. He was great at determining exposure for his process as well. A key skill was having a vision for what the image would be after adjusting. Adams talked a lot about this as a top priority of his process.

Demiurge 2 days ago

> It's even more incredible that this was done on paper, which has even less dynamic range than computer screens!

I came here to point this out. You have a pretty high dynamic range in the captured medium, and then you can use the tools you have to darken or lighten portions of the photograph when transferring it to paper.

jrapdx3 2 days ago

Indeed so. Printing on paper and other substrates is inherently subtractive in nature which limits the gamut of colors and values that can be reproduced. Digital methods make the job of translating additive to subtractive media easier vs. the analog techniques available to film photographers. In any case, the image quality classic photography was able to achieve is truly remarkable.

Notably, the dodging and burning used by photographers aren't obsolete. There's a reason these tools are included in virtually every image-editing program out there. Manipulating dynamic range, particularly in printed images, remains part of the craft of image-making.

staticautomatic 2 days ago

Contact printing on azo certainly helped!

munificent 2 days ago

> The claim that Ansel Adams used HDR is super likely to cause confusion

That isn't what the article claims. It says:

"Ansel Adams, one of the most revered photographers of the 20th century, was a master at capturing dramatic, high dynamic range scenes."

"Use HDR" (your term) is vague to the point of not meaning much of anything, but the article is clear that Adams was capturing scenes that had a high dynamic range, which is objectively true.

PaulHoule 2 days ago

I think about the Ansel Adams zone system

https://www.kimhildebrand.com/how-to-use-the-zone-system/

where my interpretation is colored by the experience of making high quality prints and viewing them under different conditions, particularly poor illumination quality, though you could also count "small handheld game console" and "halftone screened and printed on newsprint" as other degraded conditions. In those cases you might imagine that the eye can only differentiate between 11 tones, so even if an image has finer detail, it ought to connect well with people if its colors were quantized. (I think about concept art from Pokémon Sun and Moon, which looked great printed with a thermal printer because it was designed to look great on a cheap screen.)

In my mind, the ideal image would look good quantized to 11 zones but also has interesting detail in texture in 9 of the zones (extreme white and black don't show texture). That's a bit of an oversimplification (maybe a shot outdoors in the snow is going to trend really bright, maybe for artistic reasons you want things to be really dark, ...) but Ansel Adams manually "tone mapped" his images using dodging, burning and similar techniques to make it so.
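
A rough digital analogy of that quantization, assuming linear luminance with 18% gray pinned to Zone V (not a faithful reproduction of the zone system, which was defined in terms of film densities):

    import numpy as np

    def to_zones(luminance, mid_gray=0.18, n_zones=11):
        # One stop per zone: count stops above/below mid gray, shift so
        # mid gray lands on Zone V, then clamp to zones 0..10.
        stops = np.log2(np.maximum(luminance, 1e-6) / mid_gray)
        return np.clip(np.round(stops + 5), 0, n_zones - 1).astype(int)

    print(to_zones(np.array([0.01, 0.18, 0.72, 3.0])))  # -> [1 5 7 9]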

dahart 2 days ago

Literally the sentence preceding the one you quoted is “What if I told you that analog photographers captured HDR as far back as 1857?”.

zymhan 2 days ago

And that quote specifically does not "lump HDR capture, HDR formats and HDR display together".

It is directly addressing capture.

dahart 2 days ago

Correct. I didn’t say that sentence was the source of the conflation, I said it was the source of the Ansel Adams problem. There are other parts that mix together capture, formats, and display.

Edit: and btw I am objecting to calling film capture “HDR”, I don’t think that helps define HDR nor reflects accurately on the history of the term.

pavlov 2 days ago

That’s a strange claim because the first digital HDR capture devices were film scanners (for example the Cineon equipment used by the motion picture industry in the 1990s).

Film provided a higher dynamic range than digital sensors, and professionals wanted to capture that for image editing.

Sure, it wasn’t terribly deep HDR by today’s standards. Cineon used 10 bits per channel with the white point at coding value 685 (and a log color space). That’s still a lot more range and superwhite latitude than you got with standard 8-bpc YUV video.
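
From memory, the usual Cineon-style log-to-linear conversion looks roughly like this (exact constants vary between implementations), which is where the superwhite latitude comes from:

    def cineon_to_linear(code, ref_white=685, ref_black=95):
        # 10-bit log encoding: 0.002 printing density per code value over a
        # 0.6 negative gamma, with diffuse white at code 685.
        offset = 10 ** ((ref_black - ref_white) * 0.002 / 0.6)
        gain = 1.0 / (1.0 - offset)
        return gain * (10 ** ((code - ref_white) * 0.002 / 0.6) - offset)

    print(round(cineon_to_linear(685), 3))   # ~1.0  (diffuse white)
    print(round(cineon_to_linear(1023), 1))  # ~13.5 (headroom above white)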

dahart 2 days ago

They didn’t call that “HDR” at the time, and it wasn’t based on the idea of recording radiance or other absolute physical units.

I’m certain physicists had high range digital cameras before Cineon, and they were working in absolute physical metrics. That would be a stronger example.

You bring up an important point that is completely lost in the HDR discussion: this is about color resolution at least as much as it’s about range, if not more so. I can use 10 bits for a [0..1] range just as easily as I can use 4 bits to represent quantized values from 0 to 10^9. Talking about the range of a scene captured is leaving out most of the story, and all of the important parts. We’ve had outdoor photography, high quality films, and the ability to control exposure for a long time, and that doesn’t explain what “HDR” is.
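
A trivial illustration of that point (made-up encodings):

    # Bit depth sets how many steps you get; the encoding decides what range
    # those steps span. The two are independent.
    for bits, lo, hi in [(10, 0.0, 1.0), (4, 0.0, 1e9)]:
        steps = 2 ** bits
        print(f"{bits} bits over [{lo}, {hi}]: {steps} steps of size {(hi - lo) / (steps - 1):.3g}")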

altairprime 2 days ago

It was called "extended dynamic range" by ILM when they published the OpenEXR spec (2003):

> OpenEXR (www.openexr.net), its previously proprietary extended dynamic range image file format, to the open source community

https://web.archive.org/web/20170721234341/http://www.openex...

And "larger dynamic range" by Rea & Jeffrey (1990):

> With γ = 1 there is equal brightness resolution over the entire unsaturated image at the expense of a larger dynamic range within a given image. Finally, the automatic gain control, AGC, was disabled so that the input/output relation would be constant over the full range of scene luminances.

https://doi.org/10.1080/00994480.1990.10747942

I'm not sure when everyone settled on "high" rather than "large" or "extended", but certainly 'adjective dynamic range' is near-universal.

dahart 2 days ago

As I remember it, Paul Debevec had borrowed Greg Ward’s RGBE file format at some point in the late 90s and rebranded it “.hdr” for his image viewer tool (hdrView) and code to convert a stack of LDR exposures into HDR. I can see presentations online from Greg Ward in 2001 that have slides with “HDR” and “HDRI” all over the place. So yeah the term definitely must have started in the late 90s if not earlier. I’m not sure it was there in the early 90s though.
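
The RGBE trick itself is tiny: three 8-bit mantissas sharing one 8-bit exponent, which is what lets a 4-byte pixel cover a huge brightness range. A sketch from memory of Ward's reference code, edge cases omitted:

    import math

    def float_to_rgbe(r, g, b):
        v = max(r, g, b)
        if v < 1e-32:
            return (0, 0, 0, 0)
        m, e = math.frexp(v)              # v == m * 2**e, with 0.5 <= m < 1
        scale = m * 256.0 / v
        return (int(r * scale), int(g * scale), int(b * scale), e + 128)

    def rgbe_to_float(rm, gm, bm, e):
        if e == 0:
            return (0.0, 0.0, 0.0)
        f = math.ldexp(1.0, e - (128 + 8))  # undo the shared exponent
        return (rm * f, gm * f, bm * f)

    print(float_to_rgbe(5.0, 0.2, 0.01))                  # (160, 6, 0, 131)
    print(rgbe_to_float(*float_to_rgbe(5.0, 0.2, 0.01)))  # ~(5.0, 0.19, 0.0)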

altairprime 2 days ago

Oo, interesting! That led me to this pair of sentences:

"Making global illumination user-friendly" (Ward, 1995) https://radsite.lbl.gov/radiance/papers/erw95.1/paper.html

> Variability is a qualitative setting that indicates how much light levels vary in the zone, i.e. the dynamic range of light landing on surfaces.

> By the nature of the situation being modeled, the user knows whether to expect a high degree of variability in the lighting or a low one.

Given those two phrases, 'a high or low degree of variability in the lighting' translates as 'a high or low degree of dynamic range' — or would be likely to, given human abbreviation tendencies, in successive works and conversations.

pavlov 2 days ago

It certainly was called HDR when those Cineon files were processed in a linear light workflow. And film was the only capture source available that could provide sufficient dynamic range, so IMO that makes it “HDR”.

But I agree that the term is such a wide umbrella that almost anything qualifies. Fifteen years ago you could do a bit of superwhite glows and tone mapping on 8-bpc and people called that look HDR.

dahart 2 days ago

Do you have any links from 1990ish that show use of “HDR”? I am interested in when “HDR” became a phrase people used. I believe I remember hearing it first around 1996 or 97, but it may have started earlier. It was certainly common by 2001. I don’t see that used as a term nor an acronym in the Cineon docs from 1995, but it does talk about log and linear spaces and limiting the dynamic range when converting. The Cineon scanner predates sRGB, and used gamma 1.7. https://dotcsw.com/doc/cineon1.pdf

This 10 bit scanner gave you headroom of like 30% above white. So yeah it qualifies as a type of high dynamic range when compared to 8 bit/channel RGB, but on the other hand, a range of [0 .. 1.3] isn’t exactly in the spirit of what “HDR” stands for. The term implicitly means a lot more than 1.0, not just a little. And again people developing HDR like Greg Ward and Paul Debevec were arguing for absolute units such as luminance, which the Cineon scanner does not do.

munificent 2 days ago

Yes, Ansel Adams was using a camera to capture a scene that had high dynamic range.

I don't see the confusion here.

dahart 2 days ago

HDR is not referring to the scene’s range, and it doesn’t apply to film. It’s referring superficially but specifically to a digital process that improves on 8 bits/channel RGB images. And one of the original intents behind HDR was to capture pixels in absolute physical measurements like radiance, to enable a variety of post-processing workflows that are not available to film.

munificent 2 days ago

"High dynamic range" is a phrase that is much older than tone mapping. I see uses of "dynamic range" going back to the 1920s and "high dynamic range" to the 1940s:

https://books.google.com/ngrams/graph?content=dynamic+range%...

You might argue that "HDR" the abbreviation refers to using tone mapping to approximate rendering high dynamic range imagery on lower dynamic range displays. But even then, the sentence in question doesn't use the abbreviation. It is specifically talking about a dynamic range that is high.

Dynamic range is a property of any signal or quantifiable input, including, say sound pressure hitting our ears or photons hitting an eyeball, film, or sensor.

dahart 2 days ago

> But even then, the sentence in question doesn’t use the abbreviation

Yes it does. Why are you still looking at a different sentence than the one I quoted??

HDR in this context isn’t referring to just any dynamic range. If it was, then it would be so vague as to be meaningless.

Tone mapping is closely related to HDR and very often used, but is not necessary and does not define HDR. To me it seems like your argument is a straw man. Photographers have never broadly used the term “high dynamic range” as a phrase, nor the acronym “HDR”, before it showed up in computer apps like hdrView, Photoshop, and the iPhone camera.

munificent 2 days ago

Oh, sorry, you're right. Mentioning the abbreviation is a red herring. The full quote is:

"But what if we don't need that tradeoff? What if I told you that analog photographers captured HDR as far back as 1857? Ansel Adams, one of the most revered photographers of the 20th century, was a master at capturing dramatic, high dynamic range scenes. It's even more incredible that this was done on paper, which has even less dynamic range than computer screens!"

It seems pretty clear to me that in this context the author is referring to the high dynamic range of the scenes that Adams pointed his camera at. That's why he says "captured HDR" and "high dynamic range scenes".

dahart 2 days ago

> It seems pretty clear to me that in this context the author is referring to the high dynamic range of the scenes that Adams pointed his camera at.

Yes, this is the problem I have with the article. “HDR” is not characterized solely by the range of the scene, and never was. It’s a term of art that refers to an increased range (and resolution) on the capture and storage side, and it’s referring to a workflow that involves/enables deferring exposure until display time. The author’s claim here is making the term “HDR” harder to understand, not easier, and it’s leaving out some of the most important conceptual aspects. There are some important parallels between film and digital HDR, and there are some important differences. The differences are what make claiming that nineteenth century photographers were capturing HDR problematic and inaccurate.

samplatt 2 days ago

To further complicate the issue, "high dynamic range" is a phrase that will come up across a few different disciplines, not just related to the capture & reproduction of visual data.

altairprime 2 days ago

The digital process of tonemapping, aka. 'what Apple calls Smart HDR processing of SDR photos to increase perceptual dynamic range', can be applied to images of any number of channels of any bit depth — though, if you want to tonemap a HyperCard dithered black-and-white image, you'll probably have to decompile the dithering as part of creating the gradient map. Neither RGB nor 8-bit are necessary to make tonemapping a valuable step in image processing.

dahart 2 days ago

That’s true, and it’s why tonemapping is distinct from HDR. If you follow the link from @xeonmc’s comment and read the comments, the discussion centers on the conflation of tonemapping and HDR.

https://news.ycombinator.com/item?id=43987923

That said, the entire reason that tonemapping is a thing, and the primary focus of the tonemapping literature, is to solve the problem of squeezing images with very wide ranges into narrow display ranges like print and non-HDR displays, and to achieve a natural look that mirrors human perception of wide ranges. Tonemapping might be technically independent of HDR, but they did co-evolve, and that’s part of the history.

sandofsky 2 days ago

> It seems like a mistake to lump HDR capture, HDR formats and HDR display together, these are very different things.

These are all related things. When you talk about color, you can be talking about color cameras, color image formats, and color screens, but the concept of color transcends the implementation.

> The claim that Ansel Adams used HDR is super likely to cause confusion, and isn’t particularly accurate.

The post never said Adams used HDR. I very carefully chose the words, "capturing dramatic, high dynamic range scenes."

> Previously when you took a photo, if you over-exposed it or under-exposed it, you were stuck with what you got. Capturing HDR gives the photographer one degree of extra freedom, allowing them to adjust exposure after the fact.

This is just factually wrong. Film negatives have 12 stops of useful dynamic range, while photo paper has 8 stops at best. That gave photographers exposure latitude during the print process.

> Ansel Adams wasn’t using HDR in the same sense we’re talking about, he was just really good at capturing the right exposure for his medium without needing to adjust it later.

There's a photo of Ansel Adams in the article, dodging and burning a print. How would you describe that if not adjusting the exposure?

smogcutter 2 days ago

> Film negatives have 12-stops of useful dynamic range

No, that’s not inherently true. AA used 12 zones, that doesn’t mean every negative stock has 12 stops of latitude. Stocks are different, you need to look at the curves.

But yes most modern negatives are very forgiving. FP4 for example has barely any shoulder at all iirc.

dahart 2 days ago

I agree capture, format and display are closely related. But HDR capture and processing specifically developed outside of HDR display devices, and use of HDR displays changes how HDR images are used compared to LDR displays.

> The post never said Adams used HDR. I very carefully chose the words

Hey I’m sorry for criticizing, but I honestly feel like you’re being slightly misleading here. The sentence “What if I told you that analog photographers captured HDR as far back as 1857?” is explicitly claiming that analog photographers use “HDR” capture, and the Ansel Adams sentence that follows appears to be merely a specific example of your claim. The result of the juxtaposition is that the article did in fact claim Adams used HDR, even if you didn’t quite intend to.

I think you’re either misunderstanding me a little, or maybe unaware of some of the context of HDR and its development as a term of art in the computer graphics community. Film’s 12 stops is not really “high” range by HDR standards, and a little exposure latitude isn’t where “HDR” came from. The more important part of HDR was the intent to push toward absolute physical units like luminance. That doesn’t just enable deferred exposure, it enables physical and perceptual processing in ways that aren’t possible with film. It enables calibrated integration with CG simulation that isn’t possible with film. And it enables a much wider range of exposure push/pull than you can do when going from 12 stops to 8. And of course non-destructive digital deferred exposure at display time is quite different from a print exposure.

Perhaps it’s useful to reflect on the fact that HDR has a counterpart called LDR that’s referring to 8 bits/channel RGB. With analog photography, there is no LDR, thus zero reason to invent the notion of a ‘higher’ range. Higher than what? High relative to what? Analog cameras have exposure control and thus can capture any range you want. There is no ‘high’ range in analog photos, there’s just range. HDR was invented to push against and evolve beyond the de-facto digital practices of the 70s-90s, it is not a statement about what range can be captured by a camera.

sandofsky 2 days ago

> The sentence “What if I told you that analog photographers captured HDR as far back as 1857?” is explicitly claiming that analog photographers use “HDR” capture,

No, it isn't. It's saying they captured HDR scenes.

> The result of the juxtaposition is that the article did in fact claim Adams used HDR

You can't "use" HDR. It's an adjective, not a noun.

> Film’s 12 stops is not really “high” range by HDR standards, and a little exposure latitude isn’t where “HDR” came from.

The Reinhard tone mapper, a benchmark that regularly appears in research papers, specifically cites Ansel Adams as inspiration.

"A classic photographic task is the mapping of the potentially high dynamic range of real world luminances to the low dynamic range of the photographic print."

https://www-old.cs.utah.edu/docs/techreports/2002/pdf/UUCS-0...
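
The global form of that operator is only a few lines; a sketch (leaving out the L_white extension and the local dodging-and-burning pass the paper builds on top):

    import numpy as np

    def reinhard_global(luminance, key=0.18):
        # Scale the scene so its log-average luminance maps to the chosen
        # "key", then compress highlights asymptotically toward 1.
        log_avg = np.exp(np.mean(np.log(luminance + 1e-6)))
        scaled = (key / log_avg) * luminance
        return scaled / (1.0 + scaled)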

> Perhaps it’s useful to reflect on the fact that HDR has a counterpart called LDR that’s referring to 8 bits/channel RGB.

8-bits per channel does not describe dynamic range. If I attach an HLG transfer function on an 8-bit signal, I have HDR. Furthermore, assuming you actually meant 8-bit sRGB, nobody calls that "LDR." It's SDR.

> Analog cameras have exposure control and thus can capture any range you want.

This sentence makes no sense.

dahart 2 days ago

Sorry man, you seem really defensive, I didn’t mean to put you on edge. Okay, if you are calling the scenes “HDR” then I’m happy to rescind my critique about Ansel Adams and switch instead to pointing out that “HDR” doesn’t refer to the range of the scene, it refers to the range capability of a digital capture process. I think the point ultimately ends up being the same either way. Hey where is HDR defined as an adjective? Last time I checked, “range” could be a noun, I think… no? You must be right, but FWIW, you used HDR as a noun in your 2nd to last point… oh and in the title of your article too.

Hey it’s great Reinhard was inspired by Adams. I have been too, like a lot of photographers. And I’ve used the Reinhard tone mapper in research papers, I’m quite familiar with it and personally know all three authors of that paper. I’ve even written a paper or maybe two on color spaces with one of them. Anyway, the inspiration doesn’t change the fact that 12 stops isn’t particularly high dynamic range. It’s barely more than SDR. Even the earliest HDR formats had like 20 or 30 stops, in part because the point was to use physical luminance instead of a relative [0..1] range.

8 bit RGB does sort-of in practice describe a dynamic range, as long as a 1 bit difference is approximately the ‘just noticeable difference’ or JND as some researchers call it. This happens to line up with 8 bits being about 8 stops, which is what RGB images have been doing for like 50 years, give or take. While it’s perfectly valid arithmetic to use 8-bit values to represent an arbitrary amount like 200 stops or 0.003 stops, it’d be pretty weird.

Plenty of people have called and continue to call 8 bit images “LDR”; here are just three of the thousands of uses of “LDR” [1][2][3], and LDR predates usage of SDR by like 15 years maybe? LDR predates sRGB too, I did not actually mean 8 bit sRGB. LDR and SDR are close but not quite the same thing, so feel free to read up on LDR. It’s disappointing you ducked the actual point I was making, which is still there even if you replace LDR with SDR.

What is confusing about the sentence about analog cameras and exposure control? I’m happy to explain it since you didn’t get it. I was referring to how the aperture can be adjusted on an analog camera to make a scene with any dynamic range fit into the ~12 stops of range the film has, or the ~8 stops of range of paper or an old TV. I was just trying to clarify why HDR is an attribute of digital images, and not of scenes.

[1] https://www.easypano.com/showkb_228.html#:~:text=The%20Dynam...

[2] https://www.researchgate.net/figure/shows-digital-photograph...

[3] https://irisldr.github.io/

sandofsky 2 days ago

You opened this thread arguing that Ansel Adams didn't "use HDR." I linked you to a seminal research paper which argues that he tone mapped HDR content, and goes on to implement a tone mapper based on his approach. This all seems open and shut.

> I’m happy to rescind my critique about Ansel Adams

Great, I'm done.

> and switch instead to pointing out that “HDR” doesn’t refer to the range of the scene

Oh god. Here's the first research paper that popped into my head: https://static.googleusercontent.com/media/hdrplusdata.org/e...

"Surprisingly, daytime shots with high dynamic range may also suffer from lack of light."

"In low light, or in very high dynamic range scenes"

"For high dynamic range scenes we use local tone mapping"

You keep trying to define "HDR" differently than current literature. Not even current— that paper was published in 2016! Hey, maybe HDR meant something different in the 1990s, or maybe it was just ok to use "HDR" as shorthand for when things were less ambiguous. I honestly don't care, and you're only serving to confuse people.

> the aperture can be adjusted on an analog camera to make a scene with any dynamic range fit into the ~12 stops of range the film has, or the ~8 stops of range of paper or an old TV.

You sound nonsensical because you keep using the wrong terms. Going back to your first sentence that made no sense:

> Analog cameras have exposure control and thus can capture any range you want

You keep saying "range" when, from what I can tell, you mean "luminance." Changing a camera's aperture scales the luminance hitting your film or sensor. It does not alter the dynamic range of the scene.

Analog cameras cannot capture any range. By adjusting camera settings or attaching ND filters, you can change the window of luminance values that will fit within the dynamic range of your camera. To say a camera can "capture any range" is like saying, "I can fit that couch through the door, I just have to saw it in half."

> And I’ve used the Reinhard tone mapper in research papers, I’m quite familiar with it and personally know all three authors of that paper. I’ve even written a paper or maybe two on color spaces with one of them.

I'm sorry if correcting you triggers insecurities, but if you're going to make an appeal to authority, please link to your papers instead of hand waving about the people you know.

dahart 1 day ago

Hehe outside is “HDR content”? To me that still comes off as confused about what HDR is. I know you aren’t, but that’s what it sounds like. A sunny day has a high dynamic range for sure, but the acronym HDR is a term of art that implies more than that. Your article even explains why.

Tone mapping doesn’t imply HDR. Tone mapping is always present, even in LDR and SDR workflows. The paper you cited explicitly notes the idea is to “extend” Adams’ zone system to very high dynamic range digital images, more than what Adams was working with, by implication.

So how is a “window of luminance values” different from a dynamic range, exactly? Why did you make the incorrect and obviously silly assumption that I was suggesting a camera’s aperture changes the outdoor scene’s dynamic range rather than what I actually said, that it changes the exposure? Your description of what a camera does is functionally identical. I’m kinda baffled as to why you’re arguing this part that we both understand, using hyperbole.

I hope you have a better day tomorrow. Good luck with your app. This convo aside, I am honestly rooting for you.

sandofsky 1 day ago

> Hehe outside is “HDR content”? To me that still comes off as confused about what HDR is.

"Surprisingly, daytime shots with high dynamic range may also suffer from lack of light."

That's from, "Burst photography for high dynamic range and low-light imaging on mobile cameras," written by some of the most respected researchers in computational photography. It has 342 citations according to ACM.

I'm still waiting for a link to your papers.

> Tone mapping doesn’t imply HDR.

https://en.wikipedia.org/wiki/Tone_mapping

First sentence: "Tone mapping is a technique used in image processing and computer graphics to map one set of colors to another to approximate the appearance of high-dynamic-range (HDR) images in a medium that has a more limited dynamic range."

> Why did you make the incorrect and obviously silly assumption that I was suggesting a camera’s aperture changes the outdoor scene’s dynamic range rather than what I actually said, that it changes the exposure?

Because you keep bumbling details like someone with a surface level understanding. Your replies are irrelevant, outdated, or flat out wrong. It all gives me flashbacks to working under engineers-turned-managers who just can't let go, forcing their irrelevant backgrounds into discussions.

It's cool that you studied late 90s 3D rendering. So did I. It doesn't make you an expert in computational photography. Please stop confusing people with your non-sequiturs.

dahart 1 day ago

What does the lack of light quote prove? That’s a statement about color resolution, not range, and it uses “high dynamic range” and not “HDR content”. I think you’ve missed my point and are not listening.

Yes tone mapping is used on HDR images. It just doesn’t imply HDR. SDR gamma is tone mapping, for example, which the Wikipedia link you sent explains. Your claim is that Adams’ use of tone mapping is evidence that he was capturing “HDR content”. The paper you sent doesn’t use that language; it doesn’t ever say Adams was doing tone mapping, it says they develop a tone mapping method inspired by Adams’ zone system that extends the idea into higher dynamic range.

You’re using your own misunderstanding and mis-interpretation of my comments as evidence that they’re wrong. Hey I totally might be wrong about a lot of things, and sure maybe I’m completely non-sensical, but you certainly haven’t convinced me of that. I haven’t had trouble speaking with other people about HDR imaging, people who are HDR experts. All I’m getting out of this so far is that some people react very badly to any hint of critique.

From my perspective, I’m also only hearing bumbling errors, errors like that HDR is an adjective, that LDR doesn’t exist and nobody uses it, that using “range” is incorrect when I say it but not when you do and “window of luminance values” is better, and that Ansel Adams was doing HDR imaging.

Ben, we’re having a bona-fide miscommunication, and I wanted to fix it but I’m failing, and it feels like you’re determined not to fix it or find any common ground. In another environment we’d probably be having a friendly, productive and enlightening conversation. I’m sure there are some things I could learn from you.

albumen 3 days ago

But the article even shows Adams dodging/burning a print, which is 'adjusting the exposure' of the film's high dynamic range in a localised fashion, effectively revealing detail in the LDR of the resulting print that otherwise wouldn't have been visible.

xeonmc 2 days ago

> It seems like a mistake to lump HDR capture, HDR formats and HDR display together

Reminded me of the classic "HDR in games vs HDR in photography" comparison[0]

[0] https://www.realtimerendering.com/blog/thought-for-the-day/

pixelfarmer 2 days ago

If I look at one of the photography books on my shelf, it even talks about 18 stops and such for some film material, how this doesn't translate to paper, all the things that can be done to render it visible in print, and how things behave at both extreme ends (towards black and white). Read: tone mapping (i.e. trimming down a high DR image to a lower DR output medium) is really old.

The good thing about digital is that it can deal with color at decent tonal resolutions (if we assume 16 bits, not the limited 14 bit or even less) and in environments where film has technical limitations.

Sharlin 2 days ago

No, Adams, like everyone who develops their own film (or RAW digital photos), definitely worked in HDR. Film has much more DR than photographic paper, as noted by the TFA author (and large digital sensors have more than either SDR or HDR displays), especially if you’re such a master of exposure as Adams; preserving the tonalities when developing and printing your photos is the real big issue.

levidos 2 days ago

Is there a difference in capturing in HDR vs RAW?

dahart 2 days ago

Good question. I think it depends. They are kind of different concepts, but in practice they can overlap considerably. RAW is about using the camera’s full native color resolution, and not having lossy compression. HDR is overloaded, as you can see from the article & comments, but I think HDR capture is conceptually about expressing brightness in physical units like luminance or radiance, and delaying the ‘exposure’ until display time. Both RAW and HDR typically mean using more than 8 bits/channel and capturing high quality images that will withstand more post-processing than ‘exposed’ LDR images can handle.