
Content

Another guest video by Nathan Vegdahl! 

A continuation of part one (https://www.patreon.com/posts/everything-knows-60609262), a deep dive into light, cameras, rendering, and how computers handle color. If you've ever wanted to wrap your head around a proper color workflow, Nathan's videos are a fantastic way to start.

Also, his visualizations in this one are fantastic.

Files

Everything I Know About Color, Part 2 - Black levels

Another guest video by Nathan Vegdahl! A continuation of part one (https://www.patreon.com/posts/everything-knows-60609262), a deep dive into light, cameras, rendering, and how computers handle color. If you've ever wanted to wrap your head around a proper color workflow, Nathan's videos are a fantastic way to start (although, as he points out, we haven't even gotten to the color part of the color series).

Comments

Anonymous

Was waiting for this. Awesome!

Anonymous

Wonderful stuff, thanks!

Anonymous

Absolutely brilliant as always Nathan.

Anonymous

YAY thanks so much, Nathan!!!!!

Anonymous

One thing I'm confused about is how subtracting noise levels in Bayer space produces a better result than subtracting noise values in RGB. Shouldn't subtracting the noise level from four given pixels (if using a bilinear debayer) and then averaging them produce exactly the same result as averaging the four pixels, THEN subtracting the noise floor?
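A quick sanity check of that intuition: for any purely linear operation, subtracting a constant noise floor commutes with averaging. A minimal NumPy sketch with made-up values:

```python
import numpy as np

rng = np.random.default_rng(0)
pixels = rng.uniform(100.0, 200.0, size=4)  # four hypothetical photosite values
noise_floor = 16.0                          # hypothetical constant noise floor

a = np.mean(pixels - noise_floor)  # subtract first, then average
b = np.mean(pixels) - noise_floor  # average first, then subtract

assert np.isclose(a, b)  # identical: constant subtraction commutes with averaging
```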

Anonymous

I believe you are correct, yes: debayering shouldn't have any effect on this as long as the interpolation is linear. But that's not what I meant when I made the raw vs RGB distinction.

Although the color filters of the Bayer filter are indeed red, green, and blue, they aren't RGB primaries in the same sense as the triangle-shaped color spaces we're used to seeing. The color space created by the Bayer filter is actually its own horseshoe-shaped color space (much like human color vision), and at some point someone has to convert that color space to something that corresponds to human color vision. When a camera records to an RGB (triangle) color space instead of raw, it's doing that conversion for you. And that conversion can involve transformations that make noise level subtraction not the same afterwards as beforehand.

I probably could have made it clearer what I meant by "raw" vs "RGB" in that footnote. But the footnotes section was already longer than I really wanted, and since I'll be covering all of this later in the series anyway, I figured keeping it stripped down was probably okay.

Edit: oh! And I got so caught up in trying to answer your question, I forgot to say: thanks for the kind words!
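To illustrate the "not the same afterwards as beforehand" point with a toy example: once a nonlinear encoding is involved (here a made-up 1/2.2 gamma, standing in for whatever transforms a camera applies on the way to RGB), subtracting the encoded noise floor no longer matches subtracting it in linear light. A sketch, not any camera's actual pipeline:

```python
import numpy as np

x = np.array([0.10, 0.20, 0.40])  # hypothetical linear scene values
n = 0.02                          # hypothetical constant noise floor

def encode(v):
    # Toy nonlinear encoding (1/2.2 gamma); real camera transforms differ.
    return np.power(np.clip(v, 0.0, None), 1.0 / 2.2)

before = encode(x - n)         # subtract the floor in linear, then encode
after = encode(x) - encode(n)  # naively subtract the encoded floor afterwards

print(before)
print(after)  # differs from `before`: subtraction doesn't survive the nonlinearity
```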

Anonymous

I see. I'm assuming the Bayer filter's horseshoe color space is due to crosstalk? And thus an algorithm that corrects the crosstalk will have an effect on the encoded noise values? I should probably just keep my mouth closed and wait for the series to finish :P

Robin Ruud

This is incredibly useful, Nathan. Thank you!

Anonymous

Hey Ian, I hope you like this. It's a photogrammetry guy who scans classic Russian architecture. Might be cool to use his assets: https://instagram.com/alexandiir?utm_medium=copy_link

Anonymous

Great material. A couple of things to add to the discussion:

As far as I understand, the legal range is mostly there for legacy reasons: compatibility with analog signals. Newer standards like Rec.2020 go with full range. It's used for YCbCr signals, not so much for RGB, and should be scaled to full range on modern displays automatically. But when you lowered the black levels to get to full range, you should also stretch the signal to bring the whites to full range. Anyway, it's a mess; here's a post explaining it a little: https://www.thepostprocess.com/2019/09/24/how-to-deal-with-levels-full-vs-video/

The curves you showed as LOG for broadcast are PQ (Perceptual Quantization) and HLG (Hybrid Log Gamma). They were developed for HDR signals, and their log nature allows capturing more highlight and shadow data while still focusing on what's most important: the midrange details. They're mapped by the display device depending on its abilities (PQ can go up to 10,000 nits, while so far most HDR displays are around 600-1000 nits).

Debayering is done differently by each manufacturer. ARRI, as you said, has a very nice and natural noise feel; RED, on the contrary, has very sharp and digital-looking noise. This also comes down to sensor density: if you pack too many photosites onto a sensor, they start interfering with each other and introduce even more noise, so more megapixels is not always better.

Here is a great link with lots of information and examples about color science, management, and such: https://nick-shaw.github.io/cinematiccolor/description.html
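For concreteness on the black/white stretch mentioned above, here's the standard 8-bit legal-to-full expansion for luma (legal luma occupies code values 16-235), as a minimal sketch:

```python
import numpy as np

def legal_to_full_8bit_luma(code):
    """Expand 8-bit legal-range luma (black at 16, white at 235) to full range.

    Subtracting 16 fixes the black level; the 255/219 gain is the white
    stretch mentioned above. Chroma would use its own 16-240 range.
    """
    code = np.asarray(code, dtype=float)
    return np.clip((code - 16.0) * 255.0 / 219.0, 0.0, 255.0)

print(legal_to_full_8bit_luma([16, 126, 235]))  # -> [0.0, ~128.1, 255.0]
```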

Anonymous

Thanks for adding to the discussion, Sanki.

> As far as I understand, the legal range is mostly there for [...] compatibility with analog signals

That was my assumption for a long time as well, and it's the explanation I see a lot of people give. But in doing the research for this video I couldn't find any primary sources actually stating that, so I'm not so sure anymore. The closest thing to an official explanation I found was in the Rec.601 document (which I think is the first digital broadcast video standard?), which calls the areas above and below the legal range "working margins". But that's pretty vague.

> Newer standards like Rec.2020 go with full range

You're almost right. Rec.2020 doesn't use full range, but Rec.2100 gives the *option* to use full range, in addition to also still supporting legal range. (Hopefully the next standard will get rid of legal ranges altogether!)

> But when you lowered the black levels to get to full range, you should also stretch the signal to bring the whites to full range.

Yeah, I intentionally skipped over that in the video because it's not relevant to black levels. It's also not always necessary, because it just ends up being a constant scale/gain factor anyway. But when you do need to fully re-expand back to full range then yes, you're 100% right. (Edit: and in retrospect, I should have included a footnote about this!)

> The curves you showed as LOG for broadcast are PQ (Perceptual Quantization) and HLG (Hybrid Log Gamma)

If you mean the curves with the negative values, those are actually two of the log transfer functions from the manufacturer of the camera I did the demo with. They're definitely not PQ and HLG. I generated the graph myself from code I wrote implementing a whole bunch of different transfer functions.

Interestingly, neither PQ nor HLG actually incorporates legal ranges directly, despite both being from BT/Rec.2100. In the various BT/Rec standards--at least as far as I can tell--it's assumed that legal ranges are handled in a separate step. This is in contrast to the white papers from e.g. Sony, Canon, etc. describing their log transfer functions, which explicitly incorporate legal ranges directly into the transfer functions.

> Debayering is done differently by each manufacturer. ARRI, as you said, has a very nice and natural noise feel

I'm not sure if this is what you're getting at, but the reason noise level subtraction is only (technically) correct in raw doesn't have anything to do with debayering specifically. I gave a brief explanation in my response to Zeke's question in an earlier comment. Having said that, debayering can certainly have an impact on noise estimates, but it's not clear to me whether that impact is good, bad, or just neutral. My guess is that (at least for most debayering algorithms) it's close to neutral.
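For reference, the PQ curve really does run edge to edge with no legal-range offsets baked in. A sketch of the SMPTE ST 2084 (PQ) inverse EOTF, using the constants as published in that spec:

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants.
M1 = 2610 / 16384       # ~0.1593
M2 = 2523 / 4096 * 128  # ~78.84
C1 = 3424 / 4096        # ~0.8359
C2 = 2413 / 4096 * 32   # ~18.85
C3 = 2392 / 4096 * 32   # ~18.69

def pq_encode(nits):
    """Map absolute luminance (cd/m^2, up to 10000) to a 0.0-1.0 PQ signal."""
    y = np.clip(np.asarray(nits, dtype=float) / 10000.0, 0.0, 1.0)
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1.0 + C3 * y_m1)) ** M2

# The full signal range is used: 0 nits -> ~0.0, 10000 nits -> 1.0.
# Any 16-235-style legal-range quantization would happen in a separate step.
print(pq_encode([0.0, 100.0, 1000.0, 10000.0]))
```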

Anonymous

If you mean electrical crosstalk, then no, that's not why. (Or, at least, not the reason I'm aware of.) It's actually for exactly the same reason that human color vision also has a horseshoe-shaped chromaticity diagram: the cones (or photosites, for cameras) have overlapping spectral sensitivities. And that's a good thing! For example, it's what allows our eyes (and cameras) to triangulate the colors of the rainbow.

> I should probably just keep my mouth closed and wait for the series to finish :P

Ha ha! Please don't! I actually find the comments both informative and motivating. Honestly, I learned quite a bit myself from the comments on the previous video, both in terms of what color topics cause confusion for people and in terms of things that I literally didn't know before about color. And that's going to directly impact future videos, for the better. I may know an unusual amount about color for a regular guy, but I'm no expert. :-)
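A toy illustration of that overlapping-sensitivities point: when the response curves overlap, the ratio between channel responses varies smoothly with wavelength, which is what lets the system distinguish spectral colors. The Gaussian curves and center wavelengths below are made up, not real cone or photosite data:

```python
import numpy as np

wavelengths = np.linspace(400.0, 700.0, 301)  # visible range, in nm

def sensitivity(center_nm, width_nm=50.0):
    # Toy Gaussian spectral sensitivity curve; real cones/photosites differ.
    return np.exp(-0.5 * ((wavelengths - center_nm) / width_nm) ** 2)

channels = [sensitivity(c) for c in (445.0, 535.0, 575.0)]  # toy "S, M, L"

def response(mono_nm):
    # Responses of the three overlapping channels to a monochromatic light.
    idx = int(np.argmin(np.abs(wavelengths - mono_nm)))
    return np.array([ch[idx] for ch in channels])

# Because the curves overlap, these two lights produce distinct response
# *ratios*, so the three channels can triangulate where in the spectrum
# each light sits.
print(response(500.0))
print(response(600.0))
```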

Anonymous

Hi, thanks for answering. Rec.601 is from the '80s, and Rec.709 carries on with it as a recommendation. I think I mixed up Rec.2020 itself with Dolby Vision, which can work in Rec.2020 (but also P3) and recommends processing and delivery in full range.

At around 18:45-18:50 you talk about broadcasting log-encoded video, and that's basically what PQ and HLG are doing, but maybe I jumped to conclusions that that's what you meant :-)

That remark about noise on different sensors was generally about the noise "quality" itself rather than handling it, as ARRI's low-light image is generally less noisy than RED's, due to ARRI going for pixel quality over quantity, as they say themselves.

BTW, when you refer to RAW, non-debayered images, what do you mean exactly? As far as I know, all software displays a debayered image, since you don't have access to the separate sensor fields.

Anonymous

> At around 18:45-18:50 you talk about broadcasting log-encoded video

Oh! Got it. Yeah, that's a total brain fart on my part. I always think of the term "log footage" as referring to footage encoded with the manufacturer's log transfer function--i.e. the kind of log footage that's always supposed to undergo further processing to get a final image. But yeah, you're totally right, PQ and HLG are essentially broadcast log video. I wasn't thinking of that at all. Thanks for the correction!

> That remark about noise on different sensors was generally about the noise "quality" itself rather than handling it

Ah! I totally misunderstood your point. Sorry about that. Honestly, there's still a lot I don't know about the low-level details of sensor hardware relevant to noise. But it doesn't surprise me at all that things like photosite size/density would have a significant impact. And certainly, color filter layout and debayering could have an impact on how noise manifests in the debayered image as well.

> BTW, when you refer to RAW, non-debayered images, what do you mean exactly? As far as I know, all software displays a debayered image, since you don't have access to the separate sensor fields.

The raw files I've dealt with, at least, do give you access to each photosite individually. When viewed naively, it looks like a grayscale image with a faint checkers-esque pattern, because each photosite gives just one signal. Having said that, I've only dealt with raw files for still images up to this point. I assume most raw video formats are similar (certainly CinemaDNG is, at least), but I don't have any hands-on experience with them yet myself.

Anyway, my current mental model is that on the journey to becoming RGB, digital camera images go through three basic stages:

1. Raw. This is the raw sensor data, with no debayering.
2. Debayered raw. (This is *not* RGB in the way a lot of people think.)
3. RGB. This is what you get after doing appropriate color transforms on the debayered raw image.

Whether we as users have access to stages 1 or 2 depends on the camera and software we use. But they're still there, even if hidden behind firmware or whatnot.
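For anyone curious what stage 1 to stage 2 looks like in practice, here's a minimal bilinear debayer for an RGGB mosaic--a toy sketch, not any camera's actual algorithm:

```python
import numpy as np
from scipy.ndimage import convolve

def bilinear_debayer(raw):
    """Bilinear debayer of a single-channel RGGB mosaic into a 3-channel image."""
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1.0
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1.0
    g_mask = 1.0 - r_mask - b_mask

    # Bilinear interpolation kernels: one for the sparse R/B sites,
    # one for the checkerboard G sites.
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float) / 4.0
    k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]], dtype=float) / 4.0

    return np.dstack([
        convolve(raw * r_mask, k_rb),  # fill in R everywhere
        convolve(raw * g_mask, k_g),   # fill in G everywhere
        convolve(raw * b_mask, k_rb),  # fill in B everywhere
    ])

# Stage 1 -> stage 2: the result is "debayered raw", still in the sensor's
# own color space; stage 3 (true RGB) requires a further color transform.
mosaic = np.arange(16.0).reshape(4, 4)  # stand-in for real sensor data
print(bilinear_debayer(mosaic).shape)   # (4, 4, 3)
```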

Anonymous

> The raw files I've dealt with, at least, do give you access to each photosite individually...

I see; I mostly deal with film/video files. Good to know CinemaDNG works this way. I don't think ARRI allows that kind of access in software (it's probably possible through their SDK), and neither does RED. The processing steps you described are exactly right.

You've probably checked this already, but all the major film camera companies provide sample RAW footage to check out and play with, even if you don't have access to the cameras themselves.

Anonymous

I actually did mean wavelength crosstalk; spectral overlap would have been better to say, and more accurate! Thanks, Nathan.

Anonymous

Ah, got it! In that case, you are super on point (as usual!). Honestly, crosstalk is a fine way to describe it. I just hadn't personally heard the term used much outside of electronics before, hence the confusion. Thanks for clarifying!

Anonymous

Nice follow-up article, Nathan. I didn't realize that dark frame subtraction was such a dark art (boom tish).

On the video level issue, I found an interesting abstract that alludes to the headroom above white being for chroma saturation (which shouldn't be there, but what evs). Quote: "In a 100% saturated video signal, the picture content extends from −33 to +133 IRE units, for a total of 166 IRE. For maximum dynamic range, these limits are used to set the zero and full-scale of the A/D." The link is www.sciencedirect.com/topics/engineering/composite-video-signal

Honestly, I think much of the level compression relates to backward compatibility with analogue equipment, and not just CRTs. The A/D converters had lots of timing-specific issues to contend with, not to mention error correction problems of sample fitting.

Anonymous

Thanks for the kind words! The subtraction I talk about in this video isn't really dark frame subtraction per se. More like... poor man's dark frame subtraction? My (somewhat uninformed) understanding is that proper dark frame subtraction is mostly only useful for long exposures, and doesn't really work the way you'd want with short exposures (like in video).

Thanks for the link! I admit that--at least skimming--it all goes over my head. So many things about electrical/radio signal processing are still beyond me. Having said that, it would make sense to me if it did indeed have something to do with the A/D converters (inexpensively) available at the time. My intuition would have been that the A/D converters would need to do some kind of range conversion *anyway*, so why not adjust the range during conversion rather than storing it digitally? But if that wasn't reasonable to do for some reason or another when the first standard was made, that would explain a lot.
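To make the "poor man's" distinction concrete: the subtraction described above uses one constant for the whole sensor, whereas true dark frame subtraction uses a per-photosite offset map. A rough sketch under those assumptions, with stand-in data:

```python
import numpy as np

def subtract_noise_floor(raw, floor):
    # "Poor man's" version: one constant offset for the whole sensor.
    return np.maximum(raw - floor, 0.0)

def subtract_dark_frame(raw, dark_frame):
    # Proper dark frame subtraction: a per-photosite offset map, captured
    # with the lens cap on at matching exposure time and temperature.
    return np.maximum(raw - dark_frame, 0.0)

rng = np.random.default_rng(1)
raw = rng.normal(100.0, 5.0, size=(4, 4))   # stand-in sensor data
dark = rng.normal(96.0, 1.0, size=(4, 4))   # stand-in dark frame

print(subtract_noise_floor(raw, 96.0))
print(subtract_dark_frame(raw, dark))
```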

Anonymous

Did you know that professional CRT monitors were still a thing until very recently? They are fundamentally analogue display devices with circuits specified decades ago; I expect the standards were written with that kind of legacy infrastructure in mind. Indeed, there are probably places around the world still using that technology, with the expectation that the signal paths remain consistent. Oh well, one day they won't be able to get parts any more... shhh, don't tell Ian tho', his entire aesthetic is built on ancient electronic technology :D

Anonymous

I nearly blacked out watching this (sorry for the bad pun). Great presentation for a really dry topic!

Anonymous

Whaaaa, thank you very much, can't wait for what's coming later!!

Anonymous

Absolute mad lad Nathan, this is such a good series!

Anonymous

Where can I find the third part? I really like the idea behind Patreon, but finding specific posts is like throwing dice... good luck with that.