
Content

Your camera and Blender are buds, now (or your camera and basically any 3d software).

LUDICROUSLY excited to share this! Thanks to you folks here on the Patreon, Nathan's been able to hyperfocus on building these tools over the past 3 months- trying to make studio-type workflow tools for the open source community (we'll release it to the "general public" later but you folks are, as always, our beloved guinea pigs :P)

Get the tools here:
https://github.com/EatTheFuture/image_tools

This is still in alpha, and we're trying to make it work on every system, but that's just all the more opportunity for something unexpected to pop up- let us know if you encounter a bug! Nathan has a lot more he wants to add in the future (features, not bugs), but this is probably a great starting point.

Also- the above video covers a lot of ground, but not nearly as much as I feel it should- but I've been overthinking it for WAY too long so I'm just going to upload it and do damage control instead of putting it off for another month hoping I can get it perfect.

Files

Alpha Release: Color Tools! Match CG/Live Action!

0:00 Introduction
3:30 Why is this Important?
4:20 Stuff You Can Do
6:11 Apologies for Laying It On Thick
6:20 Light is Linear, Your Footage Isn't
6:45 Grading Incorrectly
8:25 New Hotness
10:27 ACES is very nice
10:48 What is a colorspace/gamut?
11:40 IDTs
12:00 Black Levels (Ian is a Nerd)
13:04 Lasagna
13:11 Downloading the Tools
14:24 Use LUT Maker to Create a Camera Profile/LUT
19:05 What *is* OCIO?
20:35 Bit Depth and Greenscreen
20:20 Using OCIO Maker to Create a Config File
23:05 Setting up an Environment Variable
24:40 Intro to Working With Footage in Blender
26:00 Ian Friggin' Bails

Comments

Anonymous

Hey Ian, thank you so much for this tool, it's helped me loads in compositing. I was wondering if anyone else has switched to 4.0 and doesn't have the AgX transform available in the menu. I was wondering if I have to do anything with the image tools folder in order to have AgX appear. Thanks!

Anonymous

any update on this ??

Anonymous

New video, new year miracle

Anonymous

Very cool, thank you 👍🏻

Anonymous

OH SHIT

Anonymous

Amazing tool, thank you! This is the kind of stuff they talk about on Corridor that's missing sometimes even in Hollywood films, honestly! Like with the first Sonic movie, how the green of his eye was totally wrong compared to the footage.

Anonymous

YESSSSSSSS Thank you so much Nathan and Ian!!!!!!

Anonymous

SO happy to see you talking about this stuff! There's so much misinformation and confusing terminology out there, terrible for beginners. But it seems like things are slowly turning around. I remember when I learned proper gamma/gamut managed workflow it felt like leveling up in a game. Started looking at techniques and workflows totally differently.

Anonymous

This is so massive

Anonymous

Holy shit Ian, this is huge. This could genuinely change the indie film industry for the better

Anonymous

I wonder if there's a way to use the photometric data to estimate the sensor's RGB primaries as well as the EOTF. I imagine it would require some additional gear. I use a Blackmagic Pocket 4K, and afaik the chromaticities aren't publicly available, so any gamut conversion has to be done inside Resolve

IanHubert

EXACTLY feels like leveling up! And hahaha yeah I tried hard to not be another source of misinformation (by which I mean, regularly asked Nathan, "Is this right?" and he'd usually be like, "Right enough, unless you want to be REALLY technical" and I'd say "no I don't") but man, it's hard- the terminology is intense.

Anonymous

Hybrid log-gamma would be awesome to see included, since a lot of cameras that support HDR can use it in lieu of a proper log format and it works really nicely, and personally the GH5's V-Log profiles would be really nice to see too, since it's a really popular indie film camera. Edit: my bad, HLG is already in there! Awesome

Anonymous

That'd be cool, yes — I shoot on a Fuji X-T3 which does HLG and F-Log, so putting my hand up for those as well 😊

Anonymous

This is so cool! I noticed that you're just using JPGs here to equate to log footage, at least at the start. Does it matter whether you're using JPEGs, raw files, MP4s, etc., or does it not matter because it's all coming out of the same sensor?

Patrick Lever

man you guys are TRUE JEDI KNIGHTS THANKS

IanHubert

Basically yeah, all coming off the same sensor with the same transform on top of it :D. (Also my camera shoots 8bit video, which is the same as a jpg anyways, so I think there's like literally no difference). Although raw files have a different workflow in general, because they're already (usually) linear!

Anonymous

This is amazing! cheers guys

Anonymous

The short answer (that just leads to more questions) is that camera sensors don't actually have RGB primaries, at least not in the sense that they could be mapped 1-to-1 to CIE xy chromaticities. This is one of the things I plan on covering in the video series I'm working on, but cameras have different color vision than people (and each other). So there's never a 1-to-1 "correct" mapping. But the *other* short answer (that leads to fewer questions) is that to do that properly, you need a monochromator to determine the spectral responses of the camera sensor, from which you can then derive at least a reasonable mapping to CIE XYZ color. And the *other* other short answer is: I don't know, but it's one of the things I want to explore. E.g. if someone has a color checker available, can we do something "good enough" with just that and a couple of known lighting setups? I don't know, but I really want to find out!
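(For the curious, the "good enough with a color checker" idea would boil down to something like fitting a 3x3 matrix from the camera's linearized RGB patch values to the checker's published reference values, by least squares. A very rough numpy sketch, with made-up patch numbers standing in for real measurements, just to show the shape of the problem:)

    import numpy as np

    # Hypothetical data: linearized camera RGB for a few color checker patches
    # (rows), and the corresponding reference CIE XYZ values for the same
    # patches under the same illuminant. Real data would use all 24 patches.
    camera_rgb = np.array([
        [0.18, 0.12, 0.07],
        [0.55, 0.40, 0.30],
        [0.20, 0.25, 0.45],
        # ... more patches ...
    ])
    reference_xyz = np.array([
        [0.115, 0.102, 0.054],
        [0.392, 0.362, 0.247],
        [0.180, 0.192, 0.334],
        # ... more patches ...
    ])

    # Least-squares solve for X such that camera_rgb @ X ~= reference_xyz.
    X, _, _, _ = np.linalg.lstsq(camera_rgb, reference_xyz, rcond=None)
    M = X.T  # 3x3 matrix: camera RGB (column vector) -> XYZ
    print("camera RGB -> XYZ matrix:\n", M)

    # Apply to a new linearized camera pixel:
    pixel_xyz = M @ np.array([0.30, 0.20, 0.10])

It won't be as good as a proper spectral characterization, but it's the kind of "good enough" I'm talking about.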

Anonymous

There's all the Blender tutorial content on the internet, and then there's Ian.... o_o

Anonymous

Great video! I learned about this stuff myself just over a year ago, and it completely changed the way I work with colour throughout my entire post-production pipeline. I've tried to explain Linear, and Colour Spaces, and OCIO to other people and I don't think they quite got it, as I wasn't nearly as well spoken as this 😅 hopefully this stuff will become more common knowledge soon! I ran into that Blender values over 1.0 issue on a project this year, and it confused the hell out of me, because I expected my ACES2065-1 EXR files to just apply as HDR textures in Blender, and it seemed to clamp down 🤔 I'll have to try the trick you mention towards the end of this video! The LUT Maker tool seems super useful! I'll definitely have to give that a go too 😁 I really appreciate the content! Thanks for sharing!

Anonymous

If you ever need to explain the benefit of linear workflow to someone, just show them a gamma managed bokeh shape, or high intensity motion blur streak vs display referred ones. Or just the classic old red-green gradient.
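(The numbers behind that red-green gradient are fun to look at, too: averaging pure red and pure green directly on the display-referred sRGB values gives a muddy [0.5, 0.5, 0.0], while doing the same blend in linear light and re-encoding gives roughly [0.74, 0.74, 0.0], a much brighter yellow. A tiny numpy sketch:)

    import numpy as np

    def srgb_to_linear(c):
        # Standard sRGB decode (display-referred value -> linear light).
        c = np.asarray(c, dtype=np.float64)
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

    def linear_to_srgb(c):
        # Standard sRGB encode (linear light -> display-referred value).
        c = np.asarray(c, dtype=np.float64)
        return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1.0 / 2.4) - 0.055)

    red = np.array([1.0, 0.0, 0.0])
    green = np.array([0.0, 1.0, 0.0])

    display_mix = (red + green) / 2.0  # blending the encoded values directly
    linear_mix = linear_to_srgb((srgb_to_linear(red) + srgb_to_linear(green)) / 2.0)

    print(display_mix)  # [0.5, 0.5, 0.0]
    print(linear_mix)   # ~[0.735, 0.735, 0.0]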

Anonymous

Happy New Year in advance, thank you very much for this new addon, which for me is a dream come true. I always had to struggle to find the right settings to match my CG shots with the live footage. I shoot with a Blackmagic Pocket Cinema Camera 4K with the Blackmagic RAW color profile, could you include it in this wonderful tool developed by Nathan? Thank you in advance and I'm really looking forward to episode 2 of Dynamo Dream. My respectful greetings

Anonymous

Great stuff Ian (and Nate)! It reminds me of the time I discovered the "Color Space Transform" tool in DaVinci Resolve, so I could go from Log3 to Rec.709, do the color correction, go back to Log3 and then apply a Kodak film LUT on top of it, but this is way cooler!

Anonymous

This is incredible. ALSO I can't wait for you to dive into that reprojected HDRI/Shadow catcher magic!

Anonymous

I'm starting a project very soon and I NEEDED this although I didn't know I did need it. Great work, guys, and thanks a million.

Anonymous

Awesome! Almost makes me want to go back and redo the 1500 completed VFX shots on my movie. LOL

Anonymous

What would be the optimal Transfer Curve for iPhones? (specifically the iPhone 12 Pro Max for me) I am assuming a lot of people are using iPhones on here, so it would be great to have that as an option :D

Anonymous

Delighting nodes.... Groooove is in the heeeeeaaaaaarrrrttttt!!!!!

Anonymous

Really interesting video! What music is that at the beginning?

Anonymous

Help! I've got a Canon camera and the LUT Maker program won't recognize my raw (CR3) file format!

Anonymous

This is so friggin cool! So it seems like you should set up a profile for every ISO option you have in your camera for maximum accuracy?

Anonymous

Yeah, I'm curious about this too. I'm using an iPhone 13 and it has the ability to shoot in Pro Res 422. I use an app called Filmic and I don't think it will let you take pictures to upload into the lut maker.

Anonymous

Is there a specific advantage to this if working from raw footage? (I typically only shoot raw or Blackmagic RAW, which is quasi-raw.) Also, wouldn't generating a CST make more sense than a 1D LUT? This is all very intriguing, but coming from a color grading background it also feels like it could potentially lead to unwanted issues down the rest of the post-production pipeline, depending on what that is. I think this definitely benefits people who tend to shoot in 8bit compressed footage with pre-baked color profiles, and who tend to render in PNG or (even worse) mp4; but I am curious if this workflow that you guys are creating would also benefit those of us who tend to work with raw footage (appropriately converted to rec709 with a CST) and who tend to then render in 32bit EXR. Very intriguing stuff for sure, and I will be the first to admit that currently the ACES and OCIO pipelines are decisively not user friendly at all (and this goes for pretty much all of the DCCs I've worked on).

Anonymous

I'll be covering this in part 2 of the upcoming video series (part 2 is still a ways off yet, but part 1 will be out soon!). But the short(ish) answer is that all of the following things can affect sensor noise:

1. ISO/gain.
2. Shutter speed.
3. The actual physical temperature of your camera.
4. And, importantly, if you're shooting in RGB (which you are if you're using LUT Maker since it doesn't support raw), then white balance, color gamut, and transfer function (e.g. slog2 vs slog3 etc.) will also all affect how that noise is recorded in RGB, and thus also need a different LUT.

(Notably, as far as I know f-stop does *not* affect sensor noise, at least not the kind of noise that we need to compensate for in terms of black levels.)

So for *maximum* accuracy, you would pretty much have to shoot lens cap images and generate a new LUT for every shot you take. However, in practice that's almost certainly not necessary, because different factors affect noise to different extents, and you have some wiggle room within each one. At the *very least* you should generate a different LUT for different transfer functions and different gamuts. But beyond that, Ian and I are actually still figuring out how much we have to care about each factor too! So we don't have any great advice about this yet. How important different factors are also probably varies between cameras to some extent. So the only real advice we have right now is: do some tests with your own camera to see what factors affect sensor noise enough for you to care.
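(For the curious, the lens cap estimate itself is conceptually just "average a bunch of dark frames, per channel". A rough sketch of the idea, not how LUT Maker does it internally; the file names and the imageio loader are just placeholders:)

    import numpy as np
    import imageio.v3 as iio  # any image loader works; imageio is just an example

    # A handful of frames shot with the lens cap on, at the same settings as the footage.
    paths = ["lenscap_0001.png", "lenscap_0002.png", "lenscap_0003.png"]

    frames = np.stack([iio.imread(p).astype(np.float64) / 255.0 for p in paths])

    # Per-channel average of "no light at all" = the noise floor / black level
    # (assuming 8-bit RGB images here, hence the /255).
    black_level = frames.mean(axis=(0, 1, 2))
    print("estimated black level (R, G, B):", black_level)

LUT Maker folds this kind of estimate into the LUT it generates; the point is just that it's cheap to re-measure whenever you change settings enough to care.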

Anonymous

If you're working in raw, then none of these tools will be useful to you yet, unfortunately. We're initially targeting more of the "indie" use-case (since that's also our use-case!), and a lot of people (ourselves included) don't have equipment that can shoot raw video. As for 1D LUT vs a full color space transform: at the moment, LUT Maker only generates LUTs for linearizing your footage (i.e. transfer function + black levels). To get a full color space transform you need to then bring that into OCIO Maker and also select the gamut/chromaticities. Then the generated OCIO configuration will contain a corresponding color space transform that does both linearization and the gamut transform. If you're working in Davinci Resolve, then probably you only care about the linearizing LUT anyway, since you can easily do the gamut transform with a separate node. Having said all of that, chroma is... complicated, and doing a simple gamut transform is just the barest of basics, and is not enough in many cases. So, admittedly, the current offerings there are pretty anemic! And that's simply because we've been focusing first on helping people get into linear color, since that's both more critical and far easier than handling chroma. We are hoping to address chroma as well, but we're not there yet.
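(If it helps make that split concrete: the "linearize" half is a per-channel 1D lookup, and the "gamut" half is just a 3x3 matrix applied to the already-linear values. A rough numpy sketch with a made-up LUT and matrix, purely to illustrate the two stages:)

    import numpy as np

    # Made-up 1D linearization curve sampled at evenly spaced encoded values
    # (a real LUT would have far more entries).
    lut = np.array([0.0, 0.002, 0.01, 0.05, 0.18, 0.45, 1.0])

    # Made-up 3x3 gamut matrix (camera gamut -> working gamut); this is the
    # extra piece a full color space transform adds on top of the 1D LUT.
    gamut_matrix = np.array([
        [ 1.02, -0.01, -0.01],
        [-0.02,  1.03, -0.01],
        [ 0.00, -0.02,  1.02],
    ])

    def apply_1d_lut(encoded, lut):
        # Per-channel linear interpolation into the LUT (transfer function +
        # black level, i.e. the "linearize" stage).
        xs = np.linspace(0.0, 1.0, len(lut))
        return np.interp(encoded, xs, lut)

    pixel_encoded = np.array([0.62, 0.40, 0.31])       # straight out of the footage
    pixel_linear = apply_1d_lut(pixel_encoded, lut)    # stage 1: linearize
    pixel_working = gamut_matrix @ pixel_linear        # stage 2: gamut transform
    print(pixel_linear, pixel_working)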

Anonymous

The tools don't support raw yet, unfortunately. If you're shooting in raw they're also probably less useful to you anyway...? I think the only thing so far that might be genuinely useful for a raw workflow is estimating the noise floor/black levels, which you can do after processing your raw images into an RGB color space. But even then, we'd still need to properly support higher bit-depth file formats first, which we don't currently (although it's on the to-do).

Anonymous

THIS IS AWESOME!!!!1!11! BBQ!

Anonymous

Can this be implemented for Samsung phone cameras?

Anonymous

That's awesome! It's one of the topics where I have a hard time finding reliable information: either it's one dude proposing stuff in his garage, or highly theoretical discussions that don't actually help when it comes to opening Blender/Resolve and trying to do stuff ;) Still have a few questions because it's a slightly confusing subject for me, to be honest! (maybe it's covered in one of your videos ;))

1) Okay so everything went amazingly, we linearized our stuff which now "resides" in Blender's internal memory. But if I understand correctly, the internal color space that Blender is using is in fact the one you chose when selecting "Chromaticities / Gamut" in the OCIO config, correct? Blender's render engine just doesn't care about which one it is exactly because everything's linear anyway?

2) If you want to combine two shots, intuitively I would have said "no problem as long as in the methodology you describe you choose the same gamut in the OCIO config at the end". But that doesn't seem to be the case with what you describe at 21:00, where you choose another, smaller gamut for green screen. How does it work, then, to be able to mix those two pieces of footage that are not expressed in the same gamut?

3) I'm a little bit at a loss when it comes to choosing a display device. In most cases, choosing the invert display (option at 22:00) in the display device of Blender doesn't make sense because most non-professional monitors can just output sRGB anyway? And another problem: if I want to have the same output color space as the one I have in Resolve, assuming I'm targeting the output device for, let's say, YouTube 4K, what display device and transform should I choose in Blender, and how do I make sure that I have the same in Resolve, to ensure that the choices I make when building my CG elements will roughly appear the same by default in Resolve? sRGB standard is "flat", Filmic is quite nice but it's hard to find the proper corresponding setup in Resolve...

4) A few additional questions about ACES ;) Apparently ACES has drawbacks (huge problems with posterization due to gamut mapping, weird hue skews...): why is that? Is it because the gamut of the output device is smaller than what ACES can handle, so there is a "downsampling" of the values that creates problems?

Anonymous

For LUT Maker, could Nathan add Adobe RGB? I got a Canon Rebel T8i and just realized it sucks for color space. All it has is sRGB and Adobe RGB. Adobe RGB has a bit more color and I would love to have it in the transfer curve options for LUT Maker. Thanks Ian.

Anonymous

This is awesome. Can you show how you would use this workflow to composite blender renders done as Open EXR Multilayer with your footage in a program like Ae? I have hundreds of shots in progress in Ae using BMPCC 4K green screen footage with Open EXR Blender renders using color io and would love to be able to stop having to eyeball every shot. :)

Anonymous

Fascinating! I didn't understand a word of it!

Anonymous

> But if I understand correctly, the internal color space that blender is using is in fact the one you chose when selecting "Chromaticities / Gamut" in the OCIO config, correct?

Ah, that isn't correct actually. Blender by default uses linear Rec.709/sRGB as its working/rendering color space. And to be compatible with Blender's defaults, OCIO Maker generates configs that also use that. In the future, OCIO Maker will be more flexible, and allow generating configs with e.g. the ACES AP1 as the working color space. But for a first release we're trying to keep things pretty stripped down. The "Chromaticities / Gamut" setting in OCIO Maker is specifying what color space you were *shooting in*, so OCIO Maker can then properly convert *from* that color space to the working color space for you.

> Blender render engine just doesn't care about which one it is exactly because everything's linear anyway?

Ah, this is something we maybe could have been clearer about in the video. Whether your color is linear or not and what gamut your colors are in are two completely separate things. For example, usually the ACES AP1 color gamut is used with linear color, but you can also have nonlinear color in ACES AP1 and doing so doesn't change the gamut. LUT Maker and the "color curves" stuff *only* deals with linearizing color, not changing the color gamut. For handling color gamut conversion, you need to use OCIO Maker's Chromaticities/Gamut setting. And rendering in different color spaces / gamuts definitely changes the results. So Blender does care about that. Again, for now we're sticking to linear Rec.709/sRGB to be compatible with Blender's defaults.

> 2) If you want to combine two shots, intuitively I would have said "no problem as long as in the methodology you describe you choose the same gamut in the OCIO config at the end".

As mentioned above, when you select Chromaticities/Gamut in OCIO Maker it's telling OCIO Maker what gamut you were *shooting in*. It then uses that information to automatically convert all footage to the same gamut. This doesn't mean your colors from different gamuts will match exactly (for reasons I'll be going into eventually in the video series I'm making), but they should at least be in the same ballpark after conversion.

> 3) I'm a little bit at a loss when it comes to choosing a display device.

For actually doing work, you definitely want the selected display device to match your actual monitor. Absolutely. So yes, usually sRGB. We allow including your input device color spaces in the Display Device list mainly as a kind of hack. Doing so allows a couple of things: 1. It's fun! 2. Blender doesn't provide a proper way to specify the color space that images should be *saved with*. Until Blender does, this is a hack to let you do that (just set your Display Device to the desired color space before doing anything that saves renders to disk). But indeed, this should actually be a separate setting in Blender.

> if I want to have the same output color space than the one I have in Resolve, assuming I'm targeting the output device for let's say Youtube 4k, what display device and transform should I choose in blender and how do I make sure that I have the same in Resolve

If you're going to be loading things into Resolve anyway, then you can render out to pretty much anything you want, and just convert from that to whatever else you want within Resolve.

> Is it because the gamut of the output device is smaller than what ACES can handle so there is a "downsampling" of the values that create problems?

Moving from a larger gamut to a smaller gamut is always a lossy operation, because you will always lose the colors that the smaller gamut can't represent. It's actually pretty similar to trying to display HDR footage on an SDR display: you don't want to just "clip" the colors, you want some kind of reasonable roll-off. But in this case it's with chroma instead of luminance. So if the conversion from a larger gamut to a smaller gamut isn't done carefully, it can lead to visually objectionable results, such as color shifts etc. (Again, it's *always* going to be lossy, but it can be lossy in a "nicer" way or an "ugly" way.) (Notably, OCIO Maker does *not* do this carefully right now, so you may encounter that issue with our tools as well, depending on your source footage. This is something I want to address in a future release.) I don't know off the top of my head if this specifically is the reason for the issues that you're describing with ACES. But it's certainly one potential source of issues.
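(If it helps to see that "lossy" part concretely: on linear values a gamut conversion is just a 3x3 matrix, and source colors that fall outside the destination gamut come out with negative channel values, which is exactly what then has to be clipped or rolled off somehow. A rough numpy sketch, building the matrices from the published Rec.2020 and Rec.709 primaries:)

    import numpy as np

    def rgb_to_xyz_matrix(rx, ry, gx, gy, bx, by, wx, wy):
        # Standard construction of a linear RGB -> CIE XYZ matrix from the
        # xy chromaticities of the primaries and the white point.
        def xyz(x, y):
            return np.array([x / y, 1.0, (1.0 - x - y) / y])
        prim = np.column_stack([xyz(rx, ry), xyz(gx, gy), xyz(bx, by)])
        scale = np.linalg.solve(prim, xyz(wx, wy))
        return prim * scale

    # Rec.2020 and Rec.709 primaries, both with a D65 white point.
    m2020 = rgb_to_xyz_matrix(0.708, 0.292, 0.170, 0.797, 0.131, 0.046, 0.3127, 0.3290)
    m709 = rgb_to_xyz_matrix(0.640, 0.330, 0.300, 0.600, 0.150, 0.060, 0.3127, 0.3290)

    # Gamut conversion on *linear* values: Rec.2020 -> XYZ -> Rec.709.
    to_709 = np.linalg.inv(m709) @ m2020

    pure_2020_green = np.array([0.0, 1.0, 0.0])
    print(to_709 @ pure_2020_green)
    # Some channels come out negative: that color simply doesn't exist in
    # Rec.709, which is why a naive clip (instead of a careful roll-off)
    # can look ugly.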

Anonymous

aaaaahhhhh how did you project your HDRI onto your lidar scan? trying this right now

Anonymous

Adobe RGB is already in OCIO Maker! :-) LUT Maker only handles linearizing color, not color gamuts, which is why it's not there.

IanHubert

I'll do a video about it later, but I set up the HDRI as the image texture (with a Spherical projection), and for the texture coordinates I used "Object", and selected an empty :D The only tricky bit is lining up the empty with the exact location you took the HDRI from.
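(If anyone wants to script that while waiting for the video, here's a rough bpy sketch of that node setup; the image name, the empty name, and the material assumptions are all placeholders:)

    import bpy

    # Assumes: the active object has a node-based material, an empty named
    # "HDRI_Origin" sits where the HDRI was captured, and the HDRI is already
    # loaded as an image called "hdri.exr". All of these names are placeholders.
    mat = bpy.context.object.active_material
    nodes, links = mat.node_tree.nodes, mat.node_tree.links

    tex = nodes.new("ShaderNodeTexImage")
    tex.image = bpy.data.images["hdri.exr"]
    tex.projection = 'SPHERE'  # the spherical projection mentioned above

    coord = nodes.new("ShaderNodeTexCoord")
    coord.object = bpy.data.objects["HDRI_Origin"]  # the empty at the capture point

    # Drive the image texture from the empty's object-space coordinates.
    links.new(coord.outputs["Object"], tex.inputs["Vector"])

    # Feed it into the existing shader, e.g. straight into the base color.
    bsdf = nodes.get("Principled BSDF")
    if bsdf:
        links.new(tex.outputs["Color"], bsdf.inputs["Base Color"])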

Anonymous

Thank you Ian! this is awesome stuff... I love your "let me show you how it MAY be done" approach, instead of the usual "I'll tell you WHY you CAN'T do it" that many so-called-experts have. Some time ago I got into a discussion about ACES in devtalk, about the proper way to handle blender footage in a color managed environment, and got away receiving a lot of insults and name calling, but still no useful answer by a very well known-and-equally-short-tempered expert. I really, really appreciate this :-) Many thumbs up!

Anonymous

Welcome to the wonderful world of color science ;P In no other community have I ever seen so much brazen vitriol spat over such specific minutiae

Jostein Lie Svalheim

"It looks desaturated, but that's just the world i live in" Relatable

Manuel Grewer

Color science ... fascinating stuff! Great video. You always have a way to find simple workflows

Anonymous

Very informative video, both this one and the one from you, Nathan! Thanks a lot! All the AppImages are segfaulting in my Arch Linux setup - I'll debug a bit and create an issue or such

Anonymous

Hi Nathan! Thank you for this awesome app! I have just one issue: I couldn't make it work on Mac. I can open the app, load images and estimate the curve, but when I hit "export to linear LUT", it crashes silently. And in the terminal it throws this message: " thread 'main' panicked at 'Attempted to construct an Id from a null pointer', /Users/runner/.cargo/registry/src/github.com-1ecc6299db9ec823/objc_id-0.1.1/src/id.rs:62:9 note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace Saving session...completed. " (Just for the record, I have no /Users/runner/.. folder on my machine, so maybe that's hardcoded somewhere... IDK). And, BTW, I'm on Big Sur. Thanks again, and really looking forward to testing this development!

Anonymous

This is so cool, thanks for this! I haven't watched the later Patreon videos yet, but just wanted to comment that I used to be "that color guy" at work in the past. :D Not for a VFX company, but I did a lot of camera measurements, spectral analysis, color science and whatnot. Color science is especially super interesting, and at the same time very misunderstood by most people.

Anonymous

Awesome stuff! Is there any way to use a LUT as the view transform in Blender? Slog2 instead of filmic in your case. Also noticed you are rocking the Sony A7iii :) Have you tried HLG3 instead of Slog2 with this workflow?

Anonymous

Ah! Sorry about that! I haven't tested the appimages across many distros, so it makes sense that they may have problems. :-/ If you figure anything out, I'd definitely like to hear about it! Thanks!

Anonymous

For anyone else running across this, we've tracked down and fixed the bug. Any of the latest dev builds include the fix, and of course the next release (whenever we make one) will include it as well.

Anonymous

> Is there any way to use a LUT as the view transform in Blender? Yup! That's what the "Include as Display" checkbox does. Having said that, there is unfortunately a bug in Blender right now that makes the transform not work properly: https://developer.blender.org/T94001 But once that's fixed, it should work correctly. And sometimes it works for Ian anyway, so your mileage may vary.

Anonymous

As for HLG3, is there a reason to use it over actual real HLG? I confess to being annoyed with Sony for even creating their own custom proprietary hacked up HLG variants (which is what their HLG1-HLG3 curves are) in the first place, because it just makes things more confusing for people. HLG is supposed to be a single standard, and I'd have preferred it if they'd just stuck to it. Edit: Ah, sorry, but to answer your question, we haven't played around with HLG, no. We should definitely give that a try. We can't do Sony's HLG1-3 variants because (at least as far as I've been able to determine) they've never published the formulas for them, so there's no way for me to implement them in LUT Maker. But we do have actual HLG in LUT Maker.
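(For reference, the actual Rec.2100 HLG curve is fully published, so it's easy to sanity check footage against it. The inverse OETF, which is what "linearize HLG" boils down to, looks roughly like this:)

    import math

    def hlg_to_linear(v):
        # Inverse of the Rec.2100 (BT.2100) HLG OETF: encoded value in [0, 1]
        # back to linear scene light in [0, 1].
        a = 0.17883277
        b = 1.0 - 4.0 * a
        c = 0.5 - a * math.log(4.0 * a)
        if v <= 0.5:
            return (v * v) / 3.0
        return (math.exp((v - c) / a) + b) / 12.0

    # Sanity checks: 0.5 should decode to 1/12, and 1.0 to 1.0.
    print(hlg_to_linear(0.5), hlg_to_linear(1.0))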

Anonymous

Seems like Sony is trying to balance dynamic range vs noise in the HLG1-3 variants. I'm using HLG3 purely because it gives very nice results, easy to expose and grade. Very excited about this OCIO workflow with Blender. Keep up the awesome work! Can't wait until the next video :) About LUT Maker: I'm quite sure that HLG3 is color space Rec.2020 and gamut Rec.2100 HLG. Both of them are included in your app! Cheers Rob

Anonymous

I don't have a problem with them creating new curves to try to get better results. But calling them HLG is really misleading, because it's the name of a standardized curve. Which causes confusion. > I'm quite sure that HLG3 is [...] Rec.2100 HLG. This is exactly my complaint: it's not. Rec.2100 HLG and HLG3 are different transfer functions. They're very *similar*, but if you decode HLG3 footage with the HLG transfer function the results aren't *quite* correct. And by calling it HLG3 Sony has given the impression that it maybe somehow conforms to Rec.2100 HLG (it doesn't). And this is causing all kinds of confusion for people. It would have been better if they'd called it "Sony Log-Gamma 3 / SLG3" or something like that, to make the distinction clear. (Btw, I'm not meaning to pick on you in any way. Any annoyance that may be coming through in my comments is squarely directed at Sony, not you.)

Anonymous

Haha no worries :) Good to know that HLG3 is shady business from Sony's side.

Anonymous

Cool tools, man. Coming from a color pipeline, it would be awesome if Blender could have a node or color management like Resolve, where you can use a node called "Color Space Transform" in which you can move from one color space to another without breaking a sweat.

Anonymous

I 100% agree. I would love to see that, both in the shader nodes and compositing nodes.

Anonymous

If you're curious about details of Sony's HLG nonsense, this article covers it quite thoroughly: https://xtremestuff.net/sony-and-hybrid-log-gamma-hlg/ It's also worth noting that Sony cameras also come with a standard HLG profile (no number tagged onto the end), which in theory should be the standard curve, and indeed gives slightly different results than HLG3. Although the deeper I'm diving into Sony's stuff, the less and less I trust them to get these things right.

Anonymous

Earlier I tried the 0.2.2 release, and now the 0.3 release - same thing unfortunately. But compiling the src myself worked fine :-) I tried debugging it a bit just now: running one of the appimages like this: ./lut_maker.AppImage --appimage-extract yields a squashfs-root/ folder. If I cd into squashfs-root/usr/bin and run ./bin there, the program works. If I try running the squashfs-root/AppRun program - that segfaults. ldd'ing that is fine, as is the ./bin program. I'm puzzled... I just opened an issue on the github page - seems more relevant there than in this thread here ;-)

Anonymous

Thanks for opening the issue! I started investigating today. Hopefully I'll have a fix in not too long. But I'm glad you were able to build it yourself! I should add build instructions to the readme.

Anonymous

Does anyone know how to use this in Davinci Resolve?

Phil South

I love this stuff that’s a little bit over my head. It takes a few views for it to start making sense but i do get to a point where I think huh yeah I might have to accept some upper levels of compositing perfection might be above my pay grade, so to speak. I’ll try to keep up but it’s kinda scary up here on this ledge. :)

Anonymous

Hey guys! Thanks for this! I seem to be running into a problem though: I followed the instructions but nothing new is showing up in my Color Management tab or the color space tab in Blender. I even did a restart after adding the environment variable. It shows up in the User Variables but not in Blender?

Anonymous

Got it!! I just made the name I saved it under shorter and it worked! Weird...
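(For anyone else hitting this: a quick way to see whether Blender is actually picking up the variable is to check it from Blender's Python console:)

    # Run in Blender's Python console. If this prints None, the OCIO variable
    # isn't reaching Blender (wrong name, or set after Blender's parent
    # process/launcher started).
    import os
    print(os.environ.get("OCIO"))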

Anonymous

Also, just fyi, I have an issue where if I start Blender from my stream deck, none of my new color spaces show up. But if I start Blender from the shortcut on my desktop, they come up. Both links point to the same shortcut, but for some reason the stream deck link is weird...

Anonymous

I know these tutorials are a lot of work and take a while, but I'm waiting with bated breath for the next one! Thank you guys for doing this.

Ben

This is amazing! Thank you so much for making this. What transfer curve would I use to turn LogC Arri Alexa footage into linear? I can't do the estimate thing because AFAIK the Alexa can't take stills.
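(Edit: from what I can find, ARRI publish the classic LogC (v3) decode formula, so the estimate may not be needed at all. For EI 800 it looks roughly like this, though definitely double-check the constants against ARRI's own documentation, and note they change with EI:)

    import math

    def logc3_ei800_to_linear(t):
        # Classic ARRI LogC (v3) decode for EI 800, per ARRI's published
        # formula. The constants are EI-dependent; verify against ARRI's
        # own documents before relying on them.
        a, b, c = 5.555556, 0.052272, 0.247190
        d, e, f = 0.385537, 5.367655, 0.092809
        cut = 0.010591
        if t > e * cut + f:
            return (10.0 ** ((t - d) / c) - b) / a
        return (t - f) / e

    print(logc3_ei800_to_linear(0.391))  # 18% grey encodes to roughly 0.391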

Anonymous

I'm extremely excited by the whole idea of really getting good defaults from my pipeline rather than fumbling in the dark... So far though I'm getting strange results. I shoot on a Canon R5 in its Log3 format with the CinemaGamut profile. The LUT I got from lut_maker seems fairly darker than the default Canon LUT. When I used lut_maker I selected Canon Log3. I recall reading there are 2 "Log3" formats, an old one and the latest used by the R5. Not sure which one is in the presets. I then tried to calibrate my Theta SC2, but got something pretty far off from what the R5 produced... So, I'm back to fumbling in the dark until Part 2 I guess :)

Anonymous

What to do for smartphone cameras?

Anonymous

I get 'ERROR: "Estimate sensor noise floor" panicked' and 'ERROR: "Estimate sensor ceiling" panicked' with both my iPhone and my Canon EOS 600D! What do I do? Also, on import I get the warning "WARNING: Image file lacks Exif data needed to compute exposure value: "D:\BLENDER 2022\Colorthings\Picstouse\IMG_5043.png". Transfer function estimation will not work correctly."
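(In case it helps narrow it down: a quick way to check whether a given file still carries the exposure metadata the warning is complaining about, e.g. with Pillow. Which exact tags the tool actually reads is just my guess:)

    from PIL import Image
    from PIL.ExifTags import TAGS

    img = Image.open("IMG_5043.JPG")  # the original out-of-camera file, not a re-saved PNG
    exif_ifd = img.getexif().get_ifd(0x8769)  # 0x8769 points at the Exif sub-IFD with exposure info
    named = {TAGS.get(tag, tag): value for tag, value in exif_ifd.items()}
    for key in ("ExposureTime", "FNumber", "ISOSpeedRatings"):
        print(key, named.get(key))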

Raf Stahelin

Hi Nathan, I shoot mostly Red Komodo which is one of the indie standards here in London, besides the Alexa Mini. Wonder if there would be a chance to get a profile for the Komodo? Thanks!

Anonymous

Has anyone here used these tools on macOS Catalina, and if so, how do you set an environment variable? I've tried at least five different options, and none of them have worked so far.

Anonymous

OMG - the ETF tools are such a treasure! Are there more videos on the topic? I am totally lost on Patreon....

Anonymous

I'm having trouble setting this up on macOS Monterey, wondering if anyone else has done it

Phil South

Oh but what if your DSLR doesn't output LOG?

Phil South

I just hunted around and you can add C-log to my crappy Canon DSLR, so now I can get me some of that sweet linear goodness.

Anonymous

Hey can you guys give me any hints with the Blackmagic .DNG thing? I just spent all morning trying to make a LUT with the LUT Maker and I can't say I have gotten anywhere. I used Davinci to export a .JPG and now it says 'WARNING: Image file lacks Exif data needed to compute exposure value:' Is there some magic trick to working with .DNG files? I thought I heard you guys say you were shooting on a Blackmagic for some shots...

Anonymous

Nothing so far. The only stills my Blackmagic takes are .dng. The only JPGs I get out of it say they lack the Exif data. It's a no-fun circle....

Anonymous

I'm not the best with color, but could you export it as an .exr?