
Content

Just some stuff I've been thinking about! I have all these shoots I want to do, but we're still probably a ways off until I can do a proper one (Covid-wise).

I dig the potential, though. In some capacities I think it could totally work (in terms of projecting different views on different places, rendered in realtime in eevee), but it still might be trickier than I think (ex: I can use a white wall as a projection surface, but then I might not get great contrast??)

Still though, I think I might play around with trying to make some medium-level detail city streets, and do some tests!

Also- hey... do you guys remember when you sent me all of that black video?? Because I almost forgot. But I just remembered. 

Files

Hairbrained Virtual Production Schemes

Comments

Anonymous

Exciting times! I just finished a test shoot with LED walls and Unreal in Sydney, but can't wait to do it in Blender!

Anonymous

AHHH that's going to be such a cool shoot. Dreamstate, that's a way to output multiple cameras to projectors live from blender? Is that built into blender? also *pew

Anonymous

Also reacting to the ending bit about spraying water on windows. VIRTUAL PRODUCTION TAXI DRIVERRRRRR.

IanHubert

If it's in any way cyberpunk I think the Bladerunner Commandments dictate you HAVE to spray at least a little water on the windows. And yeah! You just set the 3d viewports as detached panels, and you can drag them wherever you want! Just set up the projector as another monitor, and you can just put it on there!

Anonymous

Ah so cool and it's 100% a Bladerunner commandment.... that "thou shalt spray thy car window, once in washing the car, and thrice in making a cinematic motion picture" - Ridley Scott

Anonymous

Ha ha, cool. Hope you didn't forget the green screen elevator scene tutorial... Very exciting!

Anonymous

Cool stuff! Also, it seems you finally got a tablet! Are you enjoying this revolutionary new way of controlling a computer?..

Anonymous

As long as you can balance your exposures and control the light spill, you should be able to get a decent result. Might need to keep the key/fill ratios pretty low contrast on your actors to match the projection, then boost in post? I guess that's why the LCD screens work well. We always used to do a lot of rear projection stuff with big 20k Barcos and Christies - but that was before this modern realtime VR / virtual production.

Anonymous

Thanks for those creative ideas, thinking of trying out something like that myself.

Anonymous

Will you use a live stream out of Blender for the projection, or will it be a pre-rendered movie? Could you even do a live "stream" out of Blender? Nice shirt btw. ...

IanHubert

My plan would be doing the whole thing live! Which means in theory I could actually sync up events and all that, live (HECK- I think I could figure out how to do traffic lights passing overhead, maybe). If for no other reason than it'd be slightly easier to start/stop/keep in sync (and change angles/focal length on the fly). For the stream I'd just have detached fullscreen viewports on every projector, set up as monitors :D
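
For anyone who'd rather script that setup than drag panels around by hand, here's a rough sketch of the idea in Blender Python (2.8x-era API). The camera name is hypothetical (the assumption is one camera per projection surface), and the new window still has to be moved onto the projector's display manually, since Blender doesn't expose window placement to scripts:

import bpy

# Sketch: pop out a new window and turn it into a clean, rendered 3D viewport
# locked to a given camera - one of these per projector "monitor".
def spawn_projector_viewport(camera_name):
    bpy.ops.wm.window_new()                          # detached window
    window = bpy.context.window_manager.windows[-1]  # the window we just made
    area = window.screen.areas[0]
    area.type = 'VIEW_3D'

    space = area.spaces.active
    space.shading.type = 'RENDERED'                  # live Eevee
    space.overlay.show_overlays = False              # no grid/gizmos on the wall
    space.show_gizmo = False
    space.use_local_camera = True                    # this viewport gets its own camera
    space.camera = bpy.data.objects[camera_name]
    space.region_3d.view_perspective = 'CAMERA'

# "Projector_Left" is a made-up name; add one camera per surface.
spawn_projector_viewport("Projector_Left")
# Then drag the window onto the projector and fullscreen the area (Ctrl+Alt+Space).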

IanHubert

I've been borrowing it from Sean! It's... honestly, whenever I turn it on, it doesn't let me move the mouse to other screens?? Probably just a driver issue. BUT YEAH! The pressure sensitivity and all that is rad! I always think, "if I only had a tablet I would animate all this fun 2d stuff!" and then I get a tablet and it turns out I don't really do that, hahaha

Anonymous

Wacom has settings for which screen you want to control in the Wacom Tablet Properties. I've also had some issues in the past within Blender. Maybe experiment with the settings in there as well. As always: fiddle until perfect.

Anonymous

You are reading my mind. If you are looking for real-time camera tracking, there is a guy who has a Vive tracker add-on for Blender. I think a better way would be to use Blender to control a motorized camera rig; then the tracking should always be spot-on. I've been thinking of doing something like this. Virtual production like the Mandalorian, but LED walls are expensive, so I was thinking of projectors like Oblivion.

Anonymous

You could rent a giant screen TV, like the ones used in stadiums or for street advertising, and make the BG Mandalorian-style.... with Eevee and real time... out of focus and ready to go, reflections for free and no rotoscoping. With fly mode you can drive your own camera around your 3D city.

Anonymous

I think, because the tablet is a proxy for the screen space instead of "pointer mover", trying to go between screens doesn't quite fit the paradigm. It certainly doesn't 100% replace the mouse in everything. Next you'll need to start dual-wielding a Space Mouse in the other hand and you'll be the King of Cyberpunk.

Anonymous

Super weird, do you mean screens on the same monitor? Or different monitors in general? I almost gave up on tablets until I saw a random reddit post that explained how he handled multiple monitors, and I LOVED it. Now I solely use a tablet when using Blender (helps with my wrist pain a TON).

He explained: go into the Wacom Tablet Properties, go to the "Tool: Functions" section, open up ExpressKeys, and assign one of the buttons on the physical tablet to "Display Toggle". Then, in the Display Toggle section (in my version, which I think is old, it's another tab two over from ExpressKeys), set the correct check marks so that when you press that assigned button, it flips control from one monitor to another. Those settings seem to change once every two weeks, I have no idea why, but it takes about 5 seconds of testing to figure out the right combo when it randomly stops working.

That way the surface doesn't cover the entire spectrum of monitors (I use 2-3 at a time), which I personally found to be waaaay too big of a surface area to cover. It just switches control from monitor to monitor. If I totally misunderstood what your problem was, WHOOPS, sorry y'all, hope this helped someone with the problem that I was facing!

Anonymous

I think your idea is great if it can work in real time. I have done this in Unreal Engine, using just a few large LED panels from a Chinese company with 18 MM pixel spacing- works well. I do camera tracking with an Optitrack rig I have. Would love to see this all done in Blender...

Anonymous

Sorry- should have said, "1.8 MM pixel spacing"... NOT 18 MM.

Anonymous

Hi Ian! When will there be a lesson on creating a landscape? (desert, forest, mountains)

Anonymous

Funny you should say this. I just recently started redesigning a film I was planning so it uses on-set projection screens instead of green screen. That way the virtual sets can be shot in production, saving lots of post-production time. (Which also means that I would have more time in post to do things OTHER than just adding and tweaking backgrounds.) I got the idea from how the Mandalorian was shot using Unreal Engine.

Anonymous

They also used the screens surrounding the set to do the lighting. Cool stuff.

Anonymous

Having worked on Mandalorian vfx and seen their original plates, I will highly, highly recommend investing in high intensity LED screens. If you get them bright enough you literally don't need to light the scene, as your CG environment does that for you. That's why the vfx looks so strikingly realistic: not only was his real-life metallic armor and helmet reflecting the CG UE4 desert, the screens were also creating subtle diffuse shadows instead of the harsh shadows you usually get from individual LED lights. If you've seen the on-set photos of their shoot, all those screens look blown out, but when looking through the shot camera at the right exposure, everything looks perfect. They completely eliminate the need to do compositing for simple greenscreen BG-replacement vfx.

IanHubert

Oh man yeah, I'm already sold! The Mandalorian setup is the absolute dream! I just figured the cost would be pretty darn wild (my assumption is their setup cost, like, a hundred thousand dollars minimum- maybe I'm way overestimating?) and that a projector-based setup might be the next best thing? I love the power that greenscreen gives you, but I hate what it does from a cinematic perspective (more shooting "elements" instead of "shots", at times). I'm super curious to explore anything that might bring those two closer.

IanHubert

Oh yeah??? Thanks for the info!! I always assumed that would be astronomically expensive (like... $50,000+) for anything at all decently sized- I'm looking into it, though! And huh! Optitrack! That's a word that yields some GREAT google results, thanks! Looks like that's a pretty standard way to get motion data into a computer live. (and okay now I'm watching through Workman's videos- a Vive controller setup seems like a darn solid way to go, too. Checking to see how easy it is to stream that data into blender- seems like it's possible!) Dang. This is all super exciting. Thanks again, Gary!

IanHubert

OH YEAH?? I was literally JUST googling for that like 15 seconds ago, haha! A motion control rig would be awesome, but if I can get the tracking streamed live into blender from a controller, that seems like a great solution! Honestly yeah, if I can get the vive controller data into blender, then... then the whole thing is basically ready to go. This is amazing! (Also- man. Oblivion is still so absolutely gorgeous. It still feels ahead of its time)

IanHubert

Yeah!!! Which would be AMAZING- I'm hoping I can get stuff just loosely working with projectors, first- I think I just have to figure out how to get camera motion data to stream live into blender, which folks here are saying isn't as hard as I might think :D

Anonymous

VIVE is good- but it does have quite a bit of 'jitter' that seems pretty tough to get rid of and is very noticeable on fairly close shots where you see the live and virtual together. I purchased my Optitrack cameras/hardware used- look at the market on their forum- there are always guys selling systems cheap! You can use the least expensive Optitrack software if you don't need MOCAP- it works very, very well- extremely stable and accurate. Video tiles- as I said- 1.8mm pixel spacing, and just enough to have some in a scene, fairly close...
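
If the Vive jitter turns out to be the main blocker, one cheap thing to try before switching hardware is a little exponential smoothing on the incoming positions before they drive the camera. A minimal sketch of that idea, with the smoothing factor being something you'd tune by eye:

from mathutils import Vector

class PoseSmoother:
    """Exponential smoothing for streamed tracker positions.
    alpha near 1.0 is snappy but jittery; near 0.0 is smooth but laggy."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.value = None

    def update(self, raw_position):
        raw = Vector(raw_position)
        if self.value is None:
            self.value = raw
        else:
            self.value = self.value.lerp(raw, self.alpha)
        return self.value

# e.g. inside whatever loop receives the tracker data:
# camera.location = smoother.update(latest_tracker_position)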

Anonymous

Hi- Indy- would love to know more about what you did on the Mandalorian set... and what your 'indy' plans are....

Anonymous

@Ian You would be surprised at how "relatively" cheap the LED walls are if you get them from China. Here's an example (I just looked this up, I don't know about the quality or the company, but it could give you a good ballpark figure for the cost): https://www.alibaba.com/product-detail/Outdoor-Led-Display-Panels-P5-P10_62030055163.html?spm=a2700.7724857.normalList.58.36af400c3DdAiN This is a 960mm x 960mm panel at $318 USD. It's meant for outdoor advertising and hence is ridiculously bright at 7500 cd/sqm. The ones they used on Mandalorian were ROE Black Pearl BP2 screens, which are higher quality for sure but meant for indoor use. They peak around 1500 cd/sqm, which is still ridiculously bright- you can't look at it directly without blinding yourself. So even the example model from China at 7500 is crazy overkill, since it's meant for outdoor billboard use, and you might find even cheaper models that peak at 1500 cd/sqm.

Now take this 3ft x 3ft model at $300-ish and stack 'em to make a 9ft x 6ft wall, and that would only set you back $1908 USD. Make two more such walls and you have a three-walled room for $5724. So you can guess: adding the cost of the extra equipment you'd need to run all these guys, or another wall and a ceiling projector, or shipping and taxes, within $10,000 you can get yourself your own Mandalorian set.

Now you might get happy here, but I will pop your bubble lol. The pixel pitch of this model, for example (the distance from the center of one pixel to the next), is 10mm, since it's meant for outdoor billboards. To find the visual acuity distance (how far away from the screen you have to be before you see any pixelation), all you have to do is multiply the pixel pitch by 10, and that's roughly the answer in feet. So for a 10mm pixel pitch screen, you have to be around 100 feet away or you will see the "dots" on the screen. Lol. Film 100 feet away from your LED wall. For reference, a high grade SONY 4K TV has a pixel pitch of 0.314mm, so you can be 3 feet from the TV before you'd see pixelation. Anywho, the lower the pixel pitch, the closer your camera can be to the screen before it picks up strange moire patterns (you might notice them when you try to take a photo of your monitor with your phone camera). And making lower pixel pitch screens means expensive high grade materials, which increases the cost of the panel. The screens they used on Mando had a pixel pitch of 2.8mm, so they filmed an average of 20 feet away from the screen. You can reduce the distance somewhat by using defocused backgrounds: the more defocused the BG is, the less distinguishable one pixel is from its neighbor. And lens choices affect the amount of moire you get as well.

@Gary Hey! I wasn't on the set. I work in a visual effects studio and we did some vfx for Mando. We got to see original plates and on-set photos and witness camera videos that blew my mind the first time I saw this tech. It's very strange to see their LED wall from a witness camera, because the perspective looks all janky and weird, but from the main camera it looks like a flawless composite. Haha, I don't have a space as big as Ian's to make my own indie setup, but given the chance I definitely think virtual production is the way to go moving forward.
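
The rules of thumb in the comment above boil down to a couple of lines of arithmetic; here they are as a tiny Python calculator using the same example figures (panel price, wall size, and pixel pitch are just the numbers quoted above):

# Back-of-the-envelope LED wall math from the comment above.
def wall_cost(panel_price_usd, panels_wide, panels_high, num_walls=1):
    return panel_price_usd * panels_wide * panels_high * num_walls

def min_viewing_distance_ft(pixel_pitch_mm):
    # Rough rule of thumb: pitch in mm x 10 is roughly the distance in feet
    # before the individual pixels start to show.
    return pixel_pitch_mm * 10

print(wall_cost(318, 3, 2))           # one 9ft x 6ft wall of 3ft panels: 1908
print(wall_cost(318, 3, 2, 3))        # three such walls: 5724
print(min_viewing_distance_ft(10))    # 10mm outdoor billboard panel: ~100 ft
print(min_viewing_distance_ft(2.8))   # 2.8mm (BP2-class) panel: ~28 ft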

Kai Christensen

Unrelated, but I just found out in the Import Images As Planes addon, you can set it up to import it at a scale so it fills the camera view. You know, just in case for some reason someone here found that useful!
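
If anyone wants that as a script call rather than clicking through the import dialog, the addon's operator takes a size mode for it. The parameter names here are as I recall them for the 2.8x version of the addon, and the file path is made up, so double-check against the operator's options panel:

import bpy

# Assumes the "Import Images as Planes" addon is enabled.
# size_mode='CAMERA' with fill_mode='FILL' scales the plane so it fills
# the active camera's view.
bpy.ops.import_image.to_plane(
    files=[{"name": "backdrop.png"}],   # hypothetical image file
    directory="/path/to/plates/",
    size_mode='CAMERA',
    fill_mode='FILL',
)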

IanHubert

AH! This immediately inspired me to jump into researching this stuff and I forgot to respond here!! This is a ton of great info! Thanks so much for all of this, seriously. A lot of times I'm totally okay with my background being out of focus, so I'm wondering how much I can split the difference pitch-wise. I've been thinking about this nonstop for a couple days, haha! Probably just sticking with projections right now (going to see if there are such things as "high contrast projectors"? Getting that black level dark seems to be really important). But whatever system I end up with, it probably wouldn't be too hard to swap out the projector for an LED screen down the road, since they'd both be taking the same input. But yeah man, this is gold- thanks again :D

Kathleen Judge

You probably already know Matt Workman, aka 'Cinematography Database' on YouTube- but if not, he's a person worth checking out for virtual production.

Anonymous

The end had me rolling.

Anonymous

I was under the impression the screens they used in Mandalorian were coated with a specialty reflective dust like quartz? Then used high resolution projectors to populate the screens...

Anonymous

Hey Ian - I'm using an iPhone and a little program I made inside TouchDesigner to stream camera location data in real time into Blender via OSC. It works extremely well, and it's a lot cheaper than a VIVE setup. I'm also using a projector and an Eevee viewport. Very exciting stuff! Next step - strap the iPhone to a camera and start shooting! I'd love to chat about it - and can send you some of my programming and setup for iPhone tracking to get you started if you're interested! @tylerrdillman on Instagram
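
For anyone who wants to try the OSC route, here's a minimal receive-side sketch for Blender. It assumes the python-osc package has been installed into Blender's bundled Python, and that the sender (phone app, TouchDesigner patch, whatever) sends six floats to a "/camera" address on port 9000; both of those are made-up choices, so match them to whatever the sender actually uses:

import threading
import bpy
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import ThreadingOSCUDPServer

latest = {"pose": None}  # written by the OSC thread, read on Blender's main thread

def on_camera(address, x, y, z, rx, ry, rz):
    # Runs on the OSC server thread; just stash the values.
    latest["pose"] = (x, y, z, rx, ry, rz)

def apply_pose():
    # Runs on Blender's main thread via the timer registered below.
    pose = latest["pose"]
    if pose is not None:
        cam = bpy.data.objects["Camera"]   # default camera name; rename to taste
        cam.location = pose[:3]
        cam.rotation_euler = pose[3:]
    return 1 / 60  # re-run in roughly 16 ms

dispatcher = Dispatcher()
dispatcher.map("/camera", on_camera)       # hypothetical OSC address
server = ThreadingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
threading.Thread(target=server.serve_forever, daemon=True).start()

bpy.app.timers.register(apply_pose, persistent=True)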

Anonymous

Hey Tyler, I thought about doing something similar, but it seems you are ahead of me! Any chance you could share your setup? BTW, I'm based in Toronto as well.

IanHubert

Oh dude that sounds awesome! I'd love to see what you come up with! Also- add this to the reasons I need to get an iphone. ALL the new tracking stuff seems to involve an iphone these days.

Anonymous

@Rafael Perez - sure, I'd love to show you my setup so far. Message me on insta!