V-Ray/Mari HDR Ptex Lighting workflow videos.


  • #16
    Originally posted by Aquazum View Post
    Very interesting video; it really inspires me to test that workflow. I'd also be interested in the hardware configuration used for that video. Somewhere you mentioned you're on a Mac Pro with two GTX 580s? Thanks for any info.
    Here's everything you need to know:

    http://forums.macrumors.com/showthre...1148259&page=1



    • #17
      Originally posted by kweechy View Post
      Would it be much easier to set up things like this with the Nuke -> Mari bridge in 6.3v4 now? It seemed like most of your time was eaten up just lining up the Mari paint-through images. With the bridge, though (assuming the geo lines up to the camera shot), it'll import a camera from Nuke into Mari and set up the projection for you to begin a paint-through.
      The Mari bridge is useful if you have a 3D comp already set up in Nuke and you want to go back and forth. There are actually other ways to line up the camera. Getting a Red Epic camera that can shoot HDRs per frame and then tracking that footage would be the best. In the end you need to work with quads only, so there will be some work involved. I actually just got done shooting something 5 times bigger than the apartment and it went a lot better. We used a Leica survey head to gather 3D points, then modeled accurately from that. I then moved around with the camera and laptop grabbing 22 stops of range with over 1,500 exposures. I have another crew that is lining up the cameras and building geo as they go, and it's more accurate than what I did in those videos. To be honest, 2 days to do the camera and model lineup isn't that bad. I actually renamed my HDR image sequence from 1 to 300 so every frame is a new camera position.


      Originally posted by kweechy View Post

      Is the Nuke file generated entirely by hand or is there some scripting going on to do things like assign file names to the write nodes and that type of thing? Just feels like it would get very tedious to do something like this with 50+ images if you wanted to generate environments for several different shots...though I suppose that even if your .nk isn't set up like that, some smart coder at a big studio could easily get it automated for pipeline work.
      The Nuke file is by hand, but because I shot much more range I didn't need to adjust. Also, the white balance can be adjusted for all images using Photomatix in batch mode. This process is really quick. I actually run the batch bracket merge in Photomatix on the whole directory, then I rename everything so it goes hdrphoto.001.exr, etc. I then load that sequence into Nuke, do the undistort, and write it out again as undistort_hdr_001.exr. Then I do a tone-mapped version where I keyframe the exposure in Nuke and write out a JPG sequence that's used for lineup.

      I don't actually know about the simple mode with the skylight portal. I haven't used it. Perhaps Vlado or someone else can explain it.



      • #18
        Originally posted by Metzger View Post
        The Mari bridge is useful if you have a 3D comp already set up in Nuke and you want to go back and forth. There are actually other ways to line up the camera. Getting a Red Epic camera that can shoot HDRs per frame and then tracking that footage would be the best. In the end you need to work with quads only, so there will be some work involved. I actually just got done shooting something 5 times bigger than the apartment and it went a lot better. We used a Leica survey head to gather 3D points, then modeled accurately from that. I then moved around with the camera and laptop grabbing 22 stops of range with over 1,500 exposures. I have another crew that is lining up the cameras and building geo as they go, and it's more accurate than what I did in those videos. To be honest, 2 days to do the camera and model lineup isn't that bad. I actually renamed my HDR image sequence from 1 to 300 so every frame is a new camera position.




        The Nuke file is by hand, but because I shot much more range I didn't need to adjust. Also, the white balance can be adjusted for all images using Photomatix in batch mode. This process is really quick. I actually run the batch bracket merge in Photomatix on the whole directory, then I rename everything so it goes hdrphoto.001.exr, etc. I then load that sequence into Nuke, do the undistort, and write it out again as undistort_hdr_001.exr. Then I do a tone-mapped version where I keyframe the exposure in Nuke and write out a JPG sequence that's used for lineup.

        I don't actually know about the simple mode with the skylight portal. I haven't used it. Perhaps Vlado or someone else can explain it.
        What I was thinking with the Mari bridge was more along the lines of: if your 3D camera and plates line up perfectly with your Maya geometry, you could save out that animated camera like you did in the video, then bring it into Nuke with the HDR "sequence". That way, when you bridge from Nuke to Mari, you get an automatically and perfectly positioned image to do paint-through work with. It might save a lot of time and increase precision over having to manually place the textures... though of course, for it to work really well, it probably means twice the amount of time spent in Maya making sure the cameras are perfectly lined up with the geo. In scenes with a lot of hard edges and 90-degree corners it should be pretty painless.

        If you had the Red Epic though, this would work extremely well. Feed a track into Nuke with the HDR sequence as well and just bridge it over to Mari every time you see a nice frame that you like or that would let you patch in some missing details.

        But I don't really want to think of how much it costs to rent an Epic for a half day to do this. If you already have one on set, then it's definitely an incredible way to get perfectly integrated CG into live action shots.

        For most scenes, I bet you can probably get away with shooting at a slightly lower exposure on a regular Red One just to avoid blowouts. With the new MX sensor and the high bit depth it already captures, it'd be "good enough". You could always take a few bracketed DSLR snaps on top of that to fill in areas like windows that really need HDR exposures... or I bet you could just take DSLR video to track and paint a point cloud, then position bracketed shots.

        Man your videos really have me thinking about a lot of fun stuff to try now!
        Last edited by kweechy; 21-08-2011, 04:02 PM.



        • #19
          Well, if you are modeling your geometry in Nuke, then I could see how you would want to line everything up there, but it really depends on where you get your geometry from. I guess you could do it in any software application. I can very quickly align a camera in Maya, along with the model to the camera, so that's where I like to do it. I would love to see your finished work with it! It's pretty cool to be able to paint so much information into Ptex, then have it render with V-Ray.



          • #20
            Hi Scott,

            I've just finished watching these fantastic videos; really great information, and it ties neatly into my current fxphd class.
            Without a doubt this is very inspirational for me, so I'm going to test this out on a 'closed set' with only a few objects in it to keep it simple and get the workflow sussed out. (Knowing me, it'll be a matchmoved shot of some sort.)
            No specific questions just yet about what I have seen, but you mentioned you had changed something since uploading the videos; can you elaborate on what that was?

            Many thanks !



            • #21
              Hey Rob,

              I will pretty much go over what I am doing now.

              Laptop with DSLR Remote Pro
              Canon 5D Mk II with a 20mm prime

              Canon 5D has issues triggering via USB, but you can still do it.

              I usually shoot 6 exposures, about 3 stops apart, in raw. You might have better luck with shutter control using a ProRemote with a shutter cable.
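
              As a rough Python sketch of how that bracket spaces out (the 1/4000 s base shutter below is just an example value, not a setting from the shoot):

              # List the shutter speeds a bracketed HDR set would step through.
              # Assumed values: darkest frame at 1/4000 s, 6 frames, 3 EV apart.
              BASE_SHUTTER = 1.0 / 4000.0   # darkest exposure, in seconds
              FRAMES = 6
              EV_STEP = 3

              for i in range(FRAMES):
                  shutter = BASE_SHUTTER * (2 ** (i * EV_STEP))
                  label = f"1/{round(1.0 / shutter)} s" if shutter < 1.0 else f"{shutter:g} s"
                  print(f"frame {i + 1}: {label}  (+{i * EV_STEP} EV)")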

              I shoot completely in sequence with no fuckups. Manual settings for everything. I do not readjust midway through, nor do I change the aperture.

              In Photomatix you want to use the batch bracketed mode. That will go through all the raw photos in the directory and combine every 6 photos, and you can set the white balance there or use the camera's. Make sure you shoot with auto white balance turned off.

              You will then have a directory of combined HDRs named like this: photohdr_1_2_3_4_5_6.hdr, etc.

              You want to rename each photo into a sequence like this: photo_set_001.001.hdr, photo_set_001.002.hdr, and so on.
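
              If it helps, here is a minimal Python sketch of that rename step. The directory path and the "photo_set_001" prefix are placeholders for this example, not fixed names from the workflow:

              # Rename Photomatix batch output (photohdr_1_2_3_4_5_6.hdr, ...) into a
              # numbered sequence Nuke can read as photo_set_001.001.hdr, .002, etc.
              # Note: plain sorted() is alphabetical; pad or natural-sort the source
              # names if they don't already sort in capture order.
              import os

              SRC_DIR = "/path/to/merged_hdrs"   # wherever Photomatix wrote the merged files
              PREFIX = "photo_set_001"

              hdrs = sorted(f for f in os.listdir(SRC_DIR) if f.lower().endswith(".hdr"))
              for frame, name in enumerate(hdrs, start=1):
                  new_name = f"{PREFIX}.{frame:03d}.hdr"
                  os.rename(os.path.join(SRC_DIR, name), os.path.join(SRC_DIR, new_name))
                  print(f"{name} -> {new_name}")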

              Now you have an image sequence you can load into Nuke. In NukeX, run the lens distortion analysis on the checkerboard grid shot with the lens you used, then process the HDR sequence out as undistorted_photo_set_001.0001.exr.

              Then put an exposure node on the sequence and keyframe a tone map so you can pull out detail for modeling and camera lineup. Process that out as well.
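
              For reference, here is roughly what that Nuke setup looks like scripted in Python (normally you'd just build it in the node graph). The paths, frame range, and keyframe values are example assumptions, and the lens grid analysis itself is still done interactively:

              import nuke

              # Read the renamed HDR sequence (example path and frame range).
              read = nuke.nodes.Read(file="/path/to/photo_set_001.###.hdr", first=1, last=300)

              # Undistort with the solved lens values (NukeX LensDistortion node),
              # then write the linear EXR sequence used for projection painting.
              undist = nuke.nodes.LensDistortion()   # paste your solved distortion values here
              undist.setInput(0, read)
              exr_out = nuke.nodes.Write(file="/path/to/undistorted_photo_set_001.####.exr",
                                         file_type="exr")
              exr_out.setInput(0, undist)

              # Keyframe a Grade gain as a quick tone map so both dark and bright frames
              # show usable detail, then write the JPG lineup sequence.
              grade = nuke.nodes.Grade()
              grade.setInput(0, undist)
              gain = grade["white"]
              gain.setAnimated()
              gain.setValueAt(4.0, 1)       # example keyframes -- eyeball these per shoot
              gain.setValueAt(0.5, 150)
              gain.setValueAt(2.0, 300)
              jpg_out = nuke.nodes.Write(file="/path/to/lineup_photo_set_001.####.jpg",
                                         file_type="jpeg")
              jpg_out.setInput(0, grade)

              # Render both Write nodes over the full frame range.
              nuke.execute(exr_out, 1, 300)
              nuke.execute(jpg_out, 1, 300)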

              Now you need to build the geo. You can do this with any survey data you can bring into your modeling package of choice. For instance, I just did a shoot where we had a Leica survey head that gathered points. You could use lidar, you could use the Microsoft Kinect, or you could measure by hand.

              One way to survey is to have a measurement of one object in every photo you take. You could carry around a cube that is 2x2x2 feet and make sure it's in the photo for correct scale. You could also switch to video mode and walk around the cube in a circle, capturing your environment; you would then track that footage of the cube and get point data.
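
              As a rough sketch of how that scale reference gets used (the corner coordinates below are made-up example values, not real track data):

              import math

              # Two tracked corners along one edge of the reference cube, in whatever
              # arbitrary units the camera track / point cloud solved to.
              KNOWN_EDGE_FT = 2.0                   # the cube is 2 x 2 x 2 feet
              p0 = (0.13, 1.02, -0.40)              # example tracked corner A
              p1 = (0.61, 1.05, -0.38)              # example tracked corner B

              tracked_edge = math.dist(p0, p1)      # edge length in track units
              scale = KNOWN_EDGE_FT / tracked_edge  # multiply the point cloud / geo by this
              print(f"scale factor: {scale:.4f} (track units -> feet)")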

              Then model and paint away. Since your images might be either too dark or too bright to paint in Mari, you will want to go to the color profiles and adjust the display brightness as you paint.

              Once you have your Ptex painted you bring it into Maya, but this time, instead of adjusting the intensity multiplier of the Ptex color, you do all of your adjustments via the physical camera. So the ISO, aperture size, and FPS of your Maya scene are very important.
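
              To see why those settings matter, here is a small Python sketch using the standard exposure-value formula, EV100 = log2(N^2 / t) - log2(ISO / 100). The specific f-stop, shutter, and ISO numbers are examples only, not values from the videos:

              import math

              def ev100(f_number: float, shutter_s: float, iso: float) -> float:
                  """Exposure value at ISO 100 for the given camera settings."""
                  return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100.0)

              # What the plates / HDRs were metered at (example settings):
              plate_ev = ev100(f_number=8.0, shutter_s=1.0 / 60.0, iso=100)

              # What you're about to dial into the physical camera (example settings):
              render_ev = ev100(f_number=5.6, shutter_s=1.0 / 30.0, iso=200)

              print(f"plate EV100:  {plate_ev:.2f}")
              print(f"render EV100: {render_ev:.2f}")
              # Each EV of difference shifts the render one stop away from the plate.
              print(f"render is {plate_ev - render_ev:+.2f} stops brighter than the plate")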

              Hope this helps.

              -Scott

              P.S. One thing I used to do was adjust the exposure of the HDRs after I shot them. If you are using physical-camera rendering, you don't want to do that; keep the range you shot with the camera and use the physical camera controls in V-Ray instead. Also, when you set up the geometry, make sure you keep all the geo closest to the rendered object separate from everything else. To speed up rendering and the sampling of Ptex for GI, you will want to take everything furthest away and resize it down in Mari after you export. If you have any questions as you go, let me know.
              Last edited by Metzger; 22-08-2011, 06:53 PM.



              • #22
                Hey, Scott,

                Thank you for sharing your current workflow in detail. It's very intriguing and informative. I am really interested in learning more in this area, as we have been moving away from single-HDR-pano-based IBL toward a spatially aware IBL solution for VFX.

                I used to take full spherical HDR panoramas and utilize their vast FOV for image-based modeling of interior spaces. When there was no Mari-Nuke bridge and no projection occlusion in Nuke, it was a bit painful texturing the reconstructed geo in Nuke with panoramas. I would like to go back and revisit my old workflow to see if I can improve its efficiency. Since taking one or more HDR panos on set is almost the norm nowadays, have you explored the possibility of using HDR panoramas for HDR-textured set reconstruction? I have read that large studios like Cinesite or SPI have in-house tools or incorporate a SpheronCam to semi-automatically take care of HDR-textured set reconstruction, but I am also interested in a lower-tech/poor man's approach in this regard.

                - Jason
                always curious...



                • #23
                  Originally posted by Metzger View Post
                  Here's everything you need to know:

                  http://forums.macrumors.com/showthre...1148259&page=1
                  hey scott,

                  thanks for the info!
                  best regards,
                  sacha



                  • #24
                    Scott:
                    Thanks ever so much for the information, it's really cool of you to post and make the videos !
                    In fact, any more vray/maya videos would be excellent if you have the time (perhaps teach a term at fxphd? 300 level ??? )



                    • #25
                      Originally posted by RobPhoboS View Post
                      Scott:
                      Thanks ever so much for the information, it's really cool of you to post and make the videos !
                      In fact, any more vray/maya videos would be excellent if you have the time (perhaps teach a term at fxphd? 300 level ??? )
                      I can't help but second this great idea!
                      always curious...



                      • #26
                        Originally posted by jasonhuang1115 View Post
                        Hey, Scott,

                        Thank you for sharing your current workflow in detail. It's very intriguing and informative. I am really interested in learning more in this area, as we have been moving away from single-HDR-pano-based IBL toward a spatially aware IBL solution for VFX.

                        I used to take full spherical HDR panoramas and utilize their vast FOV for image-based modeling of interior spaces. When there was no Mari-Nuke bridge and no projection occlusion in Nuke, it was a bit painful texturing the reconstructed geo in Nuke with panoramas. I would like to go back and revisit my old workflow to see if I can improve its efficiency. Since taking one or more HDR panos on set is almost the norm nowadays, have you explored the possibility of using HDR panoramas for HDR-textured set reconstruction? I have read that large studios like Cinesite or SPI have in-house tools or incorporate a SpheronCam to semi-automatically take care of HDR-textured set reconstruction, but I am also interested in a lower-tech/poor man's approach in this regard.

                        - Jason
                        Hey Jason, thanks! So yeah, taking multiple spherical positions would be great, along with non-spherical HDRs. The only problem I see is that you would want to do spherical projections from multiple positions. A few things need to happen first. You need Ptex baking from V-Ray so we could set up a spherical projection in Maya, then project that onto the geo. You would also have to combine those using mattes in Maya.

                        Or Mari could have support for loading spherical HDRs and positioning them inside Mari to paint. I have suggested adding support for projecting spherical maps in Mari. It would be great to be able to use both spherical and rectangular images.

                        The one great thing about taking high-res rectangular images is that you get great resolution. Depending on the environment, I still use a dome light for the sky and sun; everything close to camera I then model. But that really depends on the location.



                        • #27
                          Originally posted by RobPhoboS View Post
                          Scott:
                          Thanks ever so much for the information, it's really cool of you to post and make the videos !
                          In fact, any more vray/maya videos would be excellent if you have the time (perhaps teach a term at fxphd? 300 level ??? )
                          Possibly in the future. I am currently working on a learning workshop that will take place in Colombia at the end of September. www.thevfxtour.com

                          We are showcasing VFX production from concept to finished product, covering all the steps. I will be covering Ptex environment painting, asset texturing in Mari, and rendering in V-Ray.

                          To be honest, it's a lot of work to put out this stuff. After this event I don't know if I will be up for doing anything for a long time, but I will definitely let you know. Hehehe



                          • #28
                            Originally posted by Metzger View Post
                            Possibly in the future. I am currently working on a learning workshop that will take place in Colombia at the end of September. www.thevfxtour.com

                            We are showcasing VFX production from concept to finished product, covering all the steps. I will be covering Ptex environment painting, asset texturing in Mari, and rendering in V-Ray.

                            To be honest, it's a lot of work to put out this stuff. After this event I don't know if I will be up for doing anything for a long time, but I will definitely let you know. Hehehe
                            Cheers for the info, Scott!

                            I can understand that it's a huge amount of work, especially if you tutor something like an fxphd term as well, but to you or anyone else interested: they could really do with a more advanced class now.
                            But as I mentioned, if there are any cool little things you don't mind recording and popping up on Vimeo/YouTube as and when you can, it's seriously appreciated!



                            • #29
                              very nice tutorial! thanks for sharing
                              showreel: http://vimeo.com/27236919



                              • #30
                                Hey Scott! Long time no see!

                                Loved watching your tutorial. Very very informative and inspiring.

                                Starting to spend more time on here and was very delighted to see your presence!

                                Dave Funston

