
Deep Compositing of Glass Materials

  • Deep Compositing of Glass Materials

    Hello all,

    I am trying to bring deep compositing into our studio's workflow using VRay 3's new capability.

    However, I am finding what I understand to be some issues when comping glass that happens to be in front of opaque objects. We work with interiors, so this situation happens a lot. We always have glass displays or shop windows with products and walls behind them, which results in a completely white alpha.

    When I place cards in Nuke combined with VRay's deep data, glass works well only when there are no objects behind it. The cards get completely occluded when placed in 3D space between a glass object and any opaque object behind it. Please see the problem in this image:

    [Attached image: DeepComp_thread.jpg]

    What I was expecting from VRay's deep data is a range of transparency values spread across 3D space that would also take the Refraction value in the VRayMaterial into account, but for some reason this is not happening. It looks like the deep points representing the glass objects take a single transparency value, corresponding to that of the objects behind them, instead of a range of transparency values spread along the space. The Affect All Channels option is active in the material's Refraction settings, and the glass objects are contributing to the alpha channel.

    As I understand it, deep data stores a transparency function per pixel, which I reckon is a range of transparency values associated with 3D coordinates represented in camera space. This should offer an "automatic" layering process, but it is just not happening. The only solution I can find now is going back to rendering masks for transparent objects (just as I was doing in the traditional workflow), or using Object IDs to isolate the glass from the deep data points, subtract it, and add it back flattened after adding the cards... Not very different from what it was before deep data came into our lives.
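The "automatic layering" idea above can be sketched outside any particular package. This is not the Nuke or V-Ray API, just a toy model of a deep pixel as a list of (depth, color, alpha) samples: merging a card sample at some depth and flattening front-to-back with the over operator shows the behaviour the poster expects from deep data.

```python
# Toy deep pixel: a list of (z_depth, rgb, alpha) samples (illustrative only).

def flatten(samples):
    """Front-to-back 'over' composite of depth-sorted deep samples."""
    out_rgb = [0.0, 0.0, 0.0]
    out_a = 0.0
    for z, rgb, a in sorted(samples, key=lambda s: s[0]):
        w = a * (1.0 - out_a)              # contribution not yet occluded
        out_rgb = [c + v * w for c, v in zip(out_rgb, rgb)]
        out_a += w
    return out_rgb, out_a

# Glass at z=1 with partial alpha, an opaque wall at z=5, and a card
# inserted in comp between them at z=3.
glass = (1.0, [0.2, 0.3, 0.3], 0.3)
wall  = (5.0, [0.5, 0.5, 0.5], 1.0)
card  = (3.0, [1.0, 0.0, 0.0], 1.0)

rgb, a = flatten([glass, wall, card])
# The card shows through the glass (weight 0.7) and fully hides the wall.
```

If the renderer instead writes the glass sample with alpha 1.0 wherever something opaque sits behind it (which is what the poster is observing), the card's contribution weight drops to zero and it vanishes, exactly as described.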

    Am I forgetting to change some parameters, or am I just expecting too much from deep compositing? This is very new to me and I am sure I must be doing something wrong. Can anyone help, please?

    Much appreciated,
    Jordi Canela

  • #2
    I'm afraid I don't know the answer; you might have more luck on the Nuke forums. Nothing that I've read about deep images ever mentions refraction, only transparency (which is quite different). Compositing deep data assumes a fixed relationship between the screen coordinates of a deep point, its z-depth, and its actual position in the scene (namely, perspective projection). However, refraction bends the camera rays, so this relationship is broken.
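The point about bent rays can be made concrete with Snell's law. A minimal sketch (illustrative numbers, not V-Ray code): at a glass interface the ray direction changes, so the point actually seen at a pixel no longer lies on the straight perspective ray that deep compositing assumes.

```python
import math

def refract_angle(theta_in, n1=1.0, n2=1.5):
    """Refraction angle in radians from Snell's law: n1*sin(t1) = n2*sin(t2)."""
    return math.asin(n1 * math.sin(theta_in) / n2)

theta_in = math.radians(30.0)          # incidence angle at the glass surface
theta_out = refract_angle(theta_in)    # direction inside glass (IOR 1.5)

# The ray bends by roughly 10.5 degrees, so a z-depth stored along the
# original camera ray no longer points at the surface actually being seen.
bend = math.degrees(theta_in - theta_out)
```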

    Things should work better if you use simple transparency instead.

    Best regards,
    Vlado
    I only act like I know everything, Rogers.

    Comment


    • #3
      Thanks for the reply, Vlado. However, this is not a Nuke question but a VRay question.

      I wanted to know if there is a way to make VRay interpret refraction values as transparency values under all conditions when writing deep data. At present this happens only in some cases:

      In the test image I posted you can see how refraction values are treated as transparency values when there is nothing behind the glass object (see the woman with the pink shirt on the left, perfectly visible through the glass, with only the reflections occluding her and the luminosity being dropped by the glass's light absorption). This is how I expect deep data to work. It is not happening when there are opaque objects behind the glass. In that case VRay treats the glass as opaque when creating the deep information (see the two girls in the middle of the frame: the girl behind the glass "disappears" behind it; only where there is no wall behind her can we see her feet again). I hope this explains my point a bit better. Otherwise I will try to post a clearer image.

      To have all the benefits of deep data in post-production, my understanding is that refraction values should always be treated as transparency values when creating the deep channel.

      This would be very beneficial compositing-wise, even if objects placed in post behind the glass would not automatically be distorted by the refraction effect. This is preferable to having objects disappear when they shouldn't. Distortion can be applied later if needed, but if an object inserted in post disappears where it shouldn't, that is much harder to fix.

      On the other hand, if we used opacity values instead of refraction values for our glass materials, as you suggest, we would lose the refraction effect and the fog value inside the glass, making our glass look less nice and leaving us to deal with that in post as well.

      To summarize: I think it would be very interesting, if not mandatory, for the VRay renderer to treat the VRayMaterial's Refraction values as transparency values in all situations when creating deep data, whilst maintaining the present behaviour for the RGB channels (keeping the distortion created by the glass in the beauty and refraction passes). Otherwise, the opportunities to use deep data coming from VRay will be reduced to a smaller number of cases.

      Is this something we can expect to have in the next update?

      Thank you,
      Jordi Canela

      Comment


      • #4
        Interesting suggestion, Jordi.
        I was thinking a similar thing. It may not be 'correct', but it would certainly work better in 90% of cases for comp.
        Vlado, would it even be possible to do this? VRay would refract as usual, but the deep sample would exist without refraction; i.e. at the correct ray depth, but not offset in space.
        Patrick Macdonald
        Lighting TD : http://reformstudios.com Developer of "Mission Control", the spreadsheet editor for 3ds Max http://reformstudios.com/mission-control-for-3ds-max/



        Comment


        • #5
          Well that should already work if you set the "affect channels" option for the refractive material to "all channels", no?

          Best regards,
          Vlado
          I only act like I know everything, Rogers.

          Comment


          • #6
            You're expecting too much; in my opinion it shouldn't actually be in the deep samples. I don't see a technical reason it couldn't be, but I can see a lot of practical reasons why you wouldn't want it to be. The deep pixel doesn't really know where the refraction takes place, and placing the refracted sample "behind" the glass at the refracted distance along the camera ray isn't really correct, since refraction would bend the ray.

            If you actually assigned a depth value to a refraction sample, then when you did a deep composite you would get a very strange point cloud: a glass ball refracting a spherical set of samples would compress them into a bizarre convex sphere behind the ball. Now you have this weird displaced convex ball of samples and you want to apply DOF. The depth sample for a refracted sphere might belong to something 100 feet behind the ball, so it gets a z-depth value of 100' and is thrown way out of focus. But that's not physically correct, since the refractive orb is acting like a lens and throwing off all the DOF calculations you would try to do in post, not to mention it's extremely anisotropic.

            Or again, imagine your glass ball and you try to composite a card in behind it. A straight-up deep composite will simply place an unrefracted object between the ball and the deep samples behind it... that's not what you want; you need the card behind the orb to be shrunk down or scaled up depending on the refraction and lensing done by the sphere. In other words, ultimately what you need is a raytracer.
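The DOF argument above can be sketched with a thin-lens circle-of-confusion formula (illustrative numbers, not anyone's production setup). Post DOF assigns blur from the stored z-depth alone, so a sample refracted through in-focus glass but carrying the depth of a distant object gets blurred heavily, even though on screen it belongs to a sharp surface.

```python
def coc_diameter(z, focus=2.0, focal=0.05, aperture=0.025):
    """Thin-lens circle of confusion (same units as z) for a subject at depth z.

    c = A * f * |z - S| / (z * (S - f)), with aperture diameter A,
    focal length f, and focus distance S.
    """
    return aperture * focal * abs(z - focus) / (z * (focus - focal))

glass_z = 2.0        # glass surface sits exactly at the focus distance: sharp
refracted_z = 30.0   # depth a refracted sample might carry (object far behind)

# A z-driven post DOF would leave the glass surface sharp but smear the
# refracted sample, even though both occupy the same in-focus surface on screen.
blur_glass = coc_diameter(glass_z)
blur_refracted = coc_diameter(refracted_z)
```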

            To solve your immediate problem, though, what you want to do is use a refraction mask and then rebuild the glass shader from the refraction up, using old-school multi-pass comp techniques.
            Last edited by im.thatoneguy; 26-08-2015, 10:47 AM.
            Gavin Greenwalt
            im.thatoneguy[at]gmail.com || Gavin[at]SFStudios.com
            Straightface Studios

            Comment


            • #7
              Well, with Affect All Channels, what happens to our data elements (depth, etc.)?

              I can see the problems with DOF too, but as I'm only advocating the option to have the surface behave as opacity, if DOF is a problem you can stick with the default setup where the rays are opaque. But then, this is surely a moot point as it is: your DOF won't be correct anyway, as the depth will be set by the front face of the transparent object in your deep render, not by the depth of the distant objects.
              Patrick Macdonald
              Lighting TD : http://reformstudios.com Developer of "Mission Control", the spreadsheet editor for 3ds Max http://reformstudios.com/mission-control-for-3ds-max/



              Comment
