Balancing HDRI Intensity

  • Balancing HDRI Intensity

    Hi all! Looking for some help or direction regarding building HDRIs "properly".

    We've got some fully bracketed (7 exposure brackets) panos that we're looking to use in V-Ray without having to modify values on the VRayHDRI map or the Dome Light. The goal is an Overall/Render mult of 1 and a Dome Light intensity of 1. To do this, we're using PTGui to stitch the pano and Nuke to color balance and exposure adjust. We've also got reference photography, with grey/chrome balls and color charts, so theoretically we should be able to get a 1:1 match.

    Our issue so far has been that when we bring in the HDRI map, leave all the values at default, and set a VRayPhysicalCamera to the same parameters our reference photography was shot at, the result is significantly darker. Is there an assumed intensity for the Dome Light or for the input map so that it matches the camera exposure correctly?
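
    For concreteness, here's a rough Python sketch of the stop arithmetic involved. The camera settings are made-up placeholders, and it assumes the stitched pano ends up normalized to one bracket's exposure - an assumption on our part, not documented PTGui or V-Ray behaviour:

    Code:
    import math

    def ev100(f_number, shutter_s, iso):
        # Exposure value normalized to ISO 100: log2(N^2 / t) - log2(S / 100)
        return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100.0)

    # Hypothetical settings: the bracket the stitch is anchored to,
    # and the settings the reference photography was shot at.
    anchor_ev = ev100(f_number=8.0, shutter_s=1 / 60, iso=100)
    ref_ev    = ev100(f_number=8.0, shutter_s=1 / 250, iso=100)

    stops = ref_ev - anchor_ev
    print(f"reference is {stops:+.2f} stops from the stitch anchor")
    print(f"HDRI multiplier to try: {2 ** stops:.3f}")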

    Thanks for any help!


  • #2
    I am going to follow this one. Although I don't take my own HDRIs, the ones I buy give the camera settings, and I have yet to get them to work in V-Ray.
    Bobby Parker
    www.bobby-parker.com
    e-mail: info@bobby-parker.com
    phone: 2188206812

    My current hardware setup:
    • Ryzen 9 5900x CPU
    • 128gb Vengeance RGB Pro RAM
    • NVIDIA GeForce RTX 4090 X2
    • Windows 11 Pro



    • #3
      Here's what I do with my own HDRs (bear in mind this procedure is intended only for intensity):
      -I load my HDR into a dome light, leaving the intensity at its default value
      -I set my V-Ray Physical Camera to match my real camera (producing the HDR of course takes several brackets, but only one has the correct exposure, and I choose that one)
      -I load my reference 3D geometry into the software (I have built a small real-world reference system using several colored spheres, chrome, glass and wood)
      -Once rendered, the scene may look too bright or too dark compared to the real shot, so I take my HDR into PS (or any other tool capable of 32-bit editing) and raise or lower the exposure value until the render matches the real shot (it usually takes a couple of test renders or more to get it right; see the sketch below for a scripted equivalent)
      -Now the HDR intensity is calibrated: you can leave the default intensity and use it with real camera values. For interiors it will also work consistently together with other artificial light sources (also calibrated with real values in lumens)
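
      As a scripted equivalent of the Photoshop exposure step, a minimal numpy/OpenCV sketch (the file name and the stop value are placeholders, and it assumes an OpenCV build with Radiance .hdr I/O):

      Code:
      import cv2          # assumes a build with Radiance .hdr support
      import numpy as np

      STOPS = -0.75       # hypothetical correction found by eye in test renders

      hdr = cv2.imread("pano.hdr", cv2.IMREAD_UNCHANGED).astype(np.float32)
      hdr *= 2.0 ** STOPS                 # one stop = a factor of two in linear light
      cv2.imwrite("pano_calibrated.hdr", hdr)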

      Producing HDRs was a lot of fun and I was planning to sell them, but unfortunately I don't have the spare time anymore, so I've ended up always reusing the few I produced a while back.
      Here is an example: on the left the real reference, on the right the 3D counterpart:

      Website: http://www.3drenderandbeyond.com
      Shop: http://www.3dtutorialandbeyond.com/
      FB: https://www.facebook.com/3drenderandbeyond/



      • #4
        So!

        Photos are always relative measures, not absolute measures. The camera adjusts its exposure settings to take the scene in front of it and make its average fit at around middle grey. This means it doesn't matter how bright the original light was; the camera has boosted it up or cut it down when recording the image. It's exactly what our eye does - it opens up or closes down to suit the brightness level of the environment we're in. As Sirio has mentioned, we need a real world reference, a CG version of the same reference, and then we adjust the intensity of our hdr's exposure value until we get a visual match between our real world ref and our CG test render. What'd be ideal is if we went to a paint shop and got them to make us a can of spray paint that matches the middle grey Macbeth chart value and then painted a sphere to use as our grey ball. Even better would be to get Chaos Group to scan this colour and give us a Macbeth middle grey vrscan that we could use as a target.
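
        To put a number on that "visual match", one option is to compare the mean of the same grey-ball patch in the reference and in the test render. A sketch, not a pipeline tool - the file names are hypothetical and both crops are assumed to already be in linear space:

        Code:
        import cv2
        import numpy as np

        # Hypothetical crops of the same grey-ball area, both linear.
        ref    = cv2.imread("ref_ball_crop.hdr", cv2.IMREAD_UNCHANGED)
        render = cv2.imread("cg_ball_crop.hdr", cv2.IMREAD_UNCHANGED)

        scale = float(np.mean(ref)) / float(np.mean(render))
        print(f"multiply the hdr exposure by {scale:.3f} "
              f"({np.log2(scale):+.2f} stops) to match the reference")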

        @thwalker

        It's highly likely that you haven't captured the full range of the sky in your hdri shoot. 7 brackets is the standard we use too since it's fast, but it's not enough to capture the intensity of direct sun. If it's a bit cloudy you might be able to get away with a really low iso and f22 at 1/8000, but we've had to use an ND filter in front of the lens to darken it further. Typically what we'd also do is cut the sun out of the hdri and have two lights in our 3d scene - one dome for sky and a direct for sun - so at least it can be adjusted. A mirror ball is never 100% reflective (often around 50%), so you can use a heavily underexposed picture of it from your real world shoot to see if you've got your sun at the correct level (see the sketch below). What we'll also often do for the reference shoot is put some sort of flag / light blocker up between the sun and the reference spheres so they're shot with only the light from the sky - then we can calibrate the two separately.
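
        The mirror ball check is just back-of-the-envelope arithmetic; here it is as a sketch with made-up numbers (your actual ball's reflectivity is something you'd have to measure):

        Code:
        sun_pixel         = 0.83   # linear sun value read off the ball shot
        underexposed_ev   = 10     # stops this frame sits below the stitch anchor
        ball_reflectivity = 0.50   # mirror balls are far from 100% reflective

        # Undo the underexposure, then the ball's energy loss:
        expected_sun = sun_pixel * 2.0 ** underexposed_ev / ball_reflectivity
        print(f"the calibrated hdr's sun should peak around {expected_sun:,.0f}")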

        I've researched a lot of measuring / exposure tools over the years, like luxmeters (light values) and spectrophotometers (combined diffuse + spec values of objects, or their "brightness / colour" as we'd see it), and all of them have a bit of inaccuracy in them. If you're buying cheap equipment (a 20 dollar ebay luxmeter, for example) you might be 10 or 15% out in your measurements. Getting to maybe 5-8% inaccuracy is a really big jump forward in money, and getting below 5% is an even bigger jump. The middle grey exposure of your camera is pretty accurate and the Macbeth chart grey is also well known, so I reckon the cheapest way to get a good hdr is having a known target. It's really tough to capture the full range of the sun, so maybe get as much range as you can, cut the sun out of your hdr and then adjust it separately so that your cg render of your ref objects matches your ref shoot.
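
        For the "cut the sun out" part, a crude thresholding sketch - the threshold and file names are placeholders you'd tune per pano, and the hole-fill is deliberately simple:

        Code:
        import cv2
        import numpy as np

        hdr = cv2.imread("pano.hdr", cv2.IMREAD_UNCHANGED).astype(np.float32)

        # Treat anything above a hand-tuned threshold as the sun disc.
        sun = hdr.max(axis=2) > 1000.0

        sky = hdr.copy()
        sky[sun] = np.median(hdr[~sun], axis=0)   # patch the hole with a typical sky value
        cv2.imwrite("pano_sky_only.hdr", sky)     # drive the dome; add a separate direct for sun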



        • #5
          Here's a great write up - https://blog.hdrihaven.com/how-to-cr...-quality-hdri/



          • #6
            Thanks joconnell ! Our eventual goal is, as you said, to cut the sun from the HDRI and control it with a little rig. The problem at the moment has been evaluating what a "correct" value is for the series of HDRIs we've shot.

            Appreciate the overview and link. I'll dig through this, talk with some of our folks, and see if we can re-approach our situation.



            • #7
              Weta have a whole other level of shooting their lighting reference (spectral profiling of on-set lights), but every other really high end place still relies on a "close enough" approach with a bit of dialing to taste. What's going to make a big difference is having a good reference library for diffuse values - if they're in the right place then lights will behave realistically with them and your shaders will be consistent whether in light or dark. Pixar had a new approach on Toy Story 4 where they based everything on accurate diffuse values (just shoot textures with a grey card or middle Macbeth grey as your exposure target) and then also recorded correct lux values for various light sources. One of the lighting staff went to a peak in California on a clear day and recorded the lux value of the sun from morning til night to get a scale, and they also did this for various bulb types. Even if their luxmeter was a little inaccurate, since they made sure to shoot the entire collection with the same meter it would at least be "evenly" inaccurate. The key point was for them to know what factor each bulb was relative to another. The sun might be 200000 lux at midday and a standard desk bulb might be 2000, so they just made sure that their light multipliers made their sun 100 times brighter than the bulb, with all the other lights falling in between. This meant they could have an interior lit by bulbs with the sun outside and it'd all be in the correct ratio. Likewise all their materials would work regardless of day or night scenes.
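
              The ratio bookkeeping is trivial to script; a sketch using the lux figures from above (hypothetical readings, all assumed to come from the same imperfect meter, so only the ratios are trusted):

              Code:
              # Hypothetical readings, same (imperfect) meter for all of them:
              lux = {"sun_midday": 200_000.0, "ceiling_led": 5_000.0, "desk_bulb": 2_000.0}

              # Normalize so the dimmest source gets multiplier 1.0 and the
              # rest keep their measured ratio to it.
              floor = min(lux.values())
              multipliers = {name: value / floor for name, value in lux.items()}
              print(multipliers)   # sun_midday comes out 100x the desk bulb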
              Last edited by joconnell; 31-01-2020, 11:09 AM. Reason: Stuck in the word lumen (outgoing light) for lux (incoming light) by accident - thanks for the catch sirio!



              • #8
                Be careful not to mix up lux and lumens - they are two very different values.
                3D Scenes, Shaders and Courses for V-ray and Corona
                NEW V-Ray 5 Metal Shader Bundle (C4D/Max): https://www.3dtutorialandbeyond.com/...ders-cinema4d/
                www.3dtutorialandbeyond.com
                @3drenderandbeyond on social media @3DRnB Twitter



                • #9
                  Thanks for the insights joconnell . To hear Weta Digital talk about it, just going that last step and mapping everything spectrally made a huge difference, so I imagine going from literally just guessing (which is what I rely on now) to at least some degree of sanity-checking values and colors would be at least an equal improvement.

                  Do you know of any more resources for doing any of this in practice? Like going all the way from capture to a calibrated result in a render engine? The HDRI one you linked was very helpful.
                  __
                  https://surfaceimperfections.com/



                  • #10
                    Originally posted by sirio76
                    Be careful not to mix up lux and lumens - they are two very different values.
                    Edited to avoid confusion - thanks!

                    dgruwier there are two parts to it - one was profiling the sensor of the shoot camera, the second was the lights themselves. For the cameras, you can buy a spectral light, which gives you control over the wavelength of the light (it's a panel that emits one pure colour) and its intensity. You cycle through the visible spectrum, say every 10 or 20 nanometers (from 380 to 740), using the same intensity value, and record with the camera. If the sensor were recording everything perfectly, you'd get the same luminance in the recorded pictures across every part of the spectrum, but of course our sensors record in rgb and unevenly. If you find a part of the spectrum that's weaker, you can add in a bit of a colour correction to boost that up a tad and restore the lost values. Likewise we do the same thing with our lights, but using something like a spectrophotometer that'll measure the incoming light values and give us an accurate intensity and hue.
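
                    The correction step at the end boils down to a per-band gain; a toy numpy sketch (the recorded values are invented for illustration - a real sweep would come from the camera):

                    Code:
                    import numpy as np

                    # Toy sweep: one exposure per 10 nm step at constant panel
                    # intensity; these measured values are invented placeholders.
                    wavelengths = np.arange(380, 750, 10)
                    measured = np.full(wavelengths.size, 0.5)
                    measured[wavelengths > 650] = 0.35   # pretend the deep reds read low

                    # A perfect sensor would read the same value everywhere, so use
                    # the sweep's own peak as the target and derive a per-band gain.
                    gain = measured.max() / measured
                    for wl, g in zip(wavelengths, gain):
                        if g > 1.2:                      # bands that need a boost
                            print(f"{wl} nm: x{g:.2f}")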

                    From Weta's point of view, if they only used the shoot cameras as ref with standard hdrs, they'd be feeding data into their 3d scenes that was missing some bits of extra brightness, so things would be darker in some aspects. During the shoot, all those extra wavelengths were bouncing around and would have had an influence, even though the shoot camera dulled them down on recording. I'm mainly thinking of things like light hitting the back of hair or skin feeding into the sss or scatter calculations. Bear in mind that their renderer is spectral from the off too - I'm not sure if rgb renderers can use all of the extra data or whether it gets lost on conversion to rgb values internally.



                    • #11
                      Where can I read or see more about the Weta Digital approach?
                      https://www.behance.net/Oliver_Kossatz



                      • #12
                        Some stuff here, will see if I can find links in our work mail - https://www.fxguide.com/fxfeatured/p...-weta-digital/



                        • #13
                          joconnell I had no idea spectral lights like that existed! Are they just a bunch of separate calibrated bulbs?
                          That being said, I wasn't asking about spectral rendering (though I appreciate the more detailed breakdown - fascinating stuff!). I was really just asking about doing rough calibration using Macbeth charts and luxmeters, and I've had trouble finding authoritative guides on that subject.
                          __
                          https://surfaceimperfections.com/

