
Ideal lighting conditions for a material testing scene.


  • #16
    Thanks Tim. I appreciate you taking the time to explain the concept behind it.
    Always curious...



    • #17
      Definitely nicking that idea.



      • #18
        I'm working on just the same thing, xD. The tricky parts for me are making sure the HDRI is built from purely linear images, and where to buy a mid-grey ball for reference. I'm testing two options: one using a profile generated with the X-Rite ColorChecker software and Lightroom with all post effects disabled, and I also want to try this: http://mikeboers.com/blog/2013/11/07...aw-conversions

        The other problem I have is dynamic range. I'm using a 5D Mk II with a Sigma 8mm fisheye, so I can't use ND filters. Sometimes f/22, ISO 100, 1/8000 is not enough to capture the whole dynamic range (well, it's enough, but not really accurate, and I usually want more).

        I will run some final tests once I get some kind of reference ball or material, and share my thoughts here.
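
        As an aside, a rough sketch of the bracketing arithmetic in Python; the 2 EV spacing and the seven-shot bracket below are illustrative assumptions, not the actual capture settings:

        Code:
import math

def exposure_value(aperture, shutter_seconds, iso):
    # EV at the given ISO: a higher EV means a darker exposure (less light recorded).
    return math.log2(aperture ** 2 / shutter_seconds) - math.log2(iso / 100.0)

# Darkest bracket mentioned above: f/22, ISO 100, 1/8000 s.
darkest = exposure_value(22.0, 1.0 / 8000.0, 100)

# Assumed bracket set: seven shots spaced 2 EV apart, ending at that darkest frame.
spacing_ev, shots = 2.0, 7
brightest = darkest - spacing_ev * (shots - 1)

print(f"darkest frame:   EV {darkest:.1f}")
print(f"brightest frame: EV {brightest:.1f}")
print(f"bracketed span:  ~{spacing_ev * (shots - 1):.0f} stops on top of the sensor's own range")
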
        My Spanish tutorial channel: https://www.youtube.com/adanmq
        Instagram: https://www.instagram.com/3dcollective/



        • #19
          Originally posted by adanmq View Post
          The tricky parts for me are making sure the HDRI is built from purely linear images, and where to buy a mid-grey ball for reference. [...]
          Do you need the photos in a linear colorspace to achieve a correct HDRI? Or do you just make sure you're using the right colorspace type in the VrayHDRI, and V-Ray makes the corrections for you?
          Brendan Coyle | www.brendancoyle.com



          • #20
            The key to a correct HDRI is purely shooting it with a gray reference, and ideally another two known points (a second gray reference for the visible part, and a known-intensity light source for the HDR part), and then working with that data until the dome lights a gray patch, under LWF conditions, to exactly the same intensity as the shot reference.
            However, this type of approach introduces arbitrary (if perfectly captured) lighting into the look-dev phase, with potentially disastrous results (the case made by Seraph is one).
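
            A minimal sketch of that matching step, assuming you can read the average linear value of the gray patch from the reference photo and from a test render lit only by the dome (the numbers and names are illustrative, not a prescribed V-Ray workflow):

            Code:
# Scale the HDRI dome so a rendered gray patch matches the shot reference.
# Both values must be linear (no gamma or LUT baked in).
measured_gray = 0.184    # average linear value of the gray ball in the reference photo
rendered_gray = 0.231    # average linear value of the same patch in a test render

dome_multiplier = measured_gray / rendered_gray
print(f"multiply the dome/HDRI intensity by {dome_multiplier:.3f}")

# Re-render and repeat until the two values agree within a small tolerance.
tolerance = 0.01
if abs(rendered_gray * dome_multiplier - measured_gray) <= tolerance:
    print("dome intensity now matches the shot reference")
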

            Personally, I prepare my shaders ENTIRELY under GanzFeld lighting (spherical IBL with a 1.0f white color, in modern CG lingo), and my references are always shot with a ColorChecker by the side, so the lighting gets neutralised.
            Only after my albedo coefficients (and, if I had polarised refs, the different components of the shader) have been matched to the reference do I move the shader onto the model, into a light rig or ten.
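
            A minimal sketch of that neutralisation, assuming the reference photo is linear and the ColorChecker's gray patch averages to a known target value (the 0.18 target and the patch reading below are assumptions, not measured data):

            Code:
import numpy as np

def neutralise(reference_linear, gray_patch_rgb, target_gray=0.18):
    # Per-channel gain that maps the shot gray patch onto the target value,
    # dividing the lighting's colour cast and intensity out of the reference photo.
    gain = target_gray / np.asarray(gray_patch_rgb, dtype=np.float64)
    return np.asarray(reference_linear, dtype=np.float64) * gain

# Example with a made-up, warm-lit gray patch reading:
photo = np.full((2, 2, 3), 0.5)
patch = (0.24, 0.20, 0.16)
print(neutralise(photo, patch)[0, 0])    # the warm cast is divided back out
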

            This has two benefits, for me:
            *) Lighting is neutral: the colors I pick in my VFB ARE the shader colors.
            *) Any value on the test sphere exceeding 1.0f is a sure sign of energy creation by the shader, while any suspect darkening at the edges shows energy losses (a quick sketch of this check follows below). Both need to be corrected, and ONLY under GanzFeld lighting can this be done both accurately and very directly (with a custom light rig, one would have to divide by the light's intensity and colour, and so on...).
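
            A rough sketch of that energy check, assuming the test-sphere render has already been loaded as a linear float array (reading the EXR is left to your pipeline; the names and threshold are illustrative):

            Code:
import numpy as np

def check_energy(render, background=0.0):
    # `render` is a linear (no gamma) HxWx3 float array; pixels equal to
    # `background` are treated as empty and ignored.
    mask = np.any(render != background, axis=-1)    # pixels covered by the test sphere
    values = render[mask]

    gained = np.count_nonzero(np.any(values > 1.0 + 1e-4, axis=-1))
    if gained:
        print(f"energy creation: {gained} pixels exceed 1.0")

    # Under a uniform 1.0 dome a well-behaved shader should not fall far below its
    # albedo at grazing angles; a sharp dark rim hints at energy loss.
    print(f"darkest sphere pixel: {values.min():.4f}, brightest: {values.max():.4f}")

# Example with synthetic data standing in for a real render element:
fake = np.full((4, 4, 3), 0.8, dtype=np.float32)
fake[0, 0] = 1.2    # one over-unity pixel
check_energy(fake)
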

            This method has been widely tested in VFX and ArchViz, with predictable, consistent, and visually appealing results.
            In some instances, the model and shaders were handed to a different location, or an entirely different studio, and even converted and rendered with another engine, and the results remained correct.

            I used to keep these as my very own trade secrets, but apparently I'm not working in production for a while, so here, I've spilt me beans.
            Lele
            Trouble Stirrer in RnD @ Chaos
            ----------------------
            emanuele.lecchi@chaos.com

            Disclaimer:
            The views and opinions expressed here are my own and do not represent those of Chaos Group, unless otherwise stated.



            • #21
              Very interesting Lele!

              I'm not sure I get the whole process right; would it be possible to have a few screenshots of the process you describe?
              Mainly for this part: "GanzFeld lighting (Spherical IBL with 1.0F white color, in modern CG lingo)"

              Cheers
              Stan
              3LP Team



              • #22
                A screenshot of the UI, with the render under these lighting conditions (the dome settings are shown on the right; the rest is at default):
                [Image: GanzFeld_UI.jpg]
                While the render looks very uninteresting, that's the whole point of the exercise, as the render elements reveal (diffuse filter, reflection, and specular, in order):
                [Image: diffuseFilter.jpg]
                [Image: reflection.jpg]
                [Image: specular.jpg]

                Here is the render with an HDRI texture in the dome.
                [Image: IBL.jpg]
                Lele
                Trouble Stirrer in RnD @ Chaos
                ----------------------
                emanuele.lecchi@chaos.com

                Disclaimer:
                The views and opinions expressed here are my own and do not represent those of Chaos Group, unless otherwise stated.



                • #23
                  Looks great, but I'm not sure I understand how this process is any different from loading an HDRI in a VrayDome and doing the shaders as usual.
                  Am I missing something?

                  Stan
                  3LP Team



                  • #24
                    Originally posted by 3LP View Post
                    I'm not sure I get the whole process right [...] Mainly for this part: "GanzFeld lighting (Spherical IBL with 1.0F white color, in modern CG lingo)"
                    It's simple: Lele is simply channelling his extrasensory perception to accurately calibrate his shaders. Don't you do this too?!

                    https://en.wikipedia.org/?title=Ganzfeld_experiment

                    A ganzfeld experiment (from the German for “entire field”) is a technique used in parapsychology which claims to be able to test individuals for extrasensory perception (ESP). The ganzfeld experiments are among the most recent in parapsychology for testing telepathy.[1]

                    Patrick Macdonald
                    Lighting TD : http://reformstudios.com Developer of "Mission Control", the spreadsheet editor for 3ds Max http://reformstudios.com/mission-control-for-3ds-max/





                    • #25
                      Eheh, GanzFeld simply means "total field" in German.
                      So it's used in a number of fields, psychology for one.
                      The substantive, however, is "lighting".

                      The difference is very simple: your L term in the shading pipeline is always 1.0, making shader evaluation an exact science, something which is hard, if not impossible, to do with an arbitrary (if calibrated), and varying, HDRI dome.
                      Render elements do come to the rescue, but they won't return exactly the same results as this method does (for instance, the specular term in the example above contains both the specular amount and its glossiness, in one, across the whole object domain, and not restricted to lit areas).
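
                      A small illustration of why the 1.0 dome makes the evaluation exact, using a plain Lambertian BRDF rather than V-Ray's actual shading code: with incoming radiance fixed at 1.0 in every direction, the reflected value integrates to the albedo itself.

                      Code:
import math
import random

def lambertian_under_white_dome(albedo, samples=200_000):
    # Monte Carlo estimate of outgoing radiance for a Lambertian surface lit by a
    # uniform dome of radiance 1.0, using uniform hemisphere sampling (pdf = 1/2pi).
    total = 0.0
    for _ in range(samples):
        cos_theta = random.random()        # cos(theta) is uniform for uniform directions
        brdf = albedo / math.pi            # Lambertian BRDF
        total += brdf * 1.0 * cos_theta / (1.0 / (2.0 * math.pi))
    return total / samples

print(lambertian_under_white_dome(0.35))   # converges to ~0.35, i.e. the albedo itself

                      With an HDRI in the dome, L varies per direction, so recovering the shader's coefficients from a render means dividing that varying lighting back out first, which is exactly the bookkeeping the white dome avoids.
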

                      Here's where I got the initial idea from (Ch. 9.1.2):
                      https://books.google.it/books?id=OW0...ffects&f=false
                      Lele
                      Trouble Stirrer in RnD @ Chaos
                      ----------------------
                      emanuele.lecchi@chaos.com

                      Disclaimer:
                      The views and opinions expressed here are my own and do not represent those of Chaos Group, unless otherwise stated.



                      • #26
                        Originally posted by 3LP View Post
                        Looks great, but I'm not sure I understand how this process is any different from loading an HDRI in a VrayDome and doing the shaders as usual. Am I missing something?
                        Lele just uses a dome light in full spherical mode with a white diffuse as his starting test point, 'cos he's a purist. I think he's 1/8 Swiss? He authors his shaders in a white environment first and then moves them into HDRIs to see how they behave. The only bad point is you can't shoot reference materials in such light unless you've got a fancy ganzfeld room. You also have the worry that, if you do have a pure white room to shoot things in, you might be a figment of Stanley Kubrick's imagination.



                        • #27
                          Fair points, John.
                          However, some reference comes with precise color and albedo values, however those were measured in the first place.
                          Then figuring out whether the shader is outputting those, without the 1/8th Swiss in me, becomes difficult all right.
                          In other words, it mimics something impossible to shoot in real life (you'll always have something suspending your object, and shadowing it), but it provides exceedingly accurate shader measurements.
                          When those are clear, dropping an HDRI into the dome (especially a calibrated, well-known one) hardly ever presents surprises, tbh.
                          Lele
                          Trouble Stirrer in RnD @ Chaos
                          ----------------------
                          emanuele.lecchi@chaos.com

                          Disclaimer:
                          The views and opinions expressed here are my own and do not represent those of Chaos Group, unless otherwise stated.



                          • #28
                            Yep - I'd much rather have measured stuff as a starting point. I'm kind of excited for the Quixel Megascans database, if not for the actual textures then more as a reference database of albedo and specular values for a range of different surfaces. I've got a monitor calibrator here that also works as a spectrophotometer, so I can get fairly accurate RGB values for a surface. I suspect it doesn't work well on bumpy or curved surfaces though, and I'm not sure how to estimate the reflectivity that I have to remove from the data, but it's possibly better than an educated guess!
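
                            One rough way to handle that last step: treat the reading as diffuse plus a dielectric specular term and subtract an assumed normal-incidence reflectance of about 0.04. The values and the single-term assumption below are illustrative, not a calibrated procedure:

                            Code:
def estimate_diffuse_albedo(measured_reflectance, specular_f0=0.04):
    # Per-channel diffuse albedo estimate from a total-reflectance measurement,
    # assuming a dielectric surface whose specular bounce was captured by the probe.
    return tuple(max(0.0, channel - specular_f0) for channel in measured_reflectance)

# Example: a spectrophotometer reading converted to linear RGB reflectance (made up).
measured = (0.42, 0.31, 0.18)
print(estimate_diffuse_albedo(measured))    # roughly (0.38, 0.27, 0.14)
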



                            • #29
                              Right, so it's basically going back to what we did 8 years ago: making a shader based on a white dome (OK, it's spherical and not a half dome) instead of an HDRI.
                              This will prevent you from pushing reflection or whatever setting too far, as an HDRI could mislead you.
                              Is this a revolution, or am I really missing something here?

                              Stan
                              3LP Team



                              • #30
                                I'm sorry, this is probably a dumb question. But why does Lele's screenshot show a texture in the Vray Light if the procedure supposedly uses only a solid white light?

                                Is it that he toggles the texture on and off?

