
Using Readings From Gloss Meters

  • Using Readings From Gloss Meters

    I'm trying to find a way to use the data recorded from a gloss meter or spectrometer when creating materials. Our client often provides us with this data and while we can eye-ball it, it'd be helpful to have some sort of correlation.

    For example, a soft-touch/rubberized surface has a gloss reading of about 5 GU (gloss units), while a satin-finish plastic or bead-blasted anodized aluminum reads about 17 GU. By eye we find the soft-touch to be about 0.6 reflective glossiness and the satin plastic closer to 0.75, but with the difference in conditions between holding a small sample chip and our shader ball, that's just a baseline guess.

    I've looked all over for information on this and have come up empty.

    Thanks
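
    For what it's worth, the correlation asked about here could be prototyped as a small calibration table: a few (GU, reflective glossiness) pairs pinned down by eye, with linear interpolation between them. A minimal Python sketch; the two lowest anchors come from the numbers above, while the high-gloss anchors are purely illustrative guesses, not measured data:

```python
# Hypothetical sketch: empirical GU -> reflective-glossiness mapping built
# from a few eyeballed calibration pairs, with linear interpolation between
# them. The two upper anchors are illustrative guesses, not measurements.
import bisect

# (gloss units at 60 degrees, eyeballed V-Ray reflective glossiness)
CALIBRATION = [
    (5.0, 0.60),   # soft-touch / rubberized
    (17.0, 0.75),  # satin plastic / bead-blasted anodized aluminum
    (70.0, 0.90),  # glossy painted plastic (assumed)
    (95.0, 0.98),  # polished / near-mirror (assumed)
]

def gu_to_glossiness(gu: float) -> float:
    """Linearly interpolate a reflective glossiness value from a GU reading."""
    xs = [g for g, _ in CALIBRATION]
    ys = [v for _, v in CALIBRATION]
    if gu <= xs[0]:
        return ys[0]               # clamp below the lowest anchor
    if gu >= xs[-1]:
        return ys[-1]              # clamp above the highest anchor
    i = bisect.bisect_right(xs, gu)
    t = (gu - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + t * (ys[i] - ys[i - 1])
```

    New readings would then be looked up with gu_to_glossiness(reading), and the table refined as more samples get matched by eye or by scan.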

  • #2
    Did you look into VRscans?
    Lele
    Trouble Stirrer in RnD @ Chaos
    ----------------------
    emanuele.lecchi@chaos.com

    Disclaimer:
    The views and opinions expressed here are my own and do not represent those of Chaos Group, unless otherwise stated.



    • #3
      We have, and that's an avenue we're exploring, but there are limits to what we can have scanned. I'd personally like us to get some scans to use as baselines that we can see in our shader-ball scene, a controlled scenario we can reverse engineer for creating templates.

      I'm surprised there's almost no discussion of gloss meter data being used to create materials. Every material sample we receive from the client comes with this data, and gloss meters are pretty accessible these days.



      • #4
        It's a very simple descriptor for a material's specular lobe: a single number can't possibly represent a material's spatially-varying (i.e. textured) reflectance.
        As you say, there's eyeballing involved either way, so it's more work than simply eyeballing the gloss amount.

        You could get a type of VRScan resembling your material, and then proceed to texture ("Paint") it.
        Or, sure, reverse engineer a rough version of it.



        • #5
          We're quite interested in this at work, but likely using a different method: some kind of measured light and a camera on an arm that can move from 0 to 90 degrees and be automated. Matt, it seems like most gloss meters record at three angles: 20, 60 and 85 degrees. It doesn't seem like this would give us a great data set for understanding how a highlight tapers off, especially for "hazy" materials as described on some of the manufacturers' sites.

          We've made a few objects painted with various finishes and had VRScans made of them, as a way of having a more accurate version of the on-set silver ball for VFX. So I wonder: would it be worth your while getting a range of sample objects from fully glossy to zero glossiness, maybe in ten percent increments according to your gloss meter, getting VRScans made of those, and then recreating those materials using the VRScan as a target? Then, at least for any object you have a GU reading for, you'd have a good starting point.



          • #6
            Originally posted by joconnell View Post
            Commonly, those angles aren't all used on every material; each covers a specific gloss range. 60 degrees is the most common, used on semi-gloss (~10-70 GU); 85 degrees is used on more matte surfaces (<10 GU) and 20 degrees on high gloss (>70 GU). This information is usually part of a larger data set derived from a spectrometer or similar equipment. You can read more here:
            https://www.rhopointinstruments.com/...measure-gloss/
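
            The angle-per-range rule above is simple to encode. A minimal sketch, with the thresholds assumed to be chosen from the 60-degree reading as described:

```python
# Sketch of the angle-selection rule described above. Thresholds are
# assumptions taken from the ranges in this post, not from a standard text.
def measurement_angle(gu_at_60: float) -> int:
    """Pick the standard gloss-meter angle for a given 60-degree GU reading."""
    if gu_at_60 > 70:
        return 20   # high gloss: 20-degree geometry
    if gu_at_60 < 10:
        return 85   # matte: 85-degree geometry
    return 60       # semi-gloss: the common 60-degree geometry
```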



            • #7
              Originally posted by ^Lele^ View Post
              It's a very simple descriptor for a material's specular lobe: a single number can't possibly represent a material's spatially-varying (i.e. textured) reflectance.
              I'm not looking to re-invent the wheel with the information, just for a possible correlation between GU and the Reflective Glossiness parameter.

              I understand other things will come into play as well (IOR, GGX tail falloff, etc.), but we're doing consumer electronics visualization; the materials aren't that complicated, generally simple textured plastic or bead-blasted aluminum, no odd coatings, so there's no need for complex reflection scenarios.



              • #8
                Originally posted by MattFink View Post

                Commonly, those angles aren't all used on every material; each covers a specific gloss range. 60 degrees is the most common, used on semi-gloss (~10-70 GU); 85 degrees is used on more matte surfaces (<10 GU) and 20 degrees on high gloss (>70 GU). This information is usually part of a larger data set derived from a spectrometer or similar equipment. You can read more here:
                https://www.rhopointinstruments.com/...measure-gloss/
                I've got a spectrophotometer myself for surface matching, which gives me back a total RGB / reflectivity value, so I was very interested in what a gloss meter could add to my process. From reading up a bit, it seems the gloss meter gives you back a single reflection percentage: rather than telling you the focus / spread of a highlight, it merely tells you how bright it is as a percentage of the light hitting it. This reading would give us a highlight intensity value but no info about the falloff or shape of the highlight. I think IKEA had a process where their studio takes material samples and uses a measured bulb and a camera mounted on an arm to take photos at four points along a 90-degree arc, and they use this to recreate a similar spread of highlight.

                More reading and thinking to be done!
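
                The arc-photo idea could be prototyped with a simple lobe fit: assuming a Phong-like highlight model I(theta) = I0 * cos(theta)^n around the mirror direction, taking logarithms turns a handful of intensity samples into a linear least-squares problem for the exponent. A hypothetical sketch; the model choice and the sample angles are assumptions, not the actual IKEA setup:

```python
# Hypothetical sketch: recover a highlight-spread exponent from a few
# intensity samples along an arc, assuming I(theta) = i0 * cos(theta)**n.
# Taking logs gives log I = log i0 + n * log cos(theta), a straight line.
import math

def fit_lobe(samples):
    """Fit I(theta) = i0 * cos(theta)**n to (angle_deg, intensity) samples.

    Angles are measured from the mirror direction and must be < 90 degrees
    so cos(theta) stays positive. Returns (i0, n).
    """
    xs = [math.log(math.cos(math.radians(a))) for a, _ in samples]
    ys = [math.log(i) for _, i in samples]
    mean_x = sum(xs) / len(xs)
    mean_y = sum(ys) / len(ys)
    # Ordinary least squares for the slope n in log-log space.
    n = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    i0 = math.exp(mean_y - n * mean_x)
    return i0, n
```

                A larger n means a tighter highlight; mapping n onto a renderer's glossiness parameter would still need per-renderer calibration.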



                • #9
                  Originally posted by MattFink View Post

                  I'm not looking to re-invent the wheel with the information, just for a possible correlation between GU and the Reflective Glossiness parameter.

                  I understand other things will come into play as well (IOR, GGX tail falloff, etc.), but we're doing consumer electronics visualization; the materials aren't that complicated, generally simple textured plastic or bead-blasted aluminum, no odd coatings, so there's no need for complex reflection scenarios.
                  Yeah, I was answering your question about why it isn't that extensively used.
                  Even getting a set of measurements would work if the material were homogeneous at the scale you're measuring the gloss amount at: if it had a visible, non-repeating texture, you'd quickly be out of luck LERPing.
                  Note that you could average (with VRScans at least) and symmetrize the material representation, which ought to behave nicely enough for you to principle.
                  F.e. here



                  • #10
                    Originally posted by ^Lele^ View Post

                    Patterns are not typical in our application. I was hoping some correlation could be drawn between Reflective Glossiness in Maya's VRayMtl and the Gloss Units the manufacturer provides, but unfortunately it seems that's not the case. I appreciate the responses. I'm pushing to have some samples scanned.



                    • #11
                      Originally posted by MattFink View Post

                      I hear you, and I should thank you for assuming intellectual honesty on my part.
                      VRScans are the culmination of many years of research on this very topic, and it's no accident I keep referring to them.
                      As you well pointed out, even "simple" principling can do little with just a gloss reading, as IOR and, most importantly, the BRDF shape will ultimately determine the look.
                      As for homogeneity, or isotropy, as opposed to point-varying properties across the domain, I guess it's just a matter of scale.
                      VRScans get down to 31 nm, so everything has a pattern at that resolution.
                      Sure enough, they converge to something more uniform, resembling a principled BRDF, when observed from farther away.

                      Besides custom scanning, is there nothing at all which you could use in the library of scans?



                      • #12
                        Originally posted by joconnell View Post

                        I've got a spectrophotometer myself for surface matching which gives me back a total rgb / reflectivity value
                        Not to derail the discussion, but can you tell me more about what equipment you have for spectrophotometry and how you use it? I've been really curious about using something like that lately, but I'm having trouble determining exactly what kind of equipment makes sense, and how I would process the data coming out of it.
                        __
                        https://surfaceimperfections.com/



                        • #13
                          I've got two: a cube sensor and a monitor calibrator that has a spectrophotometer in it. You pretty much put it on the thing you want to sample, press a button, and it'll give you an RGB value for whatever the sensor is on. It's not as smart as being able to split it into diffuse and spec or anything nice like that, but you'll get an accurate overall brightness and hue for your material. I bought both of these before I changed studios, and the far, far smarter people there decided on the Nix sensor, as it'll also give you ACEScg (i.e. linear) values. Grab one here; I've seen deals on them from time to time too: https://nixmini.com/product/nix-mini-color-sensor/

                          The other thing you can get is a cheap lux meter, which gives you a light value at whatever specific place in your environment you have it pointing out from. Handy for placing a certain distance from a light bulb (really important to note the distance, since falloff matters!) and recording exactly how much light is coming out for when you want to recreate it later. With those two and EXIF data from a camera, you'll be in very good shape to recreate environments, once you strip off the camera curve from your raw files, as it skews your colour values from being linear to being a nice picture.
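
                          On "stripping off the camera curve": if the values come out encoded with the standard sRGB transfer function, inverting it recovers linear light. A minimal sketch; real raw pipelines vary per camera, so treat the choice of transfer function as an assumption:

```python
# Minimal sketch of linearization: invert the sRGB transfer function.
# Assumes the source values were sRGB-encoded in [0, 1]; real camera
# response curves vary, so this is illustrative only.
def srgb_to_linear(c: float) -> float:
    """Map one sRGB-encoded channel value in [0, 1] to linear light."""
    if c <= 0.04045:
        return c / 12.92              # linear toe segment near black
    return ((c + 0.055) / 1.055) ** 2.4
```

                          Applied per channel; an sRGB mid-grey of 0.5 maps to roughly 0.214 in linear light.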



                          • #14
                            Originally posted by joconnell View Post
                            Ahahah, I giggled, laughed, and finally LOLed at the neverending list of jumps, skips and hops between colour spaces and measurement setups...
                            We need portable VRScanners (which will of course do the same jumping as above, but for you). ^^



                            • #15
                              It'd be very interesting to have something that takes a small, non-tiling patch and can capture quickly as an on-set tool. Of course the internal battle rages on about what our renderer is, but something like this would be a big plus point, especially if it's fast on set...

