
Scanning materials with spectrophotometer and colorimeter


  • Scanning materials with spectrophotometer and colorimeter

    Hello,

    Does anyone have any experience of scanning material properties from the real world? We have a client who can provide us with samples of the materials we are replicating in V-Ray. There seem to be a number of scanners on the market (spectrophotometers and colorimeters) for measuring diffuse colour, reflectance and glossiness, and also, to some degree, the angle of falloff.

    But I guess the problem would be getting the most out of those readings in V-Ray. For example, this tool can give you various readings of the glossiness, but the values are in other units, like GU (gloss units) and HU (haze units). It also seems to measure falloff at three angles: 20, 60 and 85 degrees.

    http://www.elcometer.com/en/appearan...ossmeters.html

    I would think these values correspond to reflection glossiness, GGX tail falloff and the falloff curve at 20, 60 & 85°.
    I wonder if you can get a good falloff curve from just those three values?
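As a rough sketch of the idea (not anything V-Ray or the glossmeter vendors provide), you could fit a curve through the three angle readings and interpolate between them. The sample GU values below are made up for illustration, and mapping the 60° reading onto a 0-1 glossiness slider is only a crude heuristic:

```python
import numpy as np

# Gloss-unit readings from a three-angle glossmeter (hypothetical sample values).
angles_deg = np.array([20.0, 60.0, 85.0])
gloss_units = np.array([12.0, 55.0, 90.0])

# Fit a quadratic through the three points; with exactly three samples
# the parabola passes through them exactly.
coeffs = np.polyfit(angles_deg, gloss_units, deg=2)

def gloss_at(angle_deg: float) -> float:
    """Interpolated gloss (GU) at an arbitrary viewing angle."""
    return float(np.polyval(coeffs, angle_deg))

# A crude normalisation of the 60-degree reading into a 0-1 slider value;
# the 0-100 GU assumption only holds for non-metallic surfaces.
vray_glossiness_guess = min(gloss_at(60.0) / 100.0, 1.0)
```

Whether three samples are enough to pin down a real BRDF tail is exactly the open question in the post; a quadratic is just the simplest curve that passes through all three points.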

    My dream situation would be to scan as many properties of the V-Ray basic material as possible.

    Any thoughts or suggestions would be greatly appreciated.

  • #2
    As far as I know, Chaos Group are currently working on their own material scanner. They have not released much info on the topic, and I'm not sure if they have any method or support for other scanners at the moment, but here are a few links you can check out for yourself.

    http://docs.chaosgroup.com/display/V...VRayScannedMtl

    http://forums.chaosgroup.com/showthr...terial+scanner

    Hope it helps in some way.

    Cheers.

    Comment


    • #3
      Thanks Chad.

      Yeah - I saw that stuff. Looks really interesting.

      What we are hoping to do is a level below that. We want to pump some "real world" values into the V-Ray standard material. I think we will have to make changes to the materials at various points in the process, so a black-box shader won't be much help to us.

      It seems like scanning colours is very possible. But it's when the complexities of how all the effects tie into each other come up that things start to get complicated (is that SSS or translucency?).

      Has anyone tried using a scanning device before? Calibrated photography could also be a solution (known light intensity, known camera values, measured lights, etc...).

      Would be great to hear people's experiences.

      Will

      Comment


      • #4
        What's the price range for these spectrophotometers? Wondering if the Chaos Group one will fall in the same price range...
        I'd probably jump on something like this more than an expensive 3d scanner. Of course having both would be sweet

        Sorry, I don't know anything about using scanned materials in V-Ray yet...
        Brendan Coyle | www.brendancoyle.com

        Comment


        • #5
          Seems like the glossmeters for measuring reflectance are around £400, so not too bad if you do a lot of product work! For a spectrophotometer, the cheapest option you can probably get is one of the ColorMunki monitor calibrators that come with a spectrophotometer built in. I've got one here; I'll let you know how accurate it ends up being.

          Most other people do things like popping a Macbeth colour chart beside the thing they're using as reference, using the chart to fully neutralise the image, and then you have a pretty good measure of what your colour is. Some will go further and shoot with a circular polariser, which cuts out polarised light, so it removes the spec of an object when turned one way and increases it when turned another. This video is quite handy, and he points out that the major advantage of the chart is just getting the brightness of his textures correct - https://vimeo.com/107961975
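The chart-neutralisation step described above can be sketched in a few lines. This assumes a linear-RGB image and uses made-up patch values; the 0.198 reference is roughly the published reflectance of the chart's mid-grey patch, but check your chart's own data sheet:

```python
import numpy as np

# Linear RGB of the chart's neutral grey patch as photographed
# (hypothetical values) and its published reference reflectance
# (the mid-grey patch is roughly 19-20%).
shot_grey = np.array([0.26, 0.21, 0.17])
reference_grey = np.array([0.198, 0.198, 0.198])

# Per-channel gains that neutralise the shot; applying them to the
# whole image removes the light's colour cast and fixes brightness.
gains = reference_grey / shot_grey

def neutralise(image_linear: np.ndarray) -> np.ndarray:
    """Apply the chart-derived gains to a linear (H, W, 3) image."""
    return image_linear * gains

balanced_patch = neutralise(shot_grey.reshape(1, 1, 3))
```

A single grey patch only gives you white balance and exposure; using several neutral patches and a least-squares fit would also correct for contrast.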

          Comment


          • #6
            But there seem to be a large number of handheld scanners that are "affordable" (as long as I am not the one paying).

            And I am sure that loads of industries are doing this stuff all the time. For example, an automotive paint shop can match old weathered paint on a car for a new bit of bodywork.

            The question is whether it's possible to take the scanned values and use them in a meaningful way. V-Ray is physically plausible, and what the scanners measure is also a model of the interaction of light with a surface. But is there a way to translate these values into something V-Ray-friendly?

            joconnell - Yeah the calibrated photo and lighting is definitely a strong path to head down.

            I like the idea of shooting polarised lights with a polarising filter on the lens to get a diffuse pass from a photo. I have also seen people subtracting that from the non-polarised photo to get a spec pass. I also think photography can do a good job on refraction (just photographing a sample on a defined grid can give really good reference for the IOR).
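The subtraction trick is simple enough to sketch. This is only an idealised illustration: it assumes both shots are linear, identically exposed and perfectly aligned, with the cross-polarised shot treated as diffuse-only:

```python
import numpy as np

def split_diffuse_spec(cross_polarised: np.ndarray,
                       unpolarised: np.ndarray):
    """Separate diffuse and specular contributions from a polarised pair.

    The cross-polarised shot is approximately diffuse-only; subtracting
    it from the unpolarised shot leaves the specular contribution.
    Both images must be linear, aligned and identically exposed.
    """
    diffuse = cross_polarised
    # Clip at zero so sensor noise can't produce negative spec values.
    spec = np.clip(unpolarised - cross_polarised, 0.0, None)
    return diffuse, spec
```

In practice alignment errors and residual polarisation in the "diffuse" shot are the main sources of error, which is why people usually shoot on a tripod with a locked-off subject.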

            The other side of it of course is that an expensive scanner gives a client a lot of confidence!

            Comment


            • #7
              Digging this up to see if anyone has any new thoughts on matching materials - whether it's scanning, photographing, etc. I see lots of materials come to my desk for matching in renderings. Would it be possible to get a close match through photographing the samples - and building a virtual HDRI of my desk space / lighting, to translate what I see at my desk, vs what's on screen? Is there a virtual Macbeth chart to go along with the physical one you photograph?
              Brendan Coyle | www.brendancoyle.com

              Comment


              • #8
                Yes - that's what Kojima Studios did for Metal Gear Solid 5. They had a boardroom inside their building that only had artificial light, so it was totally consistent throughout the day. They measured this room, took light readings and shot an HDR with a Macbeth chart so they could linearise it. They recreated the room in 3D from this data and were then able to use it as a lighting setup - any time they had a new material they could just bring it into the boardroom and shoot it, knowing that nothing else in that room had changed and the reference would be useful.

                I've done a Macbeth match using a JPEG from the web whose values match what they're supposed to be, a photo of a Macbeth chart with an iPhone beside it running a lux meter app to know how much light was hitting the chart, and a camera that records EXIF data in the shot so I know what ISO/aperture/shutter it was taken with. Everything matched up very closely just by inputting the data; the only thing a lux meter doesn't get is white balance or the hue of the light. The new Cinemeter app for the iPhone does pick up colour temperature, although I haven't tested it yet to see if it's reliable.
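The consistency check implied here can be written out with the standard exposure-value formulas. The ISO/aperture/shutter from the EXIF give one EV number, and the lux reading gives another via the usual incident-meter calibration constant (C ≈ 250); if they agree, the shot was exposed to meter. The example values below are arbitrary:

```python
import math

def exposure_value(aperture_n: float, shutter_s: float, iso: float) -> float:
    """EV computed from camera settings, normalised to ISO 100."""
    return math.log2(aperture_n ** 2 / shutter_s) - math.log2(iso / 100.0)

def ev_from_lux(lux: float, calibration_c: float = 250.0) -> float:
    """EV (at ISO 100) implied by an incident-light reading in lux.

    C ~= 250 is the common incident-meter calibration constant, so
    E = C * 2**EV / ISO, inverted here for ISO 100.
    """
    return math.log2(lux * 100.0 / calibration_c)

# The difference between the two tells you how far from a "metered"
# exposure the reference shot was, e.g.:
offset_stops = exposure_value(8.0, 1 / 125, 100.0) - ev_from_lux(2500.0)
```

The same two functions work in the other direction: given a lux reading, they tell you what physical-camera settings should reproduce the reference exposure.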

                Edit:

                Further to that, one thing I want to do, based on what Blizzard does and what I've seen in an fxphd course (MYA214 - lighting; well worth it even though he's using mental ray, since it's all the other technique outside of 3D that's good), is to have a tray of material samples - a plastic, a metal, a gold, a wood, a rubber and so on. If it's small and portable, you could bring it along with your Macbeth chart, and when you shoot your HDRI, take a reference shot of your collection of materials in the same light beside your Macbeth chart; then, when you get back to 3D land, try to recreate them. The nice thing is that if you use the same material collection each time you shoot an HDR, you can see whether your 3D materials are in the same ballpark in each new lighting scenario as the one you've photographed.
                Last edited by joconnell; 29-01-2016, 03:41 AM.

                Comment


                • #9
                  The biggest problem is the need for a consistent environment; as soon as a small bit is different, you have to start from scratch. Having a special room like Kojima Studios is a big luxury and most of the time not possible. I am a big fan of getting all the reference data you can, and a scanned material would be grand, but for me a good reference photo with a Macbeth chart plus an HDR with a Macbeth chart does the trick most of the time.
                  Still thinking of creating a "home setup" for polarisation; understanding the reflection and getting the right diffuse colour is probably the key. I mean, it's nice to be physically accurate, but from time to time it's also kind of limiting.

                  The Blizzard material collection, on the other hand, is a brilliant idea. Maybe it's time to add a "material" ball to the chrome and grey balls. Having a ball with a quarter each of fabric, rubber, gold and wood would be a big help in a "non-ideal" situation, like on set.


                  Edit:
                  For the Macbeth values, have a look here for example:
                  http://www.rags-int-inc.com/PhotoTec...MacbethTarget/
                  There are many different value sets, but this one seems to be pretty close.
                  I tried it myself two years back, but I'm not sure about the result:
                  https://www.dropbox.com/s/xvg7zjrukm...th_DR.psd?dl=0
                  Last edited by CrustedInk; 29-01-2016, 05:17 AM.

                  Comment


                  • #10
                    Here's my test of light matching anyway:

                    [Attached image: color_checker_vs_vray_light_meter.jpg]

                    I shot the Macbeth chart with the iPhone app giving me a lux reading. I knew the distance from the table the chart was sitting on to the bulb shining on it, so I could model a Macbeth-chart-sized box, put the Macbeth chart texture on it and then place a V-Ray light at the same distance above the scene. If I adjust my V-Ray light's intensity until the V-Ray light helper gives me the same lux value, I can make a V-Ray physical camera with the same ISO/aperture/shutter as my reference image, and I get the above image as my raw render versus the reference.
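Instead of dialling the light's intensity by eye against the helper, you can get a starting value from the inverse-square law. This sketch assumes an idealised omnidirectional point light (a real bulb and a V-Ray plane light will differ), with made-up measurements:

```python
import math

# Measured at the chart (hypothetical numbers): the lux reading from
# the phone app and the bulb-to-chart distance.
measured_lux = 320.0
distance_m = 1.2

# An ideal omnidirectional point source spreads its luminous flux over
# a sphere, so illuminance = lumens / (4 * pi * d^2). Invert that to
# get the flux to type into the light's intensity field (lumen mode).
required_lumens = measured_lux * 4.0 * math.pi * distance_m ** 2
```

For a directional or area light the solid angle is smaller, so the real value will be lower; the light-meter helper described in the post is still the ground truth for the final match.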

                    Here's the scene file, the Macbeth chart texture, the render I got, and the reference photo showing the Macbeth chart and the iPhone app:

                    https://copy.com/X37meDm8EenAkAnm

                    Comment


                    • #11
                      These look promising. Although the light shifts around a bit in my office, I can probably create a few HDRIs at different times of day and then dial in their lighting intensity based on the current lux of the room when the material samples are at my desk.
                      Brendan Coyle | www.brendancoyle.com

                      Comment


                      • #12
                        John, curious to know: with the workflow you mentioned, were you able to match the V-Ray physical camera's aperture, shutter and ISO to the corresponding settings of the shooting camera exactly? What if you use a different camera to shoot the reference image? Even with the same aperture, shutter and ISO, I assume different cameras will give you different results?

                        Also, how do you handle the white balance to get the V-Ray physical camera to match the shooting camera?

                        Sorry for so many questions. It's interesting to see that you matched it so well without shooting an HDR right there...
                        always curious...

                        Comment


                        • #13
                          So I'm picking up a Macbeth chart and a little 16x16 photo tent with LEDs. Hoping I can use it as my control space when viewing/shooting material samples at my desk. I'll also shoot the tent's interior to try to create an HDRI for use in V-Ray, and tweak it to get matching light output between the physical and the virtual. This will hopefully replicate having a Ganzfeld-esque lighting space similar to what Lele talked about here: http://forums.chaosgroup.com/showthr...-testing-scene
                          Brendan Coyle | www.brendancoyle.com

                          Comment


                          • #14
                            Yeah, I have to have a chat with him about that. Ganzfeld lighting is all very well and good, but it won't give you a particularly useful test environment for things like speculars or reflections. I think his main use for it is possibly the diffuse portion of your setup, so that if you have a specific colour chosen in the diffuse, you get the same value back in your final render. That said, one thing I'm curious about: if you've got a pure red as your diffuse but the material also has reflections, energy conservation will knock down the amount of red, so it can't possibly return the same red value. Maybe it's more a case of making sure that the light hitting the surface of your material sample is a multiplier of 1 at that point, so a purely diffuse material would return the same diffuse colour, while reflective materials would just return a realistic amount of energy.

                            Good shout on your light tent though; something consistent and controllable is perfect. It'd be interesting to try to get LEDs whose colour can be controlled, so you could shoot normal maps too!

                            Comment


                            • #15
                              Actually, fuck it - does anyone want to get some kind of little group together for this? Taking loads of measured environments would be quite a bit of work for one person, but a lot less if a few people did one or two each. I'd happily shoot HDRIs with charts, take lux readings and put in some reference objects to try to get some good recreations of environments.

                              I figure we'd need some different test environments to show off different features of materials - for example, a shader with very soft speculars/reflections wouldn't really read under Ganzfeld lighting or even an overcast day. It'd be good to have a few interiors with different light sizes and intensities that'd give the shader more input to shine in.

                              Comment
