Thanks Tim. I appreciate you taking the time to explain the concept behind it.
Ideal lighting conditions for a material testing scene.
-
I'm just doing the same thing, xD. The tricky parts for me are making sure the HDRI is built from purely linear images, and where to buy a mid-grey ball for reference. I'm testing two options: one uses a profile generated with the X-Rite ColorChecker software and Lightroom with all post effects disabled, and I also want to try this: http://mikeboers.com/blog/2013/11/07...aw-conversions
The other problem I have is dynamic range: I'm using a 5D Mk II with a Sigma 8mm fisheye, so I can't use ND filters. Sometimes f/22, ISO 100, 1/8000 is not enough to capture the full dynamic range (well, it's enough, but not really accurate, and I usually want more).
I will run some final tests once I get some kind of reference ball or material and share my thoughts here.
My Spanish tutorial channel: https://www.youtube.com/adanmq
Instagram: https://www.instagram.com/3dcollective/
-
Originally posted by adanmq View Post (quoted above)
Brendan Coyle | www.brendancoyle.com
-
The key to a correct HDRI is shooting it with a gray reference, and ideally another two known points (a second gray reference for the visible part, and a known-intensity light source for the HDR part), then working around that data to produce a dome which lights a gray patch, under LWF conditions, to exactly the same intensity as the shot reference.
However, this type of approach introduces arbitrary (if perfectly captured) lighting into the look-dev phase, with potentially disastrous results (the case made by Seraph is one).
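That calibration step can be sketched numerically (my own illustration, not anything from the thread; the function and argument names are hypothetical): if the CG gray reference renders at a different intensity than the photographed one, the dome gets a per-channel multiplier so the two match.

```python
def dome_multiplier(shot_gray, rendered_gray):
    """Per-channel multiplier for the HDRI dome so that a gray reference
    lit by the dome matches the same reference in the photographed plate.
    Both values are linear (LWF) RGB tuples; names are illustrative only."""
    return tuple(s / r for s, r in zip(shot_gray, rendered_gray))

# e.g. the CG gray ball renders a touch too bright in red and green:
scale = dome_multiplier((0.18, 0.18, 0.18), (0.20, 0.19, 0.18))
```

Multiplying the dome texture by that tuple brings the rendered reference in line with the shot one.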
Personally, I prepare my shaders ENTIRELY under ganzfeld lighting (spherical IBL with a 1.0f white colour, in modern CG lingo), and my references are always shot with a ColorChecker by the side, so the lighting gets neutralised.
Only after my albedo coefficients (and, if I had the polarised refs, the different components of the shader) have been matched to the reference do I move the shader onto the model, into a light rig or ten.
This has two benefits, for me:
*) Lighting is neutral: the colours I pick in my VFB ARE the shader colours.
*) Any value on the test sphere exceeding 1.0f is a sure sign of energy creation by the shader, while any suspect darkening at the edges shows energy losses. Both need to be corrected, and ONLY under ganzfeld lighting can this be done both accurately and directly (with a custom light rig, one would have to divide by the light's intensity and colour, and so on...).
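That energy check can be illustrated with a minimal sketch (my own, not V-Ray code): under a uniform white environment, the outgoing value of a Lambertian shader must come out equal to its albedo, so any excess is energy creation and any shortfall is energy loss.

```python
import math
import random

def furnace_check(albedo, samples=200_000, seed=0):
    """Monte Carlo estimate of the outgoing value of a Lambertian shader
    under a uniform white (L = 1.0) spherical environment, i.e. the
    ganzfeld condition described above. The estimate should equal the
    albedo; anything above it means the shader creates energy."""
    rng = random.Random(seed)
    pdf = 1.0 / (2.0 * math.pi)          # uniform-hemisphere pdf
    total = 0.0
    for _ in range(samples):
        cos_theta = rng.random()         # uniform hemisphere: cos(theta) ~ U[0,1)
        brdf = albedo / math.pi          # Lambertian BRDF
        total += brdf * 1.0 * cos_theta / pdf   # incoming L_i = 1.0 everywhere
    return total / samples

print(round(furnace_check(0.8), 2))      # ~0.8: output matches the albedo
```

With an arbitrary light rig, the same check would require dividing the result by the rig's (spatially varying) intensity and colour first, which is exactly the complication the ganzfeld setup removes.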
This method has been widely tested in VFX and ArchViz, with predictable, consistent, and visually appealing results.
In some instances, the model and shaders were handed to a different location, or studio, entirely, and even converted and rendered with another engine, and the results remained correct.
I used to keep these as my very own trade secrets, but since I haven't been working in production for a while, here I spill the beans.
Lele
Trouble Stirrer in RnD @ Chaos
----------------------
emanuele.lecchi@chaos.com
Disclaimer:
The views and opinions expressed here are my own and do not represent those of Chaos Group, unless otherwise stated.
-
A screenshot of the UI, with the render under these lighting conditions (the dome settings are shown on the right; the rest is at default).
While the render looks very uninteresting, that's the whole point of the exercise, as the render elements reveal (diffuse filter, reflection, and specular, in order):
Here is the render with an HDRI texture in the dome.
Lele
-
Originally posted by 3LP View Post
Very interesting, Lele!
I'm not sure I get the whole process right; would it be possible to have a few screenshots of the process you describe?
Mainly for this part: "GanzFeld lighting (Spherical IBL with 1.0F white color, in modern CG lingo)"
Cheers
Stan
https://en.wikipedia.org/?title=Ganzfeld_experiment
A ganzfeld experiment (from the German for “entire field”) is a technique used in parapsychology which claims to be able to test individuals for extrasensory perception (ESP). The ganzfeld experiments are among the most recent in parapsychology for testing telepathy.[1]
Patrick Macdonald
Lighting TD : http://reformstudios.com Developer of "Mission Control", the spreadsheet editor for 3ds Max http://reformstudios.com/mission-control-for-3ds-max/
-
Eheh, ganzfeld simply means "total field" in German.
So it's used in a number of fields, psychology for one.
The operative noun, however, is "lighting".
The difference is very simple: your L term in the shading pipeline is always 1.0, making shader evaluation an exact science, something which is hard to impossible to do with an arbitrary (if calibrated), and varying, HDRI dome.
Render elements do come to our aid, but they won't return exactly the same results as this method does (for instance, the specular term in the example above contains both the specular amount and its glossiness in one, across the whole object domain, and not restricted to lit areas).
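A minimal sketch of why L = 1.0 makes the read-back trivial (illustrative names, not a V-Ray API): recovering the shader colour from a picked pixel means dividing out the dome's per-channel contribution, and under a ganzfeld dome that division is the identity.

```python
def shader_colour(picked_rgb, dome_rgb=(1.0, 1.0, 1.0)):
    """Recover the shader colour from a value picked in the VFB.
    Under a ganzfeld dome (dome_rgb all 1.0) this is the identity:
    the picked value IS the shader colour. Under a tinted or varying
    dome, each channel must be divided out first."""
    return tuple(p / d for p, d in zip(picked_rgb, dome_rgb))

# Ganzfeld dome: no correction needed.
print(shader_colour((0.5, 0.4, 0.3)))
# Warm-tinted dome: the same shader reads back only after division.
print(shader_colour((0.45, 0.4, 0.24), dome_rgb=(0.9, 1.0, 0.8)))
```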
Here's where I got the initial idea from (Ch. 9.1.2):
https://books.google.it/books?id=OW0...ffects&f=false
Lele
-
Originally posted by 3LP View Post
Looks great, but I'm not sure I understand how this process is any different from loading an HDRI into a VRayDome and doing the shaders as usual.
Am I missing something?
Stan
-
Fair points, John.
However, some references come with precise colour and albedo values, however those were measured in the first place.
Figuring out whether the shader is outputting those values, without the 1/8th Swiss in me getting in the way, becomes difficult all right.
In other words, this setup mimics something impossible to shoot in real life (you'll always have something suspending your object, and shadowing it), but it provides exceedingly accurate shader measurements.
When those are clear, dropping an HDRI into the dome (especially one that is calibrated and well known) hardly ever presents surprises, tbh.
Lele
-
Yep, I'd much rather have measured stuff as a starting point. I'm kind of excited for the Quixel Megascans database, if not for the actual textures then as a reference database of albedo and specular values for a range of different surfaces. I've got a monitor calibrator here that also works as a spectrophotometer, so I can get fairly accurate RGB values for a surface. I suspect it doesn't work well on bumpy or curved surfaces, though, and I'm not sure how to estimate the reflectivity that I have to remove from the data, but it's possibly better than an educated guess!
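One hedged way to remove that reflectivity from the measurement (an assumption on my part, not something anyone in the thread measured): treat the surface as a smooth dielectric and subtract its normal-incidence Fresnel reflectance (F0, about 4% for IOR 1.5) from the spectrophotometer's total.

```python
def estimate_albedo(measured_reflectance, ior=1.5):
    """Very rough correction: subtract a dielectric's normal-incidence
    Fresnel reflectance (F0) from a spectrophotometer's total reflectance
    to approximate the diffuse albedo. Assumes a smooth dielectric
    surface; IOR 1.5 gives F0 of about 0.04. Illustrative only."""
    f0 = ((ior - 1.0) / (ior + 1.0)) ** 2
    return max(0.0, measured_reflectance - f0)

print(estimate_albedo(0.22))   # 0.22 total -> roughly 0.18 diffuse albedo
```

On rough or curved samples the specular term is spread over angles rather than concentrated at F0, so this stays an educated guess, just a slightly more principled one.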
-
Right, so it's basically going back to what we did 8 years ago: building a shader against a white dome (OK, it's spherical rather than a half dome) instead of an HDRI.
This prevents you from pushing the reflection, or whatever setting, too far, as an HDRI could mislead you.
Is this a revolution, or am I really missing something here?
Stan
3LP Team