
Exposure discrepancy in VRay Camera


  • #76
    I'm having trouble using Lele's material editor scene file -- whenever I load it as my material sample scene it works at first, but eventually max crashes. After that, whenever I reload the file, it crashes again as soon as I open the Material Editor. I tried creating my own simple test scene with a VRaySun and Dome Sky and loading that as the editor's scene, but it caused the same crash.

    Anyone else have this trouble? I definitely agree that having the material editor reflect the intensity of the V-Ray light, if only as an approximation, would be very helpful.

    Thanks,
    Shaun
    ShaunDon



    • #77
      Originally posted by studioDIM
      http://www.oxyshare.com/get/90384098044f881c5d45f2/test_03.rar.html

      New test scene for the mateditor.
      How do you download from that site? I can't find any buttons, just unclickable text.



      • #78
        Shaun, the only case where I had it crashing max was when I actually edited the test scene, saved it, and tried to update it in the material editor while there was at least one slot with the custom scene loaded.
        After that, if a mateditor slot in the scene had the test scene, I can confirm max would crash upon opening the mateditor.
        You can fix that with a quick script, or, as I did, with my vrayswitcher: flipping between scanline and V-Ray changes the mateditor slots.

        It surely isn't as light a scene as the sphere or cube, though, so you might wish to use it as a sort of better preview, and switch it off once the material is set.
        You will notice though that there is no awkward geometry or modifier set in the scene.
        I am thinking it might be max's own issue...
        More feedback would be appreciated in any case.

        Cubiclegangstah: that crappy site hides downloads as blue text adverts. You either find it in the middle of the page, or at the very bottom.


        Lele



        • #79
          Thanks Lele, I'll keep playing with it. I was just suspicious because max crashed in exactly the same way when I created a new sample scene (just a sphere, a standard max camera, a VRaySun and a dome light with the VRaySky) -- I hope max isn't allergic to having those lights in the mat editor. I'll post back if I make any progress on this.

          Thanks!
          Shaun
          ShaunDon



          • #80
            I'm not able to stop my scene (or Lele's) from crashing. Even the uber-simplistic version I posted here brings max down when loaded into the editor. It works fine when I first change a material swatch to the custom scene, and I can change over as many of the materials in the editor as I like, but as soon as I reopen the scene and open the material editor, max crashes.

            I'm not sure what else to try. As long as I switch my material sample back to a standard shape (like Lele suggested) I'm fine, and I can get by with that for now. Just curious if anyone else is encountering this and if they've gotten around the crashes.

            Thanks!
            Shaun
            ShaunDon



            • #81
              I just noticed something when Vlado compared his rendering settings to the photos. The first of Vlado's images had a very white sky compared to the actual photos. Is this to be expected?
              LunarStudio Architectural Renderings
              HDRSource HDR & sIBL Libraries
              Lunarlog - LunarStudio and HDRSource Blog



              • #82
                Originally posted by jujubee
                I just noticed something when Vlado compared his rendering settings to the photos. The first of Vlado's images had a very white sky compared to the actual photos. Is this to be expected?
                It's white on the photos too!? Look carefully at the time of day for the photos. The first photo was taken at 19:20 and through the tree leaves at the top you can see that the sky is very white.

                Best regards,
                Vlado
                I only act like I know everything, Rogers.



                • #83
                  Ah - I thought you had referenced the 11:45 image.
                  LunarStudio Architectural Renderings
                  HDRSource HDR & sIBL Libraries
                  Lunarlog - LunarStudio and HDRSource Blog



                  • #84
                    Right.
                    If I got the gist of this correctly, the consensus here is that when using the VRay sun & sky with a VRay camera (and exposure), one should darken the colours in one's respective material slots. What I don't get is why. Diffuse is diffuse. It is a fake attribute created for CG that has no correlation whatsoever to real life (the same goes for specular or ambient, for that matter).
                    Why would we need to "calculate" the level of light in the scene when setting up our materials? This is completely illogical. A red plastic inflatable ball has the exact same physical properties whether I put it in a drawer or out in the sun (it might look different, but perception is different from innate properties, AFAIK).
                    Thus, VRay "should" do this colour compensation for us, as it is very difficult to know what lighting schemes any object I create will be subjected to across different projects. I just can't see why we need to manually expose the colour and levels of a material in the material editor just because we are going to use non-clamped lighting. If I define a diffuse colour as 250 grey (250,250,250), I am asking my 3D app to interpret this diffuse value as if it were "white". I then adjust the way it is represented/perceived with the very tools VRay supplies, namely the VRay camera with exposure, and the actual lighting.
                    Not sure if I made myself completely clear, but could anyone give me a good reason why this is not so?
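                    The argument above can be put in numbers. This is a hypothetical sketch of a uniformly-scaling exposure, not V-Ray's actual pipeline:

```python
# A hypothetical sketch of the argument: if camera exposure scales the whole
# image uniformly, a material keeps its relative brightness under any lighting,
# so the swatch itself should not need pre-darkening.
def rendered_value(diffuse, light, exposure):
    """Naive pixel value: surface reflectance x incoming light x camera exposure."""
    return diffuse * light * exposure

# A "white" material (250/255) under multiplier-1 light, neutral exposure:
normal = rendered_value(250 / 255, 1.0, 1.0)

# The same material under 5x light, with the camera stopped down by 5x:
bright = rendered_value(250 / 255, 5.0, 1.0 / 5.0)

# The two renders match: only the exposure changed, not the material.
assert abs(normal - bright) < 1e-9
```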
                    Signing out,
                    Christian



                    • #85
                      As I understand it, the textures I used are already lit by daylight, so if you light them again in CG they'll be blown out. That's how I see it, but then again I'm often wrong.
                      Nuno de Castro

                      www.ene-digital.com
                      nuno@ene-digital.com
                      00351 917593145



                      • #86
                        Well, the program is just doing what it does, and the results vary (often drastically) depending on which colour mapping methods and lighting you employ.

                        It would be nice if V-Ray were more automated in interpreting colours and bitmaps according to those colour mapping and gamma factors. However, it seems like it would be a lot of work to program for each and every situation.

                        Having studied the mechanics behind LWF, I suspect Maxwell circumvented Max's internal trappings by being a complete standalone that wasn't confined to Max's own programming (linear correction was applied automatically). V-Ray, on the other hand, relies heavily on Max's material editor.
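                        For reference, the linear-workflow correction being discussed works roughly like this. This sketch uses a plain power-law gamma of 2.2, not the exact piecewise sRGB curve:

```python
# A simplified sketch of the linear workflow (LWF): renders are computed in
# linear light and gamma-encoded for display. Plain power-law gamma 2.2,
# not the exact piecewise sRGB transfer function.
def linear_to_display(value, gamma=2.2):
    """Encode a linear-light value (0..1) for display."""
    return value ** (1.0 / gamma)

def display_to_linear(value, gamma=2.2):
    """Bring a display-space swatch back to linear light."""
    return value ** gamma

# A mid-grey swatch picked in the Material Editor (display space) is much
# darker in linear light (~0.22), which is why uncorrected swatches render
# brighter than intended once the output is gamma-encoded.
swatch_linear = display_to_linear(0.5)
```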
                        LunarStudio Architectural Renderings
                        HDRSource HDR & sIBL Libraries
                        Lunarlog - LunarStudio and HDRSource Blog



                        • #87
                          Well, I can understand this issue with regard to texture maps made from photos (photos used directly as diffuse textures complicate matters too much). Those are pre-lit, with the lighting baked in, but my example and question concerned completely synthesized shaders: solid colours and procedurals.
                          The only way for this to be even slightly intuitive and logical would be if all colour swatches and shaders were automatically affected by the scene lighting in use.
                          In addition, I can't see what bearing using "8-bit" RGB colour selectors, as opposed to "float" colours from the VRayColor map, has on this result, as I believe the "8-bit" max colours one uses in the material editor get remapped internally by max anyway. Otherwise one would need a range setting for every colour swatch in max, so one could predefine what lighting level the colour being adjusted corresponds to (as in saying "the colours I am adjusting now are the colours I want to see when directly lit by a light with a multiplier of 1", etc.).

                          As of now, all these colours relate to an "absolute scale" of values. Therefore they are clearly meant to represent the surface property, not the relative perceived colour in the end result (i.e. the rendered image).
                          To reiterate: what is the point of having exposure in the camera or in the rendering controls (colour mapping) if one is required to pre-expose one's material settings in the first place?
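                          The 8-bit-vs-float point can be made concrete. The `range_scale` parameter here is the hypothetical "range setting" described above; max's colour picker has no such control:

```python
# A concrete sketch: an 8-bit swatch is just a 0..1 float in disguise, so it
# cannot express values above 1.0 without an extra range control. The
# range_scale argument is the hypothetical "range setting" described above.
def swatch_to_float(value_8bit, range_scale=1.0):
    """Map a 0-255 swatch value to a linear float, optionally widened by a range."""
    return (value_8bit / 255.0) * range_scale

assert swatch_to_float(255) == 1.0                    # pure white caps at 1.0...
assert swatch_to_float(255, range_scale=5.0) == 5.0   # ...unless a range widens it
```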
                          Signing out,
                          Christian



                          • #88
                            Originally posted by trixian
                            Right.
                            If I got the gist of this correctly, the consensus here is that when using the VRay sun & sky with a VRay camera (and exposure), one should darken the colours in one's respective material slots. What I don't get is why. Diffuse is diffuse. It is a fake attribute created for CG that has no correlation whatsoever to real life (the same goes for specular or ambient, for that matter).
                            Why would we need to "calculate" the level of light in the scene when setting up our materials? This is completely illogical. A red plastic inflatable ball has the exact same physical properties whether I put it in a drawer or out in the sun (it might look different, but perception is different from innate properties, AFAIK).
                            Think in 255 levels.
                            A white (255) sphere, lit by one point light at multiplier 1, ranges from 0 to 255 white.
                            Now suppose the sphere is reflective.
                            What room would be left for reflections, when the sphere is already at 255 brightness?
                            So, already in LDR, you should think about using some reciprocity between the diffuse and specular/reflection components of the surface.

                            It becomes a lot more important when using supercharged light sources (i.e. sun/vraylights/HDRIs).
                            Of course, V-Ray would calculate the reflections in the white areas (for the case above) somewhat correctly (within the limits of the sampling settings), but you would still have a blown-out area.
                            To let those super-bright reflections (say, 5x pure white, or 5.0 intensity) show gradation, the camera would have to underexpose the image by a factor of 5 (say, from 200 to 40 ISO).
                            Your 255 (or 1.0) white would then logically end up at a fifth of its brightness.
                            The image would not be able to contain, in the visible 8bpc range, both the diffuse and the specular/reflection in a convincing way.

                            This is because REAL LIFE behaves like this.
                            Perceptual algorithms are what the past is made of: from paletting to shading, algorithms were built with LIMITS in mind, not with pure physicality (which has always been well understood, but also always well beyond PC power).
                            It requires one more step of inventiveness, and some unlearning, to get it right.
                            But personally I prefer this way of working, as that's also how I look at my surroundings in reality.
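                            The underexposure arithmetic above can be sketched in a few lines (a hypothetical illustration, not V-Ray code):

```python
# The arithmetic in the post, sketched: a highlight 5x brighter than pure white
# forces the camera to stop down 5x, dragging the diffuse down with it.
def expose(value, factor):
    """Scale a linear value by the exposure factor and clamp to the visible 0..1."""
    return min(value * factor, 1.0)

reflection_peak = 5.0   # "5x pure white" reflection source
diffuse_white = 1.0     # the 255-white surface

exposure = 1.0 / reflection_peak                  # stop down so the highlight just fits
assert expose(reflection_peak, exposure) == 1.0   # the highlight now shows gradation
assert expose(diffuse_white, exposure) == 0.2     # the white surface drops to a fifth
```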

                            Lele



                            • #89
                              Originally posted by studioDIM
                              (snip).....


                              To let those super-bright reflections (say, 5x pure white, or 5.0 intensity) show gradation, the camera would have to underexpose the image by a factor of 5 (say, from 200 to 40 ISO).
                              Your 255 (or 1.0) white would then logically end up at a fifth of its brightness.
                              The image would not be able to contain, in the visible 8bpc range, both the diffuse and the specular/reflection in a convincing way.

                              This is because REAL LIFE behaves like this.
                              Perceptual algorithms are what the past is made of: from paletting to shading, algorithms were built with LIMITS in mind, not with pure physicality (which has always been well understood, but also always well beyond PC power).
                              It requires one more step of inventiveness, and some unlearning, to get it right.
                              But personally I prefer this way of working, as that's also how I look at my surroundings in reality.

                              Lele
                              I get what you are trying to say, but I disagree.
                              In your example of a scene with 5x "normal" light intensity: if I choose to expose my rendering through the camera's exposure, I would expect it to darken the diffuse to a fifth of its level (if I expose it that way). Why would this not be convincing? It works fine for cameras and photos that are saved in an 8-bit format.
                              As for the first example, I can't see why adding reflections to the white ball wouldn't darken the diffuse (as it already does) to obey energy preservation.
                              I honestly think this is an interface/workflow issue rather than a technical one, and obviously some of these issues come from the legacy max material code. Would this still be an issue if all colours and swatches were defined in a float format instead? Could this be similar to the issue of using 255 for a reflection value when using LC as secondary?
                              This last example is one I think relates to the UI/workflow side, as V-Ray could easily cap the calculated value internally, so that we as artists could, for example, use the colour white to define the reflection amount as "maximum". This would then be interpreted internally by V-Ray as the maximum allowed with regard to energy conservation.
                              I may seem confusing with this last example, but I hope it illustrates my point that this is purely about how we interface with the tool, not really about how the tool or reality works.
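                              The internal cap proposed above could be sketched like this (a hypothetical rule, not something V-Ray actually does):

```python
# A sketch of the proposed internal cap (hypothetical): treat a white
# reflection swatch as "maximum allowed" and clamp the reflection so total
# reflected energy stays at or below 1.
def energy_conserving(diffuse, reflection):
    """Clamp the reflection coefficient to the energy budget left by the diffuse."""
    return diffuse, min(reflection, 1.0 - diffuse)

# A white (1.0) reflection swatch on a 0.75-diffuse surface is silently capped:
d, r = energy_conserving(0.75, 1.0)
assert (d, r) == (0.75, 0.25)
```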

                              Cheers.
                              Signing out,
                              Christian



                              • #90
                                I understand what you mean.
                                However, you will agree that an image with a spread of 5.0 between the diffuse and the specular is a stretch in 8 bit.
                                To accommodate a stronger specular, you have to darken below visibility something that should instead be well visible (a white object...).
                                I'd be fine with a spread of 5.0 if I had a mirror, where the diffuse component would be pitch black or thereabouts...
                                What could V-Ray do in that case, if not clamping, to avoid the issue?
                                How would it know what the "right" brightness for a pixel is?
                                255 white + 255 reflection is one thing.
                                What hits that object in terms of lighting, what that object reflects, and with what intensity, cannot be evaluated at material creation time.
                                What is the maximum "allowed"?
                                255 minus diffuse?
                                128?
                                Even 128 would be too strong in the case being debated, where your white diffuse coefficient on the sphere would still end up 2.5 times less bright.
                                What I meant to point out is that if you want a graded reflection from a reflection source with a brightness of 5, you should divide your "maximum" reflectivity by 5 to start with.
                                Use a 128-white sphere with a reflectivity of 26 (or 0.1 float, whatever; ~128/5), and then start exposing.
                                You'll see the scene fall into your lap very quickly, and accommodate both strong diffuse lighting and strong reflections with extreme ease.
                                Even *cough* Maxwell specifies in the manual that reciprocity has to be respected on the user's side.
                                They, the creators of the "light simulator", couldn't come up with anything better, be it in the plugins or in Studio?
                                Besides, move away from architecture and it becomes apparent how, for different needs, the full range of 8-bit colouring comes in handy when GI/physicality matters little compared to render times.
                                In a word, it's nice to be able to exceed physicality when needed, or to plain cheat.
                                I don't expect, however, that the raytracer has to make those choices for me.
                                And ultimately, what LOOKS good is good, regardless of the method used.
                                If the physcam/sun/sky method is difficult to get used to, just reduce the sun's multiplier and use a standard camera, and presto: you can keep the original workflow while using sun and sky.
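                                The recipe above can be checked in numbers (a hypothetical sketch, not anything V-Ray actually computes):

```python
# Lele's recipe, sketched: divide the maximum reflectivity by the brightness
# of the reflection source before exposing, so the highlight keeps gradation.
def reciprocal_reflectivity(diffuse_8bit, source_brightness):
    """Reflectivity (in 0-255 terms) that leaves the highlight graded after exposure."""
    return round(diffuse_8bit / source_brightness)

# The "reflectivity of 26 (~128/5)" from the post:
assert reciprocal_reflectivity(128, 5) == 26
```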

                                Lele

