My first LWF tests :)

  • My first LWF tests :)

    hi,

    my first test of linear workflow:

    Render from VRay


    Postprod. in Photoshop


    Any questions, please ask

    thanks!
    3d architectural visuals:

    www.PandM-studio.com

  • #2
    Your raw render looks "double gamma'd". You'd be better off leaving the raw render linear until final output from Photoshop (use an adjustment layer or similar at the top of your layer stack).

    Your comp looks OK, but you have some serious edge artefacts going on (the chair), and the fake DOF is affecting the floor where the right edge of your render meets the "extension" you put in. A tad too dark in the wooden panel for my tastes.
    Last edited by trixian; 11-03-2008, 08:52 AM.
    Signing out,
    Christian

    • #3
      ummmm, that's quite a difference!

      isn't the point of working in linear space that you view your renders in sRGB space in Max / Vray, and when rendered continue to view your images in this sRGB space in Post applications, but the data is stored as Linear float?

      • #4
        If you're using Photoshop that much, there's no point in using linear workflow - it'll make very little difference to your final output if you're willing to alter it to that extent.

        I don't really see the fuss with LWF anyway, but we'll leave that.

        • #5
          I like the first one, but the materials seem a little washed out. Is the input bitmap set to 2.2? How about an image in between the two?

          • #6
            Yes - it looks double-corrected. You shouldn't have to do much post at all, if any. What you see in the viewport should come close to what the final image will be. Are you using the Vray Frame Buffer, or did you use the Max Frame Buffer?

            Isn't the point of working in linear space that you view your renders in sRGB space in Max / Vray, and when rendered continue to view your images in this sRGB space in Post applications, but the data is stored as Linear float?
            No, but that is one aspect of what can be done afterwards.

            The main point of LWF is that everything is calibrated between programs:
            1) Photos and Jpegs all live in the same gamma space as everything else you're working on due to their original settings.
            2) Max swatches/colors all live in the same gamma space.
            3) Monitors all live in the same calibrated gamma space.

            When all three of these items 'line up' or match, that is what's called being linearized or 'leveled out'. It's one straight line - there shouldn't be any curves if you were to plot it. With all three points above matching, your input and output between different programs and bitmaps will be consistent across everything.

            For instance, if you use Max standard/default settings and render out with Vray, the diffuse swatch colors that are rendered hardly ever match what you expected. You end up having to go back into the material settings and tweak everything until you get it right. Since you'll never quite get it right, it involves a lot of post-production work, and even then it won't be accurate. This is due to the way Max itself works - by default it doesn't work in a linear space (one that matches your OS or most paint programs).

            Essentially, Max default settings are messed up.

            Once you have your final image in linear format, you can safely adjust scale, offset, and exposure in a post-production tool by saving it as .exr or .hdr. This step can be done with any image output by Max, but those won't necessarily be linear if not set up properly. With a proper linear setup, the final colors should be truer to what was originally intended.
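The "lining up" of gamma spaces described above comes down to one pair of transfer functions. A minimal sketch (mine, not from the thread) of the standard sRGB conversions in Python; note that gamma 2.2 is only an approximation of this piecewise curve:

```python
def linear_to_srgb(c):
    """Encode a linear [0, 1] channel value for display/storage (sRGB curve)."""
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * c ** (1 / 2.4) - 0.055

def srgb_to_linear(c):
    """Decode an sRGB-encoded channel value back to linear light."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# Mid grey in linear light (~18% reflectance) encodes to roughly 0.46,
# which is why a raw linear render looks dark until it is encoded:
mid = linear_to_srgb(0.18)

# Round-tripping is lossless up to float precision:
assert abs(srgb_to_linear(mid) - 0.18) < 1e-9
```

If every bitmap, swatch, and monitor in the pipeline agrees on which side of this conversion its values sit on, the pipeline is "linearized" in the sense used above.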
            LunarStudio Architectural Renderings
            HDRSource HDR & sIBL Libraries
            Lunarlog - LunarStudio and HDRSource Blog

            • #7
              Linear workflow does not actually dictate float; it just happens to be a good fit for more advanced compositing jobs.
              The important part of LWF (or let's call it "gamma compensation" instead) is that you handle light ranges and hue/saturation a lot more predictably and spend less time fighting strange shades of light/shadow you had not anticipated. In addition, doing post work on a gamma-corrected (non-linear) image produces errors - small ones, but they add up. Why have artefacts when you can avoid them with the same amount of work?
              Another point is that if you have calibrated your monitor to a set standard, and adjust your "gamma compensation" for this, it is a lot easier for a co-worker to pick up your file later and continue working on it if you are away or on another project (as long as you are both aware of the "gamma compensation").

              Edit: Ah, someone got in before me.......damn my slow typing.

              Another thing I might point out: Even with 8-bit tga's or similar, you actually have a lot more room to tweak and comp if the images are linear. Trying the same with gamma corrected images gives you visible banding and other artefacts after only slight changes.
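The error trixian describes is easy to demonstrate. A toy Python sketch (mine, using the common 2.2 approximation rather than the exact sRGB curve): the same 50/50 blend of black and white gives different answers depending on which space you do it in.

```python
GAMMA = 2.2

def decode(v):
    """Gamma-encoded value -> linear light (2.2 approximation)."""
    return v ** GAMMA

def encode(v):
    """Linear light -> gamma-encoded value."""
    return v ** (1 / GAMMA)

black, white = 0.0, 1.0

# Naive blend performed directly on the encoded values:
wrong = (black + white) / 2                          # 0.5

# Physically correct blend in linear light, re-encoded for display:
right = encode((decode(black) + decode(white)) / 2)  # ~0.73

# The gamma-space blend comes out visibly darker than a true 50% mix
# of light - exactly the kind of small error that adds up in comp work.
```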
              Last edited by trixian; 11-03-2008, 10:09 AM.
              Signing out,
              Christian

              • #8
                Originally posted by jujubee

                No, but that is one aspect of what can be done afterwards.

                The main point of LWF is that everything is calibrated between programs:
                1) Photos and Jpegs all live in the same gamma space as everything else you're working on due to their original settings.
                2) Max swatches/colors all live in the same gamma space.
                3) Monitors all live in the same calibrated gamma space.
                no, this is sRGB.

                • #9
                  no, this is sRGB.
                  'Linear'ization involves making sure all your applications and bitmaps line up in the same Gamma space. 'Workflow' is a method to reproduce consistent results through a standard practice.

                  FYI, 2.2 gamma does not automatically mean 'linear workflow' like most people think - it's just a target number often used for consistency. Your target numbers could in theory all be matched to 1.8 across all applications, bitmaps, and monitors, and it would still count as 'linearized' and a 'workflow', as long as you and your team all calibrate to those numbers.

                  I should add to SV's original question, that you are able to view the results if you choose to not 'bake-in' linearization using curves. You have two options/methods when using LWF - either to bake-in the results (as I tend to do - I don't do much post) or you can adjust the curves after-the-fact as Rob Niederhaus (spelling?) aka Throb originally showed in the very first LWF thread. By baking-in the results, you eliminate a few extra steps.
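The target-gamma point above can be sketched in a few lines of Python (function names and the 0.18 test value are mine): "linearized" just means every stage agrees on one curve, and mixing targets is what breaks the pipeline.

```python
def encode(linear, gamma=2.2):
    """Apply a display gamma curve to a linear channel value."""
    return linear ** (1 / gamma)

def decode(encoded, gamma=2.2):
    """Invert the gamma curve back to linear light."""
    return encoded ** gamma

# A matched pipeline is the identity regardless of the target chosen:
for target in (1.8, 2.2):
    assert abs(decode(encode(0.18, target), target) - 0.18) < 1e-12

# But encoding at one target and decoding at another shifts every value:
mismatch = decode(encode(0.18, 2.2), 1.8)   # ~0.25, not 0.18
```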
                  Last edited by jujubee; 11-03-2008, 11:29 AM.

                  • #10
                    Just to add this to the discussion:

                    for me, 'linearizing' an image means viewing the colours in a perceptually linear space - i.e. correcting for the response curve your monitor applies to the data. When viewing an unclamped floating-point image in the VFB, it initially looks very dark on the monitor, but once you apply an sRGB curve the data appears perceptually linear: mid greys look about mid grey, you can see and store a lot of detail in shadow areas, etc. It's just about understanding what your data is, and what you are doing when viewing it.

                    The whole thing about setting a standard between programs and hardware (monitors, digital cameras, printers, scanners) so that we are all dealing with essentially similar images, is known as sRGB colour space. Most 8-bit images are presumed to be stored in this space, so if you had an image without an embedded colour profile, you could assume sRGB and it should be pretty similar across platforms.
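A small sketch (assumptions mine, not from the thread) of the view transform described above: the pixel data stays linear float, and only the display path applies the sRGB curve, which is why shadow detail that looked black suddenly becomes visible.

```python
def srgb_view(linear_pixel):
    """Display view for one linear channel value (which may exceed 1.0)."""
    c = min(max(linear_pixel, 0.0), 1.0)   # clamp for display only
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * c ** (1 / 2.4) - 0.055

# A linear value of 0.05 looks nearly black when shown raw, but the
# sRGB view lifts it to roughly 0.25 - the shadow detail is in the
# data the whole time; the curve only changes how it is displayed.
print(round(srgb_view(0.05), 2))
```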

                    • #11
                      I think I'm clearer on what you are saying now. Yep, it makes sense to aim for 2.2 for sRGB (as sRGB is roughly a 2.2 standard), except when using an older CRT or printing. So if you define the sRGB standard as 2.2 gamma across devices, then you are absolutely correct - I was just reading it wrong.

                      As for floating point, that's just an option and a benefit depending on how you export. A lot of the time my images are fine straight out of Max with LWF baked in, aside from a few minor hue/saturation adjustments in post. I usually export to .EXR just to be safe, but sometimes I don't really need to touch it.

                      Some of the better 3d guys such as Nichols, Throb, or Morbid probably have a ton of other uses for these linear formats in post - but I doubt they bake in their solution like I do.

                      • #12
                        hi guys!

                        thanks a lot for the comments/help.
                        Maybe when I get some time I'll try to do LWF better, but right now I think it's very hard, and sometimes not helpful

                        And here's not LWF, but just a small Vray camera DOF test:




                        cheers

                        • #13
                          Very nice. Maybe it's just me, but they look like miniature apples - more like the size of large strawberries
                          Last edited by tom182; 22-03-2008, 11:15 AM.
                          Accept that some days you are the pigeon, and some days you are the statue.

                          • #14
                            gamma correction

                            Here's an extensive explanation by Master Zap as to "why is everything washed out" with gamma correction

                            http://forums.cgsociety.org/showpost...5&postcount=37

                            • #15
                              hi
                              tom - thanks, the apples are small because this is a special kind of apple
                              rmeija - thanks a lot for the link, very nice
