Request - Film Response Curves (Comparing Octane to V-Ray)


  • #76
    What would be introduced? Bercon? It's not supported yet.
    Or do you not have RT GPU working at all?

    Stan
    3LP Team



    • #77
      Originally posted by grantwarwick:
      Is there any indication when V-Ray RT GPU will support Bercon maps? Apart from that, I've come to the sad realization that I'm a complete idiot for not testing it in over a year... It's unbelievable.
      Sadly there is a severe bug with the Bercon maps, both in RT CPU and GPU. It really needs to be fixed quickly.

      http://forums.chaosgroup.com/showthr...pplied-to-bump
      https://www.behance.net/Oliver_Kossatz



      • #78
        Originally posted by kosso_olli:
        Sadly there is a severe bug with the Bercon maps, both in RT CPU and GPU. It really needs to be fixed quickly.

        http://forums.chaosgroup.com/showthr...pplied-to-bump
        So it's not working in V-Ray RT? Bah! Every time I go to RT, it's the damn Bercon maps that send me packing, haha.
        admin@masteringcgi.com.au

        ----------------------
        Mastering CGI
        CGSociety Folio
        CREAM Studios
        Mastering V-Ray Thread



        • #79
          They were working fine before; it was the latest release that broke them. Apparently there is a nightly build with a fix.
          https://www.behance.net/Oliver_Kossatz



          • #80
            Originally posted by im.thatoneguy:
            I thought we were talking about the red in the Iron Man suit. You are correct.


            Your display device will be RGB, which is fine, but a capture device is analogous to a renderer: it is a spectral capture device. A human's eye, a camera's sensor or a film's emulsion isn't sensitive to "RED", "GREEN" and "BLUE"; the sensitivities are all bell curves. When a camera sees red it also sees some green at the same time, and probably even a little bit of blue. Many cameras see a tiny bit into the infrared as well. So if you have a texture that is "RED", it could be near-infrared in the real world, or it could be a very pure orange just outside the sensitivity of the green receptor. If your light's spectrum has an IR spike, like a tungsten bulb, the IR-reflective red texture would be far brighter than the quasi-orange texture, even though both are expressed as "red".

            Similarly, "white" light could be a computer monitor producing white, or it could be a full-spectrum source like the sun. The same real-world material will look very different under those two kinds of "white". "Accurately" capturing the real world is pretty subjective: everybody is trying, to some degree, to emulate the average human eye's spectral sensitivity, but at the same time your brain is mucking with everything you see.

            [two attached images]

            So when you say "a gray render won't be affected by gamma", you're right. But the world is almost never "gray", since "gray" could mean tungsten "white", daylight "white", IR "white" or "RGB white". Since no sensor (chemical, analog or digital) is a nice neat red, green and blue sensor, every imaging device is to some degree subjective. So renderers might be consistent, but they're consistently playing by made-up rules. That doesn't make them "right"; it just means they're following the same arbitrary fakery. How a camera sees the UV haze in a sky as "blue" will vary from camera to camera, but renderers ignore it completely. There is no UV filtration setting on a V-Ray Physical Camera. And that's ignoring even more exotic traits of light, like polarization! I doubt we'll ever simulate polarization in a renderer.

            That's also why none of these film emulations will ever be effective without spectral considerations. You can't just balance red, green and blue curves; you have to have the spectral sensitivity of the red, green and blue emulsions, and you have to capture that infrared contamination in the red and that ultraviolet contamination in the blue. With an RGB LUT you're also unable to capture the metamerism of a film stock whose "yellow" might fall off very quickly to green, while another stock's yellow might quickly fall off to red. We know how the eye reacts to yellow, and we can stimulate it really well with a standardized delivery RGB value in a Rec. 2020 colorspace, but there are a lot of complex optical interactions before an image is captured. Nobody is neutral; you can only have the same biased perspective as every other renderer.
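
            (Illustrative aside: a rough numerical sketch of the point above. The sensitivity curves and spectra below are made-up Gaussians, not any real sensor's data, but they show how the same two spectra come out as different "reds" depending on whose bell curves you integrate through.)

[CODE]
import numpy as np

# Wavelength axis: visible range plus a bit of near-IR (nm, 1 nm steps).
wl = np.arange(380.0, 781.0, 1.0)

def bell(center, width):
    """Made-up Gaussian sensitivity curve - illustrative only."""
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Hypothetical sensor A: fairly "clean" R, G, B bell curves.
sensor_a = np.stack([bell(600, 35), bell(540, 35), bell(450, 30)])
# Hypothetical sensor B: its red channel leaks towards the deep-red / near-IR edge.
sensor_b = np.stack([0.8 * bell(610, 45) + 0.3 * bell(730, 40),
                     bell(545, 40),
                     bell(455, 35)])

def capture(spectrum, sensor):
    """'Expose' a spectral power distribution through a sensor's three curves."""
    return (sensor * spectrum).sum(axis=1)   # crude integral over 1 nm bins

# Two different "red" spectra: a narrow pure red, and red plus a deep-red bump.
spec_1 = bell(620, 10)
spec_2 = 0.7 * bell(620, 10) + 0.9 * bell(720, 15)

for name, s in (("sensor A", sensor_a), ("sensor B", sensor_b)):
    print(name, np.round(capture(spec_1, s), 1), np.round(capture(spec_2, s), 1))
# Sensor B's red channel picks up the 720 nm bump that sensor A barely sees,
# so the two sensors disagree about how "red" the second spectrum is - and once
# you're down to three numbers per pixel, that disagreement is baked in.
[/CODE]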

            Essentially what you're saying is that "hobbits have hairy feet". Your statement is correct: hobbits do have hairy feet. But they are also a convenient fiction.

            You're absolutely right, but does that mean LUTs should be baked into renders? I'm a firm believer that they shouldn't be, given the myriad other options you have for colour correcting in post. Ultimately film was abandoned in favour of digital because it was inferior (though that's not to say its look isn't pleasing to some), so is it worth spending hundreds of man-hours introducing a feature that burns in film responses rather than patching bugs and adding (IMO) more useful features?

            What I don't understand is why you'd want to burn the LUT into the image straight out of the render engine. What if someone says "I don't like the colours on that"? You're fucked and have to re-render it all over again. I don't get it.
            Check out my (rarely updated) blog @ http://macviz.blogspot.co.uk/

            www.robertslimbrick.com

            Cache nothing. Brute force everything.



            • #81
              Originally posted by Macker:
              You're absolutely right, but does that mean LUTs should be baked into renders? I'm a firm believer that they shouldn't be, given the myriad other options you have for colour correcting in post. Ultimately film was abandoned in favour of digital because it was inferior (though that's not to say its look isn't pleasing to some), so is it worth spending hundreds of man-hours introducing a feature that burns in film responses rather than patching bugs and adding (IMO) more useful features?

              What I don't understand is why you'd want to burn the LUT into the image straight out of the render engine. What if someone says "I don't like the colours on that"? You're fucked and have to re-render it all over again. I don't get it.
              What if that same person asks you to change your geometry or move something? You're also fucked... Taking creative freedom away because of "what ifs" is counterintuitive to the creative process, IMO. I can tell you for a fact that if I were still working in the print industry I'd be testing and using LUTs regardless of whether they were baked in. The retouchers wouldn't care; they'd just do what they wanted to the final image anyway. I could also save out the original and the baked version, and they could blend in the amount in post.
              When did CGI artists become CGI technicians? :S
              admin@masteringcgi.com.au

              ----------------------
              Mastering CGI
              CGSociety Folio
              CREAM Studios
              Mastering V-Ray Thread



              • #82
                Originally posted by grantwarwick:
                Taking creative freedom away because of "what ifs" is counterintuitive to the creative process, IMO.
                But my argument isn't against using LUTs; my point is we can ALREADY load them into the VFB without having to bake them into the final image.
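
                (To be concrete about "without baking": keep the linear EXR as the master and apply the curve only to a copy. A minimal sketch, assuming a plain-text 1D LUT with one 0-1 output value per line; the file paths and the stand-in data are hypothetical, and loading the actual EXR is left to whatever library you already use.)

[CODE]
import numpy as np

def load_1d_lut(path):
    """Plain-text 1D LUT: one output value per line, inputs evenly spaced on 0-1."""
    return np.loadtxt(path)

def apply_1d_lut(linear_rgb, table):
    """Apply the same 1D curve to R, G and B. Input is expected in 0-1; anything
    above 1.0 simply clamps to the last entry - exactly the HDR clipping issue
    discussed in this thread."""
    xs = np.linspace(0.0, 1.0, len(table))
    return np.interp(np.clip(linear_rgb, 0.0, 1.0), xs, table)

# Stand-in data: 'render' would be your linear float image loaded from EXR.
render = np.random.rand(4, 4, 3).astype(np.float32)
curve = np.linspace(0.0, 1.0, 256) ** (1.0 / 2.2)   # stand-in "film" curve
preview = apply_1d_lut(render, curve)               # graded copy for review
# 'render' stays untouched as the linear master; only 'preview' carries the look.
[/CODE]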
                Check out my (rarely updated) blog @ http://macviz.blogspot.co.uk/

                www.robertslimbrick.com

                Cache nothing. Brute force everything.



                • #83
                  Originally posted by Macker:
                  But my argument isn't against using LUTs; my point is we can ALREADY load them into the VFB without having to bake them into the final image.
                  Hahaha I totally misread what you posted, my bad.
                  admin@masteringcgi.com.au

                  ----------------------
                  Mastering CGI
                  CGSociety Folio
                  CREAM Studios
                  Mastering V-Ray Thread



                  • #84
                    Originally posted by kosso_olli:
                    They were working fine before; it was the latest release that broke them. Apparently there is a nightly build with a fix.

                    Any chance you could email me a link to a 3ds Max 2014 build of that nightly if you have time? I'm not on the nightlies. :/
                    admin@masteringcgi.com.au

                    ----------------------
                    Mastering CGI
                    CGSociety Folio
                    CREAM Studios
                    Mastering V-Ray Thread



                    • #85
                      I do know a wee bit of both physics and visual perception, but I fail to see where the color theory above has any visual impact, Gavin. Show me a control image, one with and one without the invisible colors (like IR contamination, or UV spectrum bits), on a standard RGB display device; then we can compare the variance mathematically and, with a control group of humans, study the perceptual difference and likeability of the three.

                      For right now, in plain RGB, without UV-Ray (this year's April Fools' joke, btw), the matching is (and has been for a while) good enough for technical professionals to rely on when moving towards the physical realm (or in and out of it, with scanned materials).
                      There's a serious amount of help coming from post, but that is also where ALL of the color conversion away from LWF takes place, and where a non-linear human or camera response curve can be efficiently brought to normal conditions first, and then to the desired tonal range.
                      Is it a perfect method? Of course not: it's much more discrete and limited compared to Nature.
                      Is it a dependable assumption to make in most situations? Yes, it is, compared to Nature (there is quite a literature on the subject: rendering approaches weren't chosen on a whim).

                      Photons do not change their energy based on your perception of them; rather, they have a precise amount of energy based on their wavelength, and it's the atomic structure of materials getting excited by the different wavelengths and emitting back a photon of just the right one that we call colour (and albedo).
                      That is what a renderer should calculate, within the bounds of the visible spectrum (how many channels ought to represent the visible spectrum well enough for human consumption is quite debatable outside of a few specific cases; we also dither quite a lot, perceptually).
                      If a renderer had to (or pretended to) cater for the variance in visual perception among humans, wow, that would make for fun times.
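
                      (For anyone who wants the arithmetic behind "energy from wavelength": E = hc/λ, so visible photons span roughly 1.7 to 3.3 eV. A quick check:)

[CODE]
# Photon energy from wavelength: E = h * c / wavelength
h = 6.626e-34    # Planck constant, J*s
c = 2.998e8      # speed of light, m/s
eV = 1.602e-19   # joules per electronvolt

for nm in (450, 550, 650):
    E = h * c / (nm * 1e-9) / eV
    print(f"{nm} nm -> {E:.2f} eV")
# 450 nm -> 2.76 eV, 550 nm -> 2.25 eV, 650 nm -> 1.91 eV
[/CODE]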

                      This wasn't a debate on color perception, anyway.
                      This was a debate set on a course driven by two running rumors:
                      A) Octane is doing it better in terms of light transport models and physical accuracy, and
                      B) V-Ray will get close enough if we add those "corrective" camera response curves to make it behave like Octane.

                      In response to point A) I rebutted with a set of Octane renders which aren't in LWF, or which are in the CM section but then get clipped; and for point B), renders which ought to be white by the standards of not just V-Ray but everyone else that isn't a niche product (rMan, Arnold, iRay, Corona), and which instead show all sorts of (perhaps correct?) skews and deviations from the norm (namely, the light intensity and colour) that are neither user-initiated nor user-changeable.

                      All I care for is objective, measurable data.
                      Even speculars can be measured against each other.
                      "Pretty" isn't something that can be measured, but the lobe-to-tail ratio and the intensities can be, even on very complex scenes.
                      Even with the best match Grant could get, the specular models look quite different in a number of places, and matching two different models may well be impossible.
                      Once we establish that everyone else is doing the expected thing, and what's been proven right enough to fool millions of human perceptual machines over many a year, we can still CHOOSE to go with the odd one out doing things its own way.
                      It's just that my maths and the historical results will not change, whatever anyone else chooses.
                      I was never in the debate to stop the use of Octane; I entered it to set a few records straight.

                      We can take the nitty-gritty of what's supposed to be in a render engine core these days to a private place and debate it to exhaustion, but I think I demonstrated the two points I wanted to make here, with simple enough, and reproducible enough, data.
                      I'd love to be answered in kind, rather than with hypotheticals, so the debate can stay on course.

                      And most of all, I still see no case for the camera curves being provided by Chaos Group (it's a 1D LUT in plain-text format; V-Ray has a LUT input already. Get converting, says I), nor for those being used with HDR renders (the curves are in the 0-1 range, so clipping may be a feature, not a bug, in Octane too).
                      But everyone will very likely get them anyway.
                      Never mind the people asking for fewer controls throughout the V-Ray UI and a streamlined workflow.
                      Baking colour mappings into LDR images doesn't sound like much of a workflow to *me*, so no, *I* shall stay on record flatly stating they're not going to be of any help, but rather a hindrance.
                      I guess time will prove me wrong once again: it wouldn't be the first time, and it won't be the last.
                      Regardless, my renders will never be clipped to 1.0.
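
                      (Since I said "get converting": the conversion itself is mechanical if the curves really are plain-text 0-1 tables. A minimal sketch that writes a 1D .cube file, which a LUT slot like the VFB's should be able to read; the file names are hypothetical and nothing here is an official tool.)

[CODE]
# Rough converter: plain-text 1D response curve -> .cube 1D LUT.
# Assumes one 0-1 output value per line, with evenly spaced inputs.
def text_curve_to_cube(src_path, dst_path, title="film response"):
    with open(src_path) as f:
        values = [float(line) for line in f if line.strip()]
    with open(dst_path, "w") as out:
        out.write(f'TITLE "{title}"\n')
        out.write(f"LUT_1D_SIZE {len(values)}\n")
        for v in values:
            out.write(f"{v:.6f} {v:.6f} {v:.6f}\n")   # same curve on R, G and B

text_curve_to_cube("curve.txt", "curve.cube")
[/CODE]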

                      As for what V-Ray needs in order to match competitors after all the testing, I'd say it would have to add a few useless loops and slow down a wee bit, lest those amazing articles on hundreds of hours of rendering per VFX frame turn out way shorter.
                      Jokes aside, Chaos is pursuing a few avenues which are quite interesting.
                      Nightly builds are provided to the beta group, under NDA.
                      Feel free to apply!

                      P.S.: I did a short test too. Can anyone tell me how to enable importance sampling for IBL lighting in Octane for Max? I can't seem to get it to work (there's no dome light object, and no option in the bitmap loader when the HDR is used as a background, so I get the results you see below; the manuals are quite opaque, and I'd have thought something like that would be quite visible without trawling forums...). In the renders below, the background HDRI providing the light is a render of a V-Ray sky, and it matches for position and intensity. Octane is simply missing the very intense and very small sun disk. Notice that also means (potentially a LOT) faster convergence, due to fewer high-intensity samples bouncing around, but until I get it to correctly sample the background I'll never know.
                      All renders achieved the same sampling level (1024RPP) and were rendered on a GTX 980.
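
                      (For anyone curious what importance-sampling the background actually buys: roughly, you build a luminance-weighted distribution over the lat-long map and draw directions from it, so a tiny, hugely bright sun disk gets sampled on purpose instead of by luck. A bare-bones, engine-agnostic sketch with made-up numbers:)

[CODE]
import numpy as np

def build_env_distribution(env_rgb):
    """env_rgb: (H, W, 3) lat-long HDR map. Returns per-texel probabilities, their
    CDF and the map size. Texels are weighted by luminance and by the solid angle
    their row covers (sin(theta) shrinks towards the poles)."""
    h, w, _ = env_rgb.shape
    lum = env_rgb @ np.array([0.2126, 0.7152, 0.0722])
    theta = (np.arange(h) + 0.5) / h * np.pi
    weights = lum * np.sin(theta)[:, None]
    pdf = (weights / weights.sum()).ravel()
    return pdf, np.cumsum(pdf), (h, w)

def sample_env(pdf, cdf, size, u):
    """Pick a texel with probability proportional to its contribution and return the
    direction. (Converting the texel probability to a per-steradian pdf is omitted.)"""
    h, w = size
    idx = min(int(np.searchsorted(cdf, u)), len(cdf) - 1)
    row, col = divmod(idx, w)
    theta = (row + 0.5) / h * np.pi
    phi = (col + 0.5) / w * 2.0 * np.pi
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)]), pdf[idx]

# Toy check: a dim sky with one tiny, hugely bright "sun" texel.
env = np.full((64, 128, 3), 0.1, dtype=np.float32)
env[20, 40] = 50000.0
pdf, cdf, size = build_env_distribution(env)
sun_idx = 20 * 128 + 40
hits = sum(min(int(np.searchsorted(cdf, np.random.rand())), len(cdf) - 1) == sun_idx
           for _ in range(1000))
print(hits, "of 1000 samples land on the sun texel")   # uniform sampling: ~0.1 of 1000
[/CODE]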

                      [Attached renders: octane_2m50s.jpg, RT_BF_4m45s.jpg, RT_LC_1m35s.png]

                      P.P.S.: The tests I am doing are on gray shaders not because I haven't found colours yet, but because they make what is happening simpler to read (R, G and B become a single V, 0-255 ranges become 0-1, so the maths simplifies), and they further show precisely the lighting colour (times the shader's diffuse intensity, 0.75 in the renders above) when a surface is hit (i.e. the picked colour is the light's, not the shader's). If the basics are set right (i.e. LWF), then I can move on to converting more complex scenes and shading models. Doing it backwards would make the job of figuring things out very complex indeed.
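
                      (Spelled out with made-up numbers:)

[CODE]
# Reading the light back out of a grey-shaded, linear (LWF) render:
#   picked_value = light_colour * light_intensity * diffuse_albedo
albedo = 0.75
picked = (0.60, 0.57, 0.48)                  # hypothetical pixel from the render
light = tuple(v / albedo for v in picked)    # -> (0.80, 0.76, 0.64)
print(light)                                 # the light's colour * intensity at that point
# With a coloured shader the albedo differs per channel and the bookkeeping gets
# messier - which is exactly why the tests start on grey.
[/CODE]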
                      Last edited by ^Lele^; 12-08-2015, 02:44 AM.
                      Lele
                      Trouble Stirrer in RnD @ Chaos
                      ----------------------
                      emanuele.lecchi@chaos.com

                      Disclaimer:
                      The views and opinions expressed here are my own and do not represent those of Chaos Group, unless otherwise stated.



                      • #86
                        I'm just wondering: when did anyone ask to BAKE the LUT profile into the render?

                        I was thinking we were asking for the possibility of having a "list of presets".
                        Now, whether that "list of presets" contains some film response curves, your own home-made curves, or others you could buy on TurboSquid, whatever, it's all the same: we just want the possibility to switch through them and use a LUT based on a preset, without having to load them by hand.

                        As I said in my previous message, I would still be rendering linear and saving out linear. It's just that, for workflow purposes, having a list of presets would make it way easier.

                        For now, if you use any of the tools already present in the VFB (exposure, highlight burn, contrast, WB, hue/sat, curves, whatever), they all get baked into the image if you are not careful enough to uncheck them before saving (so you are already breaking LWF there... and I also demonstrated a few messages back that the curves are broken in the VFB).
                        BUT, LUTs are not baked into the saved image...

                        So why are we talking about baking any of that LUT info into the image and breaking LWF in the first place?

                        Now, about getting that film response curve data converted to LUTs: it would be really helpful if someone who has the skills to convert them could do it. I know I don't have them, and it seemed Grant didn't either, and that's exactly why I (at least; maybe Grant as well, but I don't want to speak in anybody else's name) would really appreciate it if Vlado or any other person from Chaos Group could convert them to LUTs.

                        Now, whether anyone then drops that data into the folder with all the presets available to the VFB, that's their internal workflow pipeline and personal preference.

                        I just don't get why it's so unbelievable to have a list of presets for LUTs...

                        Stan
                        Last edited by 3LP; 12-08-2015, 02:59 AM.
                        3LP Team



                        • #87
                          Originally posted by grantwarwick:
                          Come on Lele, try not to be bitter about it, haha! You seem like you're on your last legs.
                          Do you consider yourself an artist or a technician? In the film industry I can see how matching a filmed backplate with 100% accuracy, so a compositor can do the colour grading as an exact science, is the ultimate goal, but what about the guys who work with retouchers? We skew and warp the absolute crap out of everything, mostly in the hope of making something more visually pleasing. It's absolutely Instagramming things at a professional level.
                          If V-Ray wants to cater to the science crowd it can, but there's a massive market out there of people who simply don't give a shit about pure realism and want methods to make things look interesting and new. If having some adjusted colour curves can help with that, I don't see why it isn't a beneficial addition to the software.
                          You write this so elegantly, but I still read it as "more chances for messy people to get more things more wrong".
                          And it would all be fine if the resulting help requests then didn't come Chaos Group's way, leaving them free to support real issues with the software rather than issues with the user's setup or lack of foresight (see CLIPPING).
                          Do what you please with your creations, and feel as you please about the job you do, but know that it IS grounded in very solid and complicated maths, no different in nature from the maths that let us sail seas or count the passing of time precisely: optics and trigonometry.
                          Ignoring or disliking the fact (and the fact that you DO do that type of maths intuitively all the time, by evolutionary design) won't make it go away, nor will it make shadows go a way they ought not to, or reflections appear where they shouldn't.

                          I think the beta group could do with you (and Gavin) on board; why haven't you two lazy a**es applied yet? ^^
                          You would LOVE some of the stuff that's being worked on (under NDA, of course).
                          Not everything will work, which is the beauty of R&D, but the beta group is built and maintained to be very free in debate and in the expression of one's mind, and boy, do we ever disagree amongst ourselves, only to then find a synthesis...
                          Lele
                          Trouble Stirrer in RnD @ Chaos
                          ----------------------
                          emanuele.lecchi@chaos.com

                          Disclaimer:
                          The views and opinions expressed here are my own and do not represent those of Chaos Group, unless otherwise stated.



                          • #88
                            Edit: I am in the beta group and love reading through it, but it's for greater minds than mine, I feel, haha.
                            I'm not on the nightlies though!

                            Last edited by grantwarwick; 12-08-2015, 03:16 AM.
                            admin@masteringcgi.com.au

                            ----------------------
                            Mastering CGI
                            CGSociety Folio
                            CREAM Studios
                            Mastering V-Ray Thread



                            • #89
                              Originally posted by 3LP:
                              I just don't get why it's so unbelievable to have a list of presets for LUTs...
                              Stan
                              So let's assume you have a list of LUT presets, you choose one that you really like, then save out the image. You then need to apply the LUT in some other software, so how are you going to access the LUTs that are in the VFB presets list? Are they separate files that are installed with V-Ray? If so, why not just do a Google search for film LUTs and download some free ones, or buy some?

                              What you're really asking for is a set of free LUTs to be shipped with V-Ray, rather than a presets panel?
                              Check out my (rarely updated) blog @ http://macviz.blogspot.co.uk/

                              www.robertslimbrick.com

                              Cache nothing. Brute force everything.



                              • #90
                                Originally posted by Macker:
                                So let's assume you have a list of LUT presets, you choose one that you really like, then save out the image. You then need to apply the LUT in some other software, so how are you going to access the LUTs that are in the VFB presets list? Are they separate files that are installed with V-Ray? If so, why not just do a Google search for film LUTs and download some free ones, or buy some?

                                What you're really asking for is a set of free LUTs to be shipped with V-Ray, rather than a presets panel?
                                Ahaha, I'd love to just download some free ones, and I have, but the ones that ship with Octane aren't available as LUTs ANYWHERE.
                                admin@masteringcgi.com.au

                                ----------------------
                                Mastering CGI
                                CGSociety Folio
                                CREAM Studios
                                Mastering V-Ray Thread

