Bucket rendering with progressive refinement of noise threshold?


  • Bucket rendering with progressive refinement of noise threshold?

    Is it easily possible? I have an image here that renders rather inefficiently when distributed across multiple machines with progressive sampling, even with a higher ray bundle size etc., which manifests as lower CPU loads on the slave machines (<80%). I'm talking about a preview rendering here, not a final one.

    What would be cool is if I could render bucket-distributed, but starting with a noise threshold of, say, 0.1, and then after each "completed" rendering at that threshold, render again at, say, 0.09. Ideally the previous samples could even be reused so the renderer doesn't have to recalculate everything: take the result of the 0.1 pass and refine it "further" to 0.09, so not too much CPU time is wasted in the process. The crème de la crème would be if saving the VFB image to history took the last completed pass instead of the "half 0.1 and half 0.09" image.

    What would you guys think about this?
    Software:
    Windows 7 Ultimate x64 SP1
    3ds Max 2016 SP4
    V-Ray Adv 3.60.04


    Hardware:
    Intel Core i7-4930K @ 3.40 GHz
    NVIDIA GeForce GTX 780 (4096MB RAM)
    64GB RAM


    DxDiag
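The refinement scheme proposed above can be sketched as a toy Monte Carlo loop. This is purely hypothetical: V-Ray exposes no such API, and `render_progressive`, the per-pixel sampler, and the standard-error noise estimate are all made-up stand-ins. The point is that nothing gathered at the looser threshold is thrown away when the threshold tightens.

```python
import random

def render_progressive(pixels, thresholds, sample_fn, batch=16):
    """Toy progressive refinement: tighten the noise threshold step by step,
    reusing all samples gathered at the looser thresholds (hypothetical)."""
    # Per-pixel running sums, so earlier passes are never discarded.
    state = {p: {"n": 0, "sum": 0.0, "sq": 0.0} for p in pixels}
    completed = []  # snapshot of the image after each fully converged pass

    for thr in thresholds:  # e.g. [0.1, 0.09, 0.08, ...]
        for p in pixels:
            s = state[p]
            while True:
                if s["n"] > 1:
                    # Noise estimate: standard error of the per-pixel mean.
                    mean = s["sum"] / s["n"]
                    var = max(s["sq"] / s["n"] - mean * mean, 0.0)
                    if (var / s["n"]) ** 0.5 <= thr:
                        break  # pixel already meets this threshold
                for _ in range(batch):
                    v = sample_fn(p)
                    s["n"] += 1
                    s["sum"] += v
                    s["sq"] += v * v
        # Pass complete at this threshold: snapshot it, so "save the last
        # completed pass" (the OP's VFB-history wish) is possible.
        completed.append({p: state[p]["sum"] / state[p]["n"] for p in pixels})
    return completed
```

Each entry in `completed` is a whole image at a uniform noise level; tightening from 0.1 to 0.09 only tops up the samples each pixel still needs.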

  • #2
    I am wondering if that would be incredibly inefficient?
    http://www.jd3d.co.uk - Vray Mentor



    • #3
      Originally posted by JD3D_CGI
      I am wondering if that would be incredibly inefficient?
      Hence not deleting the already gathered samples on the second pass and so on. In theory it could be slightly less efficient, but not THAT much. I'd use it for previews only, too.



      • #4
        I have a feeling the samples from the first pass, at a noise level of 0.1, would change for a level of 0.09, so you probably wouldn't want to keep them, even for preview purposes, since the result would likely be very inaccurate. Pretty much like the question of keeping the light cache after the first run when using IPR, instead of recalculating it every time you change something.



        • #5
          Art48 I kinda wonder about that stuff too. When you use noise thresholds to define quality, there's a constant cycle: shoot rays, average the results, and check whether they meet the noise threshold. If they don't, repeat the process, average, and test again - kinda like how V-Ray keeps stepping up through AA levels until it meets your target. The idea here would be dumping out the results of each cycle even before it meets the final standard. You'd have to find out what the memory overhead and speed hit of that offload / viewer update would be, and see if it's worth it, though.
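The shoot/average/test cycle described above can be modelled in a few lines. Again this is a sketch under assumptions, not V-Ray's actual sampler: `adaptive_cycles`, the ray counts per AA level, and the `on_cycle` callback (standing in for the viewer update whose overhead is questioned above) are all invented for illustration.

```python
import random

def adaptive_cycles(sample_fn, noise_threshold, max_aa=6, base=4, on_cycle=None):
    """Toy shoot/average/test loop: each AA level shoots more rays,
    averages, and tests the running noise estimate against the threshold.
    on_cycle (hypothetical) receives the intermediate result each cycle,
    like dumping partial images to a viewer."""
    n, total, sq = 0, 0.0, 0.0
    for aa in range(1, max_aa + 1):
        for _ in range(base * 2 ** aa):      # ray count doubles per AA level
            v = sample_fn()
            n += 1
            total += v
            sq += v * v
        mean = total / n
        var = max(sq / n - mean * mean, 0.0)
        noise = (var / n) ** 0.5             # standard error as noise proxy
        if on_cycle:
            on_cycle(aa, mean, noise)        # the offload / viewer-update cost
        if noise <= noise_threshold:
            return mean, aa                  # converged at this AA level
    return total / n, max_aa                 # ran out of AA levels
```

Because samples accumulate across levels, the intermediate dumps cost only the copy/transfer in `on_cycle`, not any re-rendering - which is exactly the overhead that would need measuring.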



          • #6
            The BDPT engine worked like that - "progressive bucketing".
            In essence, though, it's no different from progressive rendering: that too is done in mini-buckets of a sort, in spiral order.
            If you try to render with a high min AA you'll see it.
            They just lack the bucket outline.
            Lele
            Trouble Stirrer in RnD @ Chaos
            ----------------------
            emanuele.lecchi@chaos.com

            Disclaimer:
            The views and opinions expressed here are my own and do not represent those of Chaos Group, unless otherwise stated.



            • #7
              Ah I see, thanks for the info Lele! Makes sense. What would still be really useful is some kind of mechanism to save the last actually completed pass of the image rather than the current state, so I get similar quality all over.



              • #8
                Originally posted by Art48
                Ah I see, thanks for the info Lele! Makes sense. What would still be really useful is some kind of mechanism to save the last actually completed pass of the image rather than the current state, so I get similar quality all over.
                So basically, you want progressive rendering to store the last completed pass, so that if you decide to stop it you don't get a partial pass stored in the image - only the last complete pass. For example: it's done 300 passes and is in the middle of pass 301; you stop it, and it reverts to the 300-pass image before you save.

                On the subject of progressive bucket rendering: you could do it if you had enough threads running, say a few hundred or perhaps a thousand - eventually you'd have so many buckets that the entire image would be processed at once. But as Lele said, in essence that is what progressive rendering is: very, very small buckets (progressive micro-bucket rendering).
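The "revert to the last complete pass" behaviour described above amounts to double-buffering. A minimal sketch, assuming a toy progressive renderer (the class, its incremental-mean update, and the `stop_after` interrupt flag are all hypothetical, not anything V-Ray provides):

```python
import random

class PassBuffer:
    """Keep a snapshot of the last fully completed pass, so a stop mid-pass
    can save that instead of a half-updated frame (hypothetical sketch)."""

    def __init__(self, pixels):
        self.current = {p: 0.0 for p in pixels}   # running per-pixel mean
        self.counts = {p: 0 for p in pixels}
        self.last_complete = None                 # image at last finished pass

    def run_pass(self, sample_fn, stop_after=None):
        """One full pass; stop_after simulates the user hitting Stop mid-pass."""
        for i, p in enumerate(self.current):
            if stop_after is not None and i >= stop_after:
                return False  # interrupted: last_complete stays untouched
            self.counts[p] += 1
            # Incremental mean update with one new sample.
            self.current[p] += (sample_fn(p) - self.current[p]) / self.counts[p]
        self.last_complete = dict(self.current)   # snapshot the finished pass
        return True

    def save_image(self):
        # Prefer the last completed pass over the partially updated frame.
        return self.last_complete if self.last_complete else dict(self.current)
```

The cost is one extra framebuffer copy per pass - e.g. stopping in the middle of pass 301 saves the pass-300 snapshot, giving uniform quality across the frame.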



                • #9
                  Does Corona's progressive render work the same way in relation to distributed rendering? The basis of Art's suggestion is how the progressive render could become more efficient when using DR. I don't bother with it, as I find it far too inefficient; I just use a single node for test shots. I usually have quite a few, so I can spread them across the farm (one image per node), which is more effective.



                  • #10
                    I have no idea about Corona's DR, but if the wish is to save an incremental image, then there's resumable rendering.
                    While in itself it only has a save timer, one could conceivably use progressive mode to save out images at a given sample rate, and restore them for finishing later on.
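The save-and-restore workflow mentioned above boils down to checkpointing the accumulated sample state. A toy illustration (the functions, the JSON format, and the per-pixel sums are all invented stand-ins for V-Ray's actual resumable-rendering files):

```python
import json
import os
import random
import tempfile

def checkpoint(state, path):
    """Write the running per-pixel sums so rendering can resume later
    (a toy stand-in for a resumable-rendering file)."""
    with open(path, "w") as f:
        json.dump(state, f)

def resume(path):
    """Load a checkpoint if one exists, else start fresh."""
    if not os.path.exists(path):
        return {}
    with open(path) as f:
        return json.load(f)

def accumulate(state, pixels, sample_fn, samples):
    """Add more samples on top of whatever the checkpoint already holds."""
    for p in pixels:
        s = state.setdefault(p, {"n": 0, "sum": 0.0})
        for _ in range(samples):
            s["n"] += 1
            s["sum"] += sample_fn(p)
    return state
```

Because only sums and counts are stored, restoring and rendering further is seamless: the resumed session simply keeps accumulating where the saved one stopped.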
