denoiser render elements not saving


  • denoiser render elements not saving

    Hi, I'm trying to save denoiser elements using the "only save elements" option. However, if I choose "separate render channels" it does not save those elements. It saves all the others (MultiMatte etc.), and I can see them in the dropdown in the VFB, so I know it's rendering them. If I choose "V-Ray raw image" it will save them in a multichannel EXR, but then I don't get my VFB color corrections.

    This is a problem. I know I can batch-process the EXRs afterwards in the VFB to apply the corrections to them, but it takes 20+ seconds per frame to load and save each EXR, which on a 3000-frame render is really painful.
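    For scale, the batch-correction overhead described above adds up to a large chunk of wall time; a quick sketch of the arithmetic (numbers from the post):

```python
# Rough cost of batch-applying VFB corrections to every EXR afterwards:
# 20+ seconds per frame, 3000 frames.
seconds_per_frame = 20
frames = 3000

total_hours = seconds_per_frame * frames / 3600
print(f"{total_hours:.1f} hours of pure load/correct/save time")  # ~16.7 hours
```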


    On a related note, vdenoise does not appear to be using the GPU properly with the default denoiser. If I choose hardware acceleration, it takes 5 seconds longer than CPU only, and the GPUs in Task Manager sit at about 1% usage. Having said that, the CPU is also at 1% usage, so I'm not sure where the denoise is being processed.

    CPU is a 64-core 3990X; GPUs are 2x "antique" Titan X Maxwell 12 GB.

  • #2
    Originally posted by super gnu
    Hi, I'm trying to save denoiser elements using the "only save elements" option. However, if I choose "separate render channels" it does not save those elements.
    That's expected. Post-render denoising is meant to work with multichannel image formats only.

    Originally posted by super gnu
    If I choose "V-Ray raw image" it will save them in a multichannel EXR, but then I don't get my VFB color corrections.
    Is "Save VFB color corrections to RGB channel" active?

    Originally posted by super gnu
    On a related note, vdenoise does not appear to be using the GPU properly with the default denoiser. If I choose hardware acceleration, it takes 5 seconds longer than CPU only, and the GPUs in Task Manager sit at about 1% usage. Having said that, the CPU is also at 1% usage, so I'm not sure where the denoise is being processed.
    Hardware acceleration forces VRayDenoiser to use any compatible GPUs to assist in the denoising. Judging by my simple tests (using the default denoiser), it speeds things up significantly. Otherwise, are there any errors in the log?
    Aleksandar Hadzhiev | chaos.com
    Chaos Support Representative | contact us



    • #3
      Thanks for the reply. I didn't know about the "Save VFB color corrections to RGB channel" tickbox; very happy that's there.

      Regarding the standalone denoiser, here is the log. It does seem to be using the GPU, but utilisation is extremely low, unless Task Manager is misreporting.

      It's definitely slower than a 3990X, but since it's quite an old card, that might be expected?

      (Attached image: debug log.jpg)


      • #4
        Here is the GPU usage in Task Manager:


        (Attached image: gpu usage.jpg)



        • #5
          I think you might have to set that view to display CUDA or Compute_0 or something like that, but I don't see those options in my Task Manager at the moment. Maybe they only show up when in use? (It's the little popup where it says "3D", "Copy", "Video Encode", etc.)



          • #6
            Ahh yes, that was it.

            I needed to show "Compute_0".

            (Attached image: gpu usage.jpg)

            Still not amazing utilisation, in the sense that while it does go to 100%, it drops to 0 every second or so as it jumps from strip to strip in the image.

            I wonder if it's possible to manually increase the "chunk size" the GPU processes, so it gets higher utilisation?
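            The see-saw utilisation pattern above is consistent with fixed per-strip launch/transfer overhead between bursts of compute. A toy model (all numbers illustrative, not measured from vdenoise):

```python
# Toy model: if each strip incurs a fixed setup overhead before the GPU
# computes, average utilisation drops as strips shrink. Illustrative only.
def utilisation(compute_s: float, overhead_s: float) -> float:
    """Fraction of wall time the GPU spends actually computing."""
    return compute_s / (compute_s + overhead_s)

# Small strips: 0.2 s compute + 0.2 s overhead each -> 50% busy.
# Larger strips: 1.0 s compute + the same 0.2 s overhead -> ~83% busy.
print(utilisation(0.2, 0.2), utilisation(1.0, 0.2))
```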


            In the end though, I think I probably just need a new GPU.

            Or is circa 50 seconds a frame normal for "strong" denoising at 4K with blend frames set to 1? (With blend set to 2, I get 80 seconds a frame.)

            I don't think I have other bottlenecks; the files are on a 10 GB/s SSD RAID array.

            I also ask what is "normal" because, with 2250 frames to do, I'm considering sending the vdenoise app to my client along with the frames. He has an RTX 3090, so I assume it would be noticeably faster? I hope so.
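            At the per-frame times quoted above, the remaining sequence works out to (numbers from the post):

```python
# Back-of-envelope denoise time for the remaining 2250 frames
# at the measured per-frame speeds on the Titan X pair.
frames = 2250
for label, secs in [("blend 1 (~50 s/frame)", 50),
                    ("blend 2 (~80 s/frame)", 80)]:
    print(f"{label}: {frames * secs / 3600:.1f} h")
# blend 1 -> 31.2 h, blend 2 -> 50.0 h
```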


            Last edited by super gnu; 16-12-2023, 04:17 AM.



            • #7
              I noticed this same thing a few months ago, so I set up an older machine with a 2080 Ti to be used pretty much just for denoising. I configured it in Deadline to run four instances of the denoiser at the same time, and put a 10-gig NIC in it off eBay.

              With a couple of scenes with a ton of elements (usually from Light Mix) I had to reduce that to three at once, but otherwise it has worked well. It's not quite four times as fast, mind you, but it is much faster than a single instance.

              I then configured Deadline to run it as a frame-dependent job (+/-1 frame for the frame blending), so it can run concurrently as the job is rendered.
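              The dependency logic behind that +/-1 setup is simple: a frame can only be denoised once its temporal neighbours have rendered. A minimal sketch (function name is mine, not a Deadline API):

```python
# Which rendered frames must exist before `frame` can be denoised,
# given temporal frame blending of +/- `blend` frames (e.g. blend=1).
# Hypothetical helper for illustration; Deadline expresses this as a
# frame-dependent job rather than explicit lists like this.
def frames_needed(frame: int, blend: int, first: int, last: int) -> list:
    lo = max(first, frame - blend)   # clamp at the sequence start
    hi = min(last, frame + blend)    # clamp at the sequence end
    return list(range(lo, hi + 1))

print(frames_needed(10, 1, 0, 100))  # [9, 10, 11]
print(frames_needed(0, 1, 0, 100))   # [0, 1] (clamped at the start)
```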

              Our scenes tend to have a lot of layers and only run near the speed of the GPU on the latest hardware, which is of course busy rendering. Originally I tried to use the CPU to denoise because we have so many machines without good GPUs, but this was much slower, especially when trying to offload it to older-generation hardware to keep the fast machines rendering V-Ray. That was why I set up some older GPUs. Then I found that one box with four instances could handle a lot of it. There are a few other older machines with things like a 1080 Ti, 980 Ti, etc. that kick in if needed as well. The biggest time saver is running it as a frame-dependent job.

              Note our V-Ray renders usually take a long time per frame, so there is time to do the denoising while the next frames are rendering. If it's a scene that renders fast, the denoising may fall behind, but then the nodes with big graphics cards can come in at the end and clear it up.
              Last edited by Joelaff; 16-12-2023, 10:46 AM.



              • #8
                Thanks for the tip, especially useful since I have two identical GPUs, so I can set each worker to have a different GPU affinity.

                Edit: that doesn't seem to work. It appears to try to use the "best" GPU, as the app does normally; if a worker is set to a different GPU using affinity, it renders painfully slowly and does not appear to use much CPU, or any GPU.
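                One generic workaround worth trying when per-worker GPU affinity isn't honoured: restrict which devices each worker process can see via the CUDA_VISIBLE_DEVICES environment variable. That variable is standard CUDA runtime behaviour, but whether vdenoise respects it is untested here; the launcher below is a hypothetical sketch and the command line is a placeholder, not real vdenoise syntax.

```python
import os
import subprocess

def worker_env(gpu_index: int) -> dict:
    """Environment for a worker restricted to a single CUDA device.

    CUDA_VISIBLE_DEVICES is honoured by the CUDA runtime generally;
    whether a given denoiser build respects it is an assumption here.
    """
    env = dict(os.environ)
    env["CUDA_VISIBLE_DEVICES"] = str(gpu_index)
    return env

def launch_worker(gpu_index: int, cmd: list) -> subprocess.Popen:
    """Start one denoiser worker pinned to one GPU (command is a placeholder)."""
    return subprocess.Popen(cmd, env=worker_env(gpu_index))

# e.g. one worker per Titan X (placeholder args, not real vdenoise flags):
# launch_worker(0, ["vdenoise", "..."])
# launch_worker(1, ["vdenoise", "..."])
```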

                Last edited by super gnu; 18-12-2023, 04:02 AM.

