Strong VRAM increase over a few renderings
  • #1

    Hi savage309 and Peter.Chaushev,

    I have a complex train interior. When I start a first rendering, the GPU usage is approx. 6700 MB on GPU1 and 5900 MB on GPU2 (NVLink disabled, GPU bucket mode). These values are quite good; I can render my scene without problems. But after the rendering is finished, the VRAM usage doesn't go back down to the pre-render values. OK, maybe some data is cached, but every new rendering increases the VRAM usage further, and after a few renderings the limit is reached and sometimes Rhino crashes.

    (Also interesting in the VRAM usage graph: there is an ugly peak at the end of one rendering (see screenshot GPU-Z GPU1). I haven't seen this in other renderings (the next rendering in the graph has no peak). Generally, once the GPU pass starts (after the light cache), the VRAM usage climbs to a maximum value and stays there until the end. Maybe this kind of peak can be avoided.)

    My question is: could it be that we have some kind of memory leak? Could the VRAM usage be kept constant, so that I could render my scene several times without restarting Rhino? I hope a solution can be found, so that an 11 GB card can be used without problems over time.

    -Micha

    Legend for the VRAM usage table (see screenshot):
    S ... only Windows 10 is running, no other software in use
    S+R ... Rhino is started and the scene loaded
    S+R+V ... V-Ray is rendering the final bucket GPU pass

    Hardware: 2x 2080 Ti, NVLink disabled
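
    To tell caching apart from a genuine leak, it can help to log the used VRAM after every render and compare identical runs. A minimal sketch (not from this thread; it assumes nvidia-smi is on the PATH, as it is with a standard NVIDIA driver install):

      import csv
      import subprocess
      import time

      def vram_used_mb():
          """Return used VRAM per GPU in MB, queried via nvidia-smi."""
          out = subprocess.run(
              ["nvidia-smi", "--query-gpu=memory.used",
               "--format=csv,noheader,nounits"],
              capture_output=True, text=True, check=True,
          ).stdout
          return [int(x) for x in out.split()]

      # Append one timestamped row per call; run this after each render finishes.
      with open("vram_log.csv", "a", newline="") as f:
          csv.writer(f).writerow([time.strftime("%H:%M:%S")] + vram_used_mb())

    If the numbers climb monotonically across identical renders, that points to a leak rather than caching.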
    Last edited by Micha; 09-06-2019, 09:35 AM.
    www.simulacrum.de ... visualization for designer and architects

  • #2
    Today again: I tried to render some final images of my current project, and after three renderings the VRAM limit was reached and the Nvidia driver crashed at 11 GB. I feel quite lost. NVLink doesn't work and I don't know when it will. And standard multi-GPU eats up the VRAM more than expected. Please help.



  • #3
    Hi Micha,

    Does this occur with a specific project or with any (even newly created ones), and is the behavior consistently reproducible?
    Does the same occur with all three options of the GPU Textures > Resize parameter?
    Are you using custom render settings?



  • #4
    Hi georgi.georgiev,

    I'm not sure about different scenes, since so far I've been working on one big project in VfR4, and it originally comes from VfR2 (I think I reset the options some days ago). But I reset the options again and started a rendering at low resolution - everything was fine, as expected. Then I changed some settings and it was fine too. When I increased the output size to 4000x2400, the memory usage after the rendering finished was quite high. But even this result is quite stable after a few test renderings now, so it looks like the continuous increase is gone. A lot of VRAM seems to be needed to hold all the frame buffer channels; there are seven additional channels added by the V-Ray denoiser and other internal tools.
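
    For scale, a rough back-of-the-envelope estimate (my own figures, not from this thread; the per-pixel format is an assumption): if each channel of a 4000x2400 buffer is stored as four 32-bit floats, eight channels in total (the beauty pass plus the seven additional elements mentioned above) already exceed a gigabyte:

      # Hypothetical estimate of frame buffer memory at 4000x2400
      width, height = 4000, 2400
      bytes_per_pixel = 4 * 4       # RGBA, 32-bit float per component (assumed)
      num_channels = 8              # beauty + ~7 denoiser/internal elements
      total_mb = width * height * bytes_per_pixel * num_channels / 1024**2
      print("~%.0f MB" % total_mb)  # ~1172 MB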

    For now I'm under some pressure to get some projects done, but I will keep my eyes open for the issue.

    Thank you, Georgi.



  • #5
    vlado georgi.georgiev Here's one of my wild wishes: since the frame buffer needs so much VRAM to hold all the channels at high resolution, couldn't it be possible to assign a user-specified graphics card for this purpose? For example, I have 2x 2080 Ti and 1x 1080 Ti. For heavy scenes my goal is to use the 1080 Ti for display tasks and the 2080 Tis for rendering only. It would be great to assign the 1080 Ti to the frame buffer and the two 2080 Tis to rendering. That way, much larger scenes could be rendered, more independently of the frame buffer.




  • #6
    Looks like I found a way. In the Nvidia control panel I set the OpenGL GPU for Rhino to the 1080 Ti. It seems the VfR frame buffer is then bound to the 1080 Ti too, so I now have both 2080 Tis for rendering only. (I need to pin other programs like Photoshop to the 1080 Ti as well.) Maybe information about this kind of setup could be added to the V-Ray GPU help pages.
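
    One way to double-check such an assignment (my own suggestion, assuming nvidia-smi is available) is to look at nvidia-smi's per-GPU process table while Rhino is open; Rhino.exe should then appear under the 1080 Ti's GPU index:

      import subprocess

      # Print the driver's view of all GPUs and the processes attached to each;
      # the process table at the bottom shows which GPU index Rhino.exe uses.
      print(subprocess.run(["nvidia-smi"], capture_output=True, text=True).stdout)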



  • #7
    The VFB does not use any noticeable amount of VRAM to display the rendered output. I assume the change in performance you've noticed stems from choosing to render only with your 2080 Ti devices, excluding the 1080 Ti (available memory is capped by the device with the least amount of memory).
    As for the shared memory issue you have described, I understand our Support team is working on this. They should get back to you soon.

    If you wish to decrease the RAM usage when writing extremely large render output to disk, please take a look at the workflow described here: https://forums.chaosgroup.com/forum/...20#post1005520
    Note that in V-Ray Next for Rhino, script access to the scene has been exposed, so you can edit the scene directly.
    Make sure you've set the file output type in the Asset Editor to .EXR (.vrimg also works), then type _editPythonScript in the Rhino command line and run the following script:

      import rhinoscriptsyntax as rs

      # Get the V-Ray for Rhino plug-in interface
      vray = rs.GetPlugInObject("V-Ray for Rhino")
      # Switch the image sampler to bucket mode (type 1)
      vray.SetSceneValue("/SettingsImageSampler", "type", 1)
      # Write the raw image file to disk during rendering
      vray.SetSceneValue("/SettingsOutput", "img_rawFile", 1)
      # Do not keep a full-resolution VFB in memory (0 = no memory VFB)
      vray.SetSceneValue("/SettingsOutput", "img_rawFileVFB", 0)
      # Refresh the UI so the changed settings are picked up
      vray.RefreshUI()


    The options for img_rawFileVFB are as follows:
      0 - no memory VFB
      1 - full memory VFB
      2 - preview
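
    To return to the normal workflow afterwards, the same calls can simply be reversed. A small sketch reusing only the API shown above; the restore values are my assumption based on the option list (1 = full memory VFB) rather than documented defaults:

      import rhinoscriptsyntax as rs

      # Assumed defaults: stop writing the raw file, restore the in-memory VFB
      vray = rs.GetPlugInObject("V-Ray for Rhino")
      vray.SetSceneValue("/SettingsOutput", "img_rawFile", 0)
      vray.SetSceneValue("/SettingsOutput", "img_rawFileVFB", 1)
      vray.RefreshUI()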

    Kind regards,
    Peter
    Last edited by Peter.Chaushev; 20-06-2019, 07:27 AM.
    Peter Chaushev
    V-Ray for SketchUp | V-Ray for Rhino | Product Owner
    www.chaos.com



  • #8
    Thank you for the raw file suggestion; I will keep it in mind. For now it should work without it, so I suppose we have another problem.

    Both the 2080 Ti and the 1080 Ti have 11 GB of VRAM, so we should not be hitting the memory cap. I observed that the VRAM usage on the non-system GPU is quite low and stable. But the card that is used by the system and Rhino shows increasing VRAM usage - see the table in my first post - GPU1 is the card driving the monitor and Rhino. So I rendered only a few images and then it crashed. It looks like a memory leak in Rhino. My workaround of using the 2080 Tis for rendering only and the 1080 Ti for Rhino leaves more VRAM headroom for the leak. I think it's best I send you my file, where I see the problem.



  • #9
    Tested two of my largest scenes today and I don't understand it - the memory usage is stable now. Maybe a PC restart or a Windows update fixed the issue.
