memory issue?


  • memory issue?

    Just trying the new RT GPU light cache feature. Using CUDA it gets all the way through calculating the light cache, then as soon as rendering starts my GPU driver crashes and restarts, with the screen going blank for a second or two.

    Is this a case of too big a scene? My card is (IIRC) a GTX 680 with 4 GB RAM, and admittedly I just enabled this test on the scene I'm working on, which is quite heavy and in no way set up for RT GPU. I'd expect slightly more graceful behaviour in this case, however (i.e. print a warning and stop the render before it crashes).


    Any tips? I'm just downloading the latest GeForce driver to see if it helps; mine is 332.21.

  • #2
    Does it work on a simpler scene? Not sure if it's a VRAM issue, but it could be; we don't handle this very gracefully right now.

    Best regards,
    Vlado
    I only act like I know everything, Rogers.



    • #3
      Well, it worked when I hid all the geometry apart from a set of external walls, but then I saw that the spotlight attenuation doesn't work, so I can't use it for this job anyway. So yes, but I've not got time to test thoroughly now.

      It would be good to have a display of used/available GPU RAM while rendering.

      Also, is there any possibility in the future of caching to system RAM, to get around GPU memory limitations? Or would the transfer rate be so slow as to render RT GPU useless?
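A back-of-the-envelope comparison suggests why out-of-core caching is a real performance worry. The figures below are approximate published peaks (a GTX 680's GDDR5 bandwidth is around 192 GB/s; PCIe 3.0 x16 tops out around 16 GB/s one-way), not measurements from this scene:

```python
# Rough comparison of on-card memory bandwidth vs. the PCIe link that
# out-of-core (system RAM) data would have to cross on every fetch.
# Both figures are approximate peak values, for illustration only.
GDDR5_BANDWIDTH_GBPS = 192.0   # GTX 680 on-card memory, approx. peak
PCIE3_X16_GBPS = 16.0          # PCIe 3.0 x16 link, approx. peak one-way

slowdown = GDDR5_BANDWIDTH_GBPS / PCIE3_X16_GBPS
print(f"A fetch over PCIe is roughly {slowdown:.0f}x slower than VRAM")
```

In practice caching schemes hide some of this latency, so the real cost depends on how often the renderer misses the on-card working set.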



      • #4
        In the latest versions there is an RT Mem Usage Stats readout at the very bottom of the V-Ray RT console window:
        (screenshot: RT Mem Usage Stats in the V-Ray RT console)
        Svetlozar Draganov | Senior Manager 3D Support | contact us
        Chaos & Enscape & Cylindo are now one!



        • #5
          Just out of interest, what is the best workflow for freeing up GPU memory for RT? I ask because despite having a 4 GB GPU I can only render one wall object from my production scene. Any more and it crashes out.

          I checked, and GPU-Z says I'm using 3.5 GB of GPU RAM before I even start RT, presumably for the Max viewport. This is despite having the wall object isolated in the viewport.

          Should I submit it as a Backburner job? Use DR? Any other tricks? I guess having two cards is the way to do it, really.
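The figures in the post make the budget problem concrete: a toy helper (the function name is hypothetical, not a V-Ray API) shows how little VRAM is left once the viewport has taken its share:

```python
def vram_headroom_gb(total_gb: float, used_gb: float) -> float:
    """Free VRAM left for the renderer after other consumers (viewport etc.)."""
    return max(total_gb - used_gb, 0.0)

# Figures from the post: a 4 GB card with ~3.5 GB already claimed
# by the 3ds Max viewport before RT even starts.
headroom = vram_headroom_gb(4.0, 3.5)
print(f"~{headroom:.1f} GB left for RT")  # ~0.5 GB
```

With only ~0.5 GB of headroom, it is plausible that a single wall object fits while anything more does not.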



          • #6
            Try exporting your scene to .vrscene and rendering it with Standalone in CUDA mode (-rtengine=5) to skip the memory-hungry viewport. Changing the preview mode in 3ds Max to bounding boxes could help too, if you don't like the export workflow.
            If it was that easy, it would have already been done

            Peter Matanov
            Chaos
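The export workflow described above might look like this on the command line. This is a hedged sketch: the scene file name is a placeholder, the exact flag spelling may differ between V-Ray versions, and -rtengine=5 is the CUDA engine selector mentioned in the reply:

```shell
# Hypothetical example: render an exported .vrscene with V-Ray Standalone
# using the CUDA RT engine, so 3ds Max and its viewport never touch the GPU.
vray -sceneFile="myscene.vrscene" -rtEngine=5 -display=1
```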
