High RAM usage - tips please


  • #16
    Originally posted by squintnic View Post
32GB isn't enough for commercial arch vis stuff. The easiest and cheapest way around this is to buy more.
It doesn't seem that long ago that we had to fit everything in less than 3GB...



    • #17
I used to set my swap file to 4× the amount of RAM in the machine.
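
As a quick illustration of that rule of thumb, here is a tiny Python check with psutil (psutil and the printout are my own choice for the example, not anything from the post):

```python
# Illustrative check of the "swap = 4x RAM" rule of thumb on the current machine.
import psutil

ram_gb = psutil.virtual_memory().total / 1024 ** 3
swap_gb = psutil.swap_memory().total / 1024 ** 3
print(f"Installed RAM:            {ram_gb:.1f} GB")
print(f"Current swap / page file: {swap_gb:.1f} GB")
print(f"Suggested by the 4x rule: {ram_gb * 4:.1f} GB")
```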
      Kind Regards,
      Morne



      • #18
        Originally posted by tricky View Post
Does not having enough RAM have a more significant effect in progressive rendering?
I don't think it makes a difference. Regardless of which image sampler you're using, once you run out of memory the rendering will slow to a crawl either way.



        • #19
          Originally posted by squintnic View Post
32GB isn't enough for commercial arch vis stuff. The easiest and cheapest way around this is to buy more.
32GB is still sufficient for pretty much any arch viz job. I do find myself wishing for more occasionally these days, but the RAM slots of both my machines are full.



Certainly the cheapest way around this is to actually optimise your scenes a little. It takes some time, but it's good practice generally, however much RAM you have.

* Reduce the resolution of unnecessarily large textures. People tend to slap 8K textures in a scene when a 1K texture would be visually indistinguishable.

There are several applications, some free, that let you batch resize textures (see the resize sketch after this list). You can also use tiled EXR or .tx mipmapped textures, but I've found this slows rendering unacceptably.

* Use proxies.

* Make sure you are instancing instead of copying. There is a script somewhere that will replace copies with instances; I don't remember the name (see the pymxs sketch after this list).

* Optimise your geometry. People get lazy and just slap purchased models into their scenes without considering polycount. ProOptimizer does a good, low-effort job at this.

It helps to have lower-poly versions of trees for use in large forests, etc.

* Set your displacement and fur settings to a reasonable level for the scene. Adjust the density/edge length until it starts to look bad, then turn it back up until it looks good.

* Light cache can use a fair bit of RAM. It's not such a large proportion these days, but back when a workstation had 2GB it could be a significant share of the available RAM.

* Use .vrimg output if the images are super high-res, and disable the full-resolution VFB.

* Render via Backburner rather than directly in Max; that always used a couple of GB less RAM.
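
For the texture point above, the poster doesn't name a particular application, but the batch-resize idea itself is simple. Here is a minimal plain-Python sketch using Pillow (the folder names and the 1024 px cap are my own illustrative assumptions):

```python
# Hypothetical batch downsizer: walks a texture folder and caps every bitmap
# at a maximum edge length, writing the result to a parallel "resized" folder.
from pathlib import Path
from PIL import Image

SRC = Path("textures")          # original maps (assumed location)
DST = Path("textures_resized")  # downsized copies go here
MAX_EDGE = 1024                 # e.g. cap 8K maps at 1K

DST.mkdir(exist_ok=True)
for tex in SRC.glob("*"):
    if tex.suffix.lower() not in {".jpg", ".jpeg", ".png", ".tif", ".tiff", ".bmp"}:
        continue
    with Image.open(tex) as img:
        if max(img.size) > MAX_EDGE:
            # thumbnail() resizes in place and preserves the aspect ratio
            img.thumbnail((MAX_EDGE, MAX_EDGE), Image.LANCZOS)
        img.save(DST / tex.name)
```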
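
For the instancing tip, the script isn't named, but 3ds Max's built-in instanceReplace() function can do the same job. A minimal pymxs sketch, assuming you select the master object first and then the copies (that selection-order convention is mine, not from the post):

```python
# Hypothetical pymxs sketch, run inside 3ds Max (2017+ ships the pymxs module).
# Makes every other selected node an instance of the first selected node's
# geometry, using MAXScript's documented instanceReplace() function.
from pymxs import runtime as rt

sel = list(rt.getCurrentSelection())
if len(sel) >= 2:
    master, copies = sel[0], sel[1:]
    for node in copies:
        # replace each copy's base object with an instance of the master's
        rt.instanceReplace(node, master)
    rt.redrawViews()
    print("Instanced {} node(s) to {}".format(len(copies), master.name))
else:
    print("Select the master object plus at least one copy first.")
```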


For years I have specialised in rendering massive masterplans, etc. I did scenes of 10 sq km with golf courses full of trees and hundreds of buildings in 2GB of RAM.

My last major masterplan was a city with 80,000 trees and approximately 2,000 buildings, plus cars, planting, etc. I just squeezed it into 32GB without having to down-res the tree models; I could probably have cut RAM usage in half if I'd bothered with that step.


A more correct statement would have been "32GB isn't enough if you don't want to optimise".


But then, as Bobby (glorybound) demonstrated recently, if you have more RAM you get more careless about being neat: a building with trees and planting eating 128GB...


You need to tell your artists that if they want to render at full speed, their jobs must fit in 16GB. For most scenes this is quite doable with a few hours of tidying; for those that aren't, accept the slowdown on your 16GB machines.

This is slightly complicated by the fact that V-Ray tends to use more RAM if it's available, so it's harder to get a scene to fit in 16GB on a 32GB machine than it would be on a 16GB machine. My tips above, if followed, should help.



          • #20
Here is what I often find in scenes that use way more memory than one would expect:
Trees. Some are simply completely over the top. For example, we recently got the Evermotion 163 library; the largest tree is 1.6GB without maps, and it is not even a large tree. They have scanned trunks and tessellated leaves. That's nice if you want a macro shot of an ant climbing a tree, but it's simply too much for average arch viz. Those trees can often be optimised with ProOptimizer down to 10% of the vertex count without losing too much detail. The same goes for any other 3D assets: some have a TurboSmooth applied at render time, some have displacement applied through the material, and some have horrible materials in general that should be remade from scratch. (Without naming names, but especially the people who think physically based SSS on a non-volumetric leaf object is a good idea.)

Working clean and accurate. The most important point. I have people here who constantly exceed the memory of our render farm while others don't. At bottom it's pure laziness, excused with "I have a tight deadline". This may sound harsh, but that's simply my experience. The cause: imported objects are not checked (all of the points above apply), and the textures in use, especially texture size, are not checked. I recently saw Refamo go completely red; when I checked the scene, which was a pretty small one, just a room, it turned out the guy was using some kind of floor-tiling plugin that generated 14k JPGs for each floor tile. (A small map-audit sketch follows below.)

Using master scenes. When we have larger projects we usually make a master scene that contains everything. Once you have a camera fixed to be worked on, refined and rendered, delete absolutely everything you don't see and make a separate scene. Some people just don't do that and wonder why they blow up the farm. Also make sure all 2D CAD data is deleted before starting to work on materials, lighting and previews; it can consume a lot of memory and extra time while you work on a scene, and you can always merge it back from a previous scene if it's needed.

Modelling. I see a lot of people who build their models, especially houses, like toy blocks, resulting in huge numbers of objects and polys, sometimes thousands of single non-instanced objects for a simple house. Modelling facades with traditional poly modelling (build box, slice, inset, extrude, detach, snap-clone as instance, repeat as necessary) gives you absolutely clean and flexible models where a whole building is sometimes just a few hundred KB in size. 3ds Max is the best software imaginable for building facades and houses with box modelling. The people who think they are faster with, for example, SketchUp by building toy blocks should sit down and learn 3ds Max and what poly modelling really is.

From my experience, a standard arch viz scene, indoor or outdoor, up to the size of a commercial building and including lots of assets, trees, grass and people, should be easily renderable with 32GB of RAM if one is able to work clean. A good exercise: let people make real-time projects in Unreal Engine or Unity. There you have no choice but to work clean.
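
As a sketch of the texture-size check mentioned above, here is a small pymxs + Pillow audit that lists every bitmap the open scene actually uses and flags oversized ones. The 4K threshold, and the assumption that Pillow is installed for Max's Python, are mine:

```python
# Hypothetical scene audit, run inside 3ds Max via pymxs: report every map file
# the open scene uses whose resolution exceeds a threshold.
import os
from pymxs import runtime as rt
from PIL import Image   # assumes Pillow is available to Max's bundled Python

MAX_EDGE = 4096          # flag anything above 4K (illustrative choice)

for path in rt.usedMaps():           # MAXScript usedMaps(): all map files in the scene
    path = str(path)
    if not os.path.isfile(path):
        print("missing:", path)
        continue
    try:
        with Image.open(path) as img:    # only the header is read here
            w, h = img.size
    except OSError:
        continue                         # formats Pillow can't open (e.g. .exr, .tx)
    if max(w, h) > MAX_EDGE:
        mb = os.path.getsize(path) / 1024 ** 2
        print(f"oversized: {os.path.basename(path)} {w}x{h} ({mb:.1f} MB on disk)")
```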

One could argue: it costs too much time to check all my models, maps and all this stuff, so I'll just buy more memory and work a lot faster. Well, you won't. Working clean gives you a lot of speed advantages. Working in the viewport is faster. Sending the rendering to the farm is a lot faster. Sometimes there are days when you do lighting, shading and preview rendering the whole day, and it makes a big difference whether sending a scene to the farm takes a minute or just a few seconds. Also, once you've optimised a model, or you have a resized version of a large texture, you can always reuse it in the future. So in the end, the time you spend up front pays off.

That being said, it doesn't mean you don't need more memory. More RAM, of course, is always better, because sometimes you get horrible models from customers or you need to render 30k images. Sometimes the requirements simply leave you no choice but to make huge scenes.



            • #21
              Originally posted by super gnu View Post
32GB is still sufficient for pretty much any arch viz job...
Some really helpful points there. Thank you for taking the time.
              Kind Regards,
              Richard Birket
              ----------------------------------->
              http://www.blinkimage.com

              ----------------------------------->



              • #22
                Originally posted by tricky View Post
A scene I have rendering right now is using 28.5GB on the local machine. It has been sent to all our render farm machines (some of which have only 16GB). This particular scene is rendering at 100% CPU load on every machine, so this suggests the 3ds Max session does not necessarily crash as stated by Dmitry.
Are you sure about that? It's most likely impossible to fit a 28GB scene into 16GB of RAM. You are probably seeing some Max overhead.

Regardless, if you try to fit a scene into RAM that isn't enough, it will not render. Your best bet is to increase the RAM in the machines (I know you said some can't take more than 16GB; consider retiring those).
                Dmitry Vinnik
                Silhouette Images Inc.
                ShowReel:
                https://www.youtube.com/watch?v=qxSJlvSwAhA
                https://www.linkedin.com/in/dmitry-v...-identity-name

