We have workstations with 32GB RAM, but our farm has a mixture of machines, some with 16GB and some with 32GB.
Certain artists are sending scenes to render (using DR), and when I monitor the farm I find that CPU usage on the render machines can be very sporadic, jumping between 0% and 15% activity. Simpler scenes (which need less RAM) seem to be far more efficient on the farm and use 100% CPU across the board. It's the more complicated scenes, with trees and grass and stuff like that, where the problems arise. Is this because RAM use is going beyond the limit of the PCs trying to render them?
I keep trying to explain to my artists that they need to watch how much RAM their scenes use when rendering, but it's not clear to me where the problems lie. Live projects become complex very quickly, and troubleshooting them 'on-the-fly' is hard work with demanding clients and looming deadlines.
We use quite a few trees in many of our scenes: some are the Evermotion ones, and more recently we have also been using Laubwerk trees. They are all very good, but they carry a lot of geometry and (probably) overly complex materials. My team are also using ForestPro to scatter areas of grass (despite my frequent warnings about render times etc.). They probably aren't overly concerned with the resolution of the textures in their materials either, so some could easily be 1,000-4,000 pixels, I'd guess. The resolutions we render at are also job dependent, but they can be anything up to 8k with several render elements.
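To put rough numbers on the texture side, here's the kind of back-of-envelope check I mean. This is purely a sketch: the folder path is hypothetical, and the assumption that everything decodes to 8-bit RGBA is mine, not anything V-Ray reports about its actual in-memory format.

```python
# Back-of-envelope texture footprint check (needs Pillow: pip install Pillow).
# TEX_DIR is a hypothetical path; the 8-bit RGBA decode assumption is a guess.
import os
from PIL import Image

TEX_DIR = r"P:\projects\current\maps"  # hypothetical project maps folder
total_bytes = 0
for root, _dirs, files in os.walk(TEX_DIR):
    for name in files:
        if not name.lower().endswith((".jpg", ".jpeg", ".png", ".tif", ".tga")):
            continue
        path = os.path.join(root, name)
        try:
            w, h = Image.open(path).size  # reads the header only, no full decode
        except OSError:
            continue  # skip unreadable/corrupt files
        decoded = w * h * 4  # width x height x 4 channels at 8 bits each
        total_bytes += decoded
        if max(w, h) >= 4000:
            print(f"{w}x{h}  ~{decoded / 1024**2:4.0f} MB decoded  {name}")
print(f"Estimated decoded footprint of all textures: {total_bytes / 1024**3:.1f} GB")
```

On that assumption a single 4000x4000 bitmap decodes to roughly 61MB, so a few dozen of them spread across tree and grass materials adds up quickly.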
Can anyone give guidance as to what drives the main RAM use when rendering? Some people suggest making V-Ray proxies out of the trees. Others say that ForestPro creates its own proxies that do the same thing. Some say that rendering to a VRIMG file (particularly for high-res renders) can help. Do different GI methods use more RAM in certain high-poly, complex scenes? Should we be telling grass/trees etc. not to generate GI?
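On the VRIMG point, the arithmetic alone suggests why it might help at high resolution. Again just a sketch with assumed numbers (an 8k-ish frame, 32-bit float RGBA, six elements), not anything measured from our jobs:

```python
# A full in-RAM frame buffer scales with width x height x channels x bytes
# x (beauty pass + render elements). All the figures below are assumptions.
width, height = 8000, 4500   # "8k" output at an assumed aspect ratio
channels, bytes_per = 4, 4   # RGBA at 32-bit float per channel
buffers = 1 + 6              # beauty pass plus 6 render elements

gb = width * height * channels * bytes_per * buffers / 1024**3
print(f"~{gb:.1f} GB just to hold the frame buffers in RAM")  # ~3.8 GB here
```

If rendering straight to a .vrimg really does write buckets to disk rather than keeping the whole image resident, that's a few GB clawed back on an 8k job before the scene itself has used anything.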
It would be great if there were some statistics somewhere that said something like:
"Right mate, you are using some really high poly trees here that are using 12GB just to start rendering. Why don't you optimise them a bit?"
"Mmm...did you know that you are using some bitmaps that are 4k in size. Try reducing those to 1200 pixels and it'll save a shed-load of RAM."
"You are rendering an 8k image with 6 render elements. If you render to a VRIMG file instead, you will save 9GB RAM"
"Those trees should really be in a Forest Pro object and that will save 6GB RAM"
"Your method of calculating GI will use 13GB RAM. Consider using LC/BF instead"
In essence, is there a way to find out which parts of a scene are causing the biggest spikes in RAM usage, so I can target my optimisation efforts appropriately?
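Failing a built-in report, would a scripted audit like this be a sane starting point? A rough pymxs sketch (3ds Max's Python, assuming a Python 3 build, i.e. Max 2021+); the face-count threshold and the use of file size as a crude stand-in for decoded texture RAM are my own guesses:

```python
# Rough scene audit, run from inside 3ds Max (MAXScript Listener, Python).
# Sketch only: the thresholds are arbitrary guesses, not V-Ray limits.
import os
import pymxs

rt = pymxs.runtime
FACE_LIMIT = 500000  # assumption: flag anything above half a million faces

# Heavy geometry: GetTriMeshFaceCount returns #(faces, verts) per node.
for obj in rt.objects:
    if rt.superClassOf(obj) != rt.GeometryClass:
        continue  # skip lights, cameras, helpers
    try:
        faces = int(rt.GetTriMeshFaceCount(obj)[0])
    except RuntimeError:
        continue  # objects that can't be evaluated as a mesh
    if faces > FACE_LIMIT:
        print(f"{obj.name}: {faces:,} faces")

# Big bitmaps: usedMaps() lists every texture file the scene references.
for path in rt.usedMaps():
    if os.path.exists(path) and os.path.getsize(path) > 10 * 1024 * 1024:
        print(f"{os.path.getsize(path) / 1024**2:6.1f} MB  {path}")
```

Even something that crude would let me hand artists a list of offenders before a job hits the farm, rather than diagnosing it mid-deadline.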