I've been having a hard time recently trying to get certain files to render.
My primary machine is an older one with an Intel i7 processor and 24GB of RAM; I also have three old render slaves, each with dual Xeon E5 processors and 16GB of RAM.
I use the render slaves, when they work, via Distributed Rendering, since Backburner won't zip the 3ds Max file: the combined geometry and maps exceed Backburner's 32-bit file size limit.
Some of the errors I am running into:
One of the files has a lot of Forest Pack Pro foliage objects and uses a large amount of memory, simply too much for the slave computers; they've never been able to join that render. OK, so be it. For that foliage file I render on my main computer alone, which is able to handle it.
1) However, when I render a high-resolution image with resumable rendering enabled and the output set beforehand to save as an .exr, V-Ray finishes the image, the render elements, and the denoiser pass, saves everything out, and then instantly closes all of 3ds Max, dumping me out to a blank Windows desktop. Why does V-Ray shut down 3ds Max when it finishes saving? If I do not set the output file beforehand and simply let the render finish without saving, V-Ray does not close 3ds Max.
2) The resumable .vrimg is really helpful, especially since I'm getting a lot of crashes during rendering. However, at one point during the night the computer crashed, probably with a hard blue screen of death (I couldn't tell, since I had turned the monitors off and they would not wake to show me the screen), and I was forced to do a cold reboot. After this, the .vrimg would not reload; it seems to have been corrupted. I could not get anything out of it, either by loading it into the V-Ray frame buffer or by using the standalone vrimg-to-exr converter. If a .vrimg gets killed mid-write like this, is there no way to recover what has already rendered?
3) Using only my primary computer to render a 4,000 x 3,000 pixel image, I tried pre-calculating the light cache. I even turned off the V-Ray frame buffer and set it to render directly to a .vrimg to save RAM. V-Ray started the process but hung at the "Building Embree static accelerator" stage. Am I simply too low on memory?
4) With a slightly smaller file, I can use the slave nodes alone (without the main computer, as things seem more stable when the host machine isn't also rendering). However, sometimes the slaves receive the distributed files, the V-Ray messages window on my host machine notes in purple text that the slave machines are starting, and then one or more of them crashes 3ds Max or simply never manages to join, while that node sits near 98% RAM usage, swapping. Is there a message window I can open on those slave machines to see what they are doing in real time?
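Lacking a live window, the workaround I've been considering is tailing the spawner log on each slave over a shared folder. A minimal sketch of what I mean (plain Python, nothing V-Ray specific; the log path in the usage comment is my guess, not V-Ray's documented location):

```python
import time

def read_new_lines(path, offset):
    """Return (new_lines, new_offset): lines appended after byte `offset`."""
    with open(path, "rb") as f:
        f.seek(offset)
        data = f.read()
    lines = data.decode("utf-8", errors="replace").splitlines()
    return lines, offset + len(data)

def tail(path, poll_seconds=1.0):
    """Poll a log file and print lines as they are appended, tail -f style."""
    offset = 0
    while True:
        lines, offset = read_new_lines(path, offset)
        for line in lines:
            print(line)
        time.sleep(poll_seconds)

# Hypothetical usage pointed at a slave's log over a network share
# (the path is an assumption on my part):
# tail(r"\\SLAVE01\Temp\vrayspawner_log.txt")
```

But if there's a built-in messages window on the slave side, that would obviously be better than polling log files.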
5) For distributed rendering, are files lighter and easier to render with brute force/brute force, brute force/light cache (precalculated and saved by the host machine), or irradiance map/light cache?
6) Is there a way to see whether the geometry or the maps are causing the most memory usage?
7) If none of my texture bitmaps (mostly JPGs, some TIFs, almost no EXRs) are huge (all are likely 4K or smaller), would converting them all to tiled .tx files be very useful or not?
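For context on the sizes involved, here's my back-of-the-envelope arithmetic (plain Python, not a V-Ray tool): compression only shrinks the file on disk, so even a small JPG costs its full uncompressed footprint once loaded.

```python
def texture_memory_mb(width, height, channels=4, bytes_per_channel=1):
    """Estimate the uncompressed in-RAM footprint of one bitmap, in MB.

    JPG/TIF compression only reduces disk size; once decoded, a texture
    costs roughly width * height * channels * bytes_per_channel.
    """
    return width * height * channels * bytes_per_channel / (1024 * 1024)

# A single 8-bit RGBA 4K texture:
print(texture_memory_mb(4096, 4096))        # -> 64.0
# A hundred of them, which a heavy scene could plausibly load:
print(100 * texture_memory_mb(4096, 4096))  # -> 6400.0
```

If that estimate is in the right ballpark, a few hundred full-size textures alone could push the 16GB slaves into swapping, which would match the 98% RAM behaviour I'm seeing.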
8) If I am not using any tiled .tx or tiled .exr bitmaps, can I set the tiled image cache to something very low (like 1) but not zero (since zero actually means unlimited) and so increase the memory available for my render?
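In case it matters for the answer, the conversion I'd be scripting is along these lines, using OpenImageIO's maketx (the tile size and file names here are placeholders, and the exact flags are my assumption):

```shell
# Convert one source bitmap to a tiled, mipmapped .tx file.
# maketx ships with OpenImageIO; adjust tile size and paths as needed.
maketx --tile 64 64 input_diffuse.jpg -o input_diffuse.tx
```

I'd batch that over the whole texture folder if it's actually worthwhile for sub-4K sources.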
9) Would converting large amounts of my geometry to V-Ray proxies help much?
10) Any other tips to get these renderings to be more stable and work consistently?
Thanks,
Matt