So I've been working with Thinkbox support to try to get to the root of an issue with network rendering using Deadline. In scenes that contain a Phoenix sim, the render nodes are restarting 3dsmax.exe between every frame. Obviously this slows things down quite a bit (approx. 45 sec per frame). The restarting itself happens on the Deadline side, but the *reason* it restarts appears to be changes that Phoenix is making to the scene in some way. I'm not sure if there's anything you guys can do or suggest, but I figured it would be worth posting here because I can't be the only one having this problem.
Here's the relevant bit from Thinkbox support:
I looked at the code, and talked to the Deadline developers. I have a suspicion, but we might have to do some tests.
When the Slave dequeues a job, it copies all necessary files to a local temp. folder. These include the current plugin (3dsmax) files from the Repository\plugins\3dsmax, as well as the job files that could include the MAX file (if SMTD is set to submit it with the job), auxiliary files, and even some asset files if they are included with the job, too.
Once the first task is completed, the Slave checks if another task of the same job is to be rendered. If that is the case, it checks to see if the content of the temp. folder still matches the content of the source folders where the plugin files came from. It is possible that one or more of the files copied locally have been modified while the previous task was running, in which case the temp folder has to be resynced, and the plugin has to be reloaded (what you experienced).
My suspicion is this: PhoenixFD uses a token system by default which lets it save its own external files (caches etc.) in sub-folders of the folder where the MAX scene is located. It is possible to set these paths to actual absolute paths on the network, but by default they are relative to the current MAX scene location. I wonder if a PhoenixFD object in the MAX file submitted with the job writes something in a sub-folder of the Deadline temp. folder on the Slave as a side effect of the scene being rendered? This could be confusing the folder content comparison performed before a new task, and cause the plugin to be restarted, but I will have to run some tests to see if that's really the case.
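To make the suspected mechanism concrete, here is a minimal sketch (not Deadline's actual code, and the folder and file names are hypothetical) of the kind of folder-content comparison the reply describes: snapshot the synced temp folder after copying, then re-check it between tasks. A Phoenix cache written into a sub-folder between tasks shows up as a difference and would force a resync and plugin reload.

```python
import os
import shutil
import tempfile


def snapshot(folder):
    """Map each file's path relative to `folder` to (size, mtime)."""
    state = {}
    for root, _dirs, files in os.walk(folder):
        for name in files:
            path = os.path.join(root, name)
            st = os.stat(path)
            state[os.path.relpath(path, folder)] = (st.st_size, int(st.st_mtime))
    return state


def needs_resync(before, after):
    """True if any file was added, removed, or modified since the snapshot."""
    return before != after


# Demo: simulate a Phoenix sim dropping a cache file into a sub-folder
# of the job's local temp folder while a task renders.
tmp = tempfile.mkdtemp()
try:
    with open(os.path.join(tmp, "scene.max"), "w") as f:
        f.write("scene")
    before = snapshot(tmp)  # state right after the Slave syncs the job files

    # Hypothetical default Phoenix output location: a sub-folder
    # relative to wherever the MAX scene currently lives.
    cache_dir = os.path.join(tmp, "scene_Phoenix_frames")
    os.makedirs(cache_dir)
    with open(os.path.join(cache_dir, "frame_0001.aur"), "w") as f:
        f.write("cache")

    after = snapshot(tmp)
    print(needs_resync(before, after))  # the new cache file flags a resync
finally:
    shutil.rmtree(tmp)
```

If this is what's happening, pointing the Phoenix output paths at an absolute network location (instead of the scene-relative default) would keep the caches out of the temp folder and should stop the comparison from ever seeing a difference.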