Hi,
I've been using V-Ray 3.7 for ages and have just updated to V-Ray 5. I thought this would be a really great upgrade, but I've been rather disappointed with the CPU distributed rendering.
The issue manifests itself the same way as it did when I first started with *V-Ray 3* - though looking back, it was actually V-Ray 2 (see the post linked below). There was a workaround provided back then, but annoyingly it doesn't seem to help this time.
I send a render using CPU distributed rendering (using V-Ray 5 SP2) to my 5 render nodes. Once they have registered the job and loaded the scene, I get a vast array of buckets arriving from the render nodes. All good so far. However, the buckets from the nodes slowly disappear until they seem to have all gone, then they all reappear at once. This is REALLY inefficient, as it means the nodes seem to have to wait until the slowest core on the slowest machine has finished before any of them receive a new set of data for the next bucket. My local machine renders as expected: once a bucket has rendered, a new bucket is allocated to that core and it carries on.
Previously, the solution was a render-time script that set the number of active local cores to one core (two virtual cores) fewer than the total available locally. This freed up a core to handle the distribution allocation, and it worked very well. The same script no longer works.
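For reference, the old workaround was roughly this (a minimal sketch from memory, assuming 3ds Max and that the thread count is still exposed through the system_numThreads render property, as it was in V-Ray 3.x):

vr = renderers.current
if isProperty vr #system_numThreads then
(
    -- sysinfo.cpucount reports logical cores, so subtract 2 to leave
    -- one physical core (two hyper-threads) free for DR bucket dispatch
    vr.system_numThreads = sysinfo.cpucount - 2
)

As far as I recall, setting the property back to 0 afterwards restores the default of using all threads.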
This is a link to the previous post (many years ago) where I had more or less the same problem: https://forums.chaosgroup.com/forum/...ses-now-slower
I hope someone can help with this, as once again it's a bit disappointing for a new version of V-Ray to be going slower.
Cheers,
Bill