Hi, firstly apologies if I present this in a confusing way, but I'm not very clued up on what I should be expecting from my render farm. This has to be something I've set up wrong, and I was wondering if anyone else has encountered it.
The problem is that distributed rendering is slower when I use the local machine plus my 2 render nodes than when I render on the nodes alone. I've attached the scene and an image which might help explain. Basically, I get 2m34s using the 2 render nodes only, and bizarrely, when I add my local machine to the render, it goes up to 2m42s.
My 2 render nodes are identical, each with an i7-5820K 3.3GHz CPU.
The local machine has an i7-4930K 3.4GHz (this is also the file server).
All three machines have 32GB of RAM.
I'm rendering with V-Ray 3.40.02, and the machines are connected via an unmanaged gigabit switch.
The scene is very basic: a 4K render, no textures, and all CPU threads sit at 100% while rendering.
I'm guessing it must be the switch or some other network issue, but I can't see what, and was wondering if anyone has any ideas.
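For what it's worth, here's a rough Python sketch I could use to sanity-check raw TCP throughput between the workstation and a node before blaming the switch (a stand-in for something like iperf; the port number and transfer size are arbitrary placeholders, not anything V-Ray-specific):

import argparse
import socket
import time

PORT = 50007          # arbitrary unused port (placeholder)
CHUNK = 1 << 20       # send in 1 MiB pieces
TOTAL = 512 * CHUNK   # 512 MiB total transfer

def server() -> None:
    # Receive bytes until the client closes, then report throughput.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("", PORT))
        srv.listen(1)
        conn, addr = srv.accept()
        with conn:
            received = 0
            start = time.time()
            while True:
                data = conn.recv(CHUNK)
                if not data:
                    break
                received += len(data)
            elapsed = time.time() - start
            print(f"received {received / 1e6:.0f} MB in {elapsed:.1f}s "
                  f"= {received * 8 / elapsed / 1e6:.0f} Mbit/s from {addr[0]}")

def client(host: str) -> None:
    # Blast zero-filled chunks at the server and report send-side throughput.
    payload = b"\0" * CHUNK
    start = time.time()
    sent = 0
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.connect((host, PORT))
        while sent < TOTAL:
            sock.sendall(payload)
            sent += CHUNK
    elapsed = time.time() - start
    print(f"sent {sent / 1e6:.0f} MB in {elapsed:.1f}s "
          f"= {sent * 8 / elapsed / 1e6:.0f} Mbit/s")

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--server", action="store_true")
    parser.add_argument("--client", metavar="HOST")
    args = parser.parse_args()
    if args.server:
        server()
    elif args.client:
        client(args.client)

Run it with --server on a render node, then --client <node-ip> from the workstation; a healthy gigabit link should report somewhere in the region of 900+ Mbit/s.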
I understand the info is limited, but any advice would be much appreciated.
Thanks