This is a new issue for me that I've been puzzling over all day.
I have some grass scattered with MultiScatter over a very small patch, and with it turned on I was getting a maximum CPU usage of 50%. The problem is that if it's in the scene but off camera, out of shot, the entire scene still renders at 50% of its potential speed.
Doing some tests, it seems to depend only on density:
At 1 object every 5mm, I get a maximum of 50% CPU; the same at any higher density, the max stays at 50%.
At 1 object every 10mm, CPU usage sits between 70% and 80%.
Any sparser than that (spacing above 10mm) and it starts to run at 100%.
This has no bearing on the number of objects scattered: I can scatter 1 object every 15mm over the equivalent of 1km square and it'll use 100% CPU, but if I do a tiny little 10cm x 10cm patch at 1 object every 5mm, I get no more than 50% CPU usage, even if it's totally out of shot and not being rendered! The quick calculation below puts rough numbers on how lopsided those two cases are.
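Just to illustrate the scale of the difference, here's a back-of-the-envelope count in Python. It assumes a uniform grid spacing, which isn't exactly how MultiScatter distributes things, so treat it as a rough sketch rather than actual instance counts:

```python
# Rough instance-count comparison (illustrative only; assumes a uniform
# grid spacing, which is a simplification of MultiScatter's distribution).

def instance_count(side_m: float, spacing_m: float) -> float:
    """Approximate number of scattered objects on a square patch."""
    per_side = side_m / spacing_m
    return per_side * per_side

# Tiny dense patch: 10cm x 10cm at one object per 5mm -> caps CPU at 50%
tiny_dense = instance_count(0.10, 0.005)     # = 400 objects

# Huge sparse field: 1km x 1km at one object per 15mm -> runs at 100%
huge_sparse = instance_count(1000.0, 0.015)  # ~4.4 billion objects

print(f"tiny dense patch:  {tiny_dense:,.0f} instances")
print(f"huge sparse field: {huge_sparse:,.0f} instances")
```

So the patch that throttles the render has roughly 400 objects, while the field that renders at full speed has billions; it really does look like spacing, not object count, is what matters here.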
Anyone had to deal with this before?
I'm going to do some tests changing the scale of my scene... but any other suggestions would be welcome.