I've noticed some weird behavior in the light cache calculation: it often takes significantly longer on a lower-resolution image than on a higher one. For example, on a very simple test scene (just a plane, a teapot, and a sun), the light cache takes a second or two to complete at 1920x1440. If I drop the resolution to, say, 640x480, it takes about 14-15 seconds. It stays this slow as I raise the resolution up to 1733x1300, then jumps back down to 1-2 seconds - so it's not scaling linearly, it seems to be hitting some threshold.

The light cache was at defaults with subdivs at 1500. If I raise the subdivs, the threshold resolution goes up as well: at subdivs 2000, the 1920x1440 LC takes ~30 seconds, but as soon as I hit 2310x1733 it takes just a few seconds again. In that test the entire 1920x1440 image took 40 seconds to render, while the 2310x1733 image took just 17 seconds - the higher-resolution shot finished before the lower one had even finished its light cache!
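Doing the arithmetic on those two thresholds, the jump seems to land right where the image's pixel count crosses subdivs squared: 1733x1300 is about 2.25M pixels, which is roughly 1500², and 2310x1733 is about 4.0M pixels, roughly 2000². Here's a quick sanity check (plain Python; the subdivs² relationship is just my own back-of-the-envelope guess from the timings above, not anything from the V-Ray docs):

```python
# Back-of-the-envelope check: in both tests the LC speeds up right
# around the point where the image's pixel count reaches subdivs^2.
# The resolutions and subdivs are from my timings above; the subdivs^2
# relationship is my guess, not documented behavior.

cases = [
    # (subdivs, last slow resolution, first fast resolution)
    (1500, (1733, 1300), (1920, 1440)),
    (2000, (1920, 1440), (2310, 1733)),
]

for subdivs, slow, fast in cases:
    samples = subdivs ** 2
    slow_px = slow[0] * slow[1]
    fast_px = fast[0] * fast[1]
    print(f"subdivs {subdivs}: subdivs^2 = {samples:,}")
    print(f"  still slow at {slow[0]}x{slow[1]} = {slow_px:,} px")
    print(f"  fast again at {fast[0]}x{fast[1]} = {fast_px:,} px")
```

In both cases the slowdown persists while the pixel count is below subdivs² and disappears once it crosses it, which is why the threshold resolution moves up when I raise the subdivs.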
Is this expected behavior? Is there any setting I can change to compensate?
BTW, this is on Max 2016 with V-Ray 3.40.02, 64 GB RAM, Windows 10, Xeon E5, 72 cores.