Hi,
We're investigating Deep compositing in Nuke.
We render EXRs with V-Ray Next in 3ds Max 2020.
Here are a few questions you may be able to help with:
Is it wrong to think that deep fragments are computed for each render sample?
We don't quite understand the delta between deep.front and deep.back for a solid surface: is it the result of fragment merging during optimization (VRayOptionRE), keeping the closest and farthest values of the merged fragments?
How come solid surfaces sometimes produce low-alpha fragments closest to the camera?
We understand that merging fragments with similar Z-depth values can clean up these "almost transparent" fragments. Is that right?
Sadly, it seems that depending on the merge threshold, these fragments are kept, following a "depth banding" pattern. Could this be avoided? (These patterns can be seen in the linked pictures, which show the number of fragments per pixel; the number at the end of each file name is the merge threshold.)
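To make sure we're talking about the same operation: this is a minimal sketch of the kind of threshold-based merge we have in mind. It is our own assumption, not V-Ray's actual implementation; the Fragment fields and the threshold parameter are hypothetical names. Fragments whose fronts lie within the threshold are collapsed with an "over" composite, keeping the closest front and the farthest back.

```python
from dataclasses import dataclass

@dataclass
class Fragment:
    front: float   # deep.front
    back: float    # deep.back
    alpha: float

def merge_fragments(fragments, threshold):
    """Merge depth-sorted fragments whose fronts lie within `threshold`."""
    merged = []
    for f in sorted(fragments, key=lambda f: f.front):
        if merged and f.front - merged[-1].front <= threshold:
            prev = merged[-1]
            prev.back = max(prev.back, f.back)                       # keep farthest back
            prev.alpha = prev.alpha + f.alpha * (1.0 - prev.alpha)   # "over" composite
        else:
            merged.append(Fragment(f.front, f.back, f.alpha))
    return merged

frags = [
    Fragment(10.0, 10.1, 0.02),   # low-alpha fragment near the camera
    Fragment(10.05, 10.2, 0.9),   # solid fragment almost at the same depth
    Fragment(25.0, 25.0, 1.0),    # background
]
merged = merge_fragments(frags, threshold=0.1)
# the two near-coincident fragments collapse into one; the background stays separate
```

Is this roughly what happens, and if so, why would some near-transparent fragments survive the merge in the banded pattern we see?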
In Nuke, these "extra" samples are fine for a simple DeepMerge, but the depth channel computed by DeepToImage is corrupted, and many other uses of the deep information go wrong because of that.
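Here is our working assumption of why the depth channel gets corrupted (we don't know Nuke's actual DeepToImage algorithm, so this is only an illustrative guess): if the flattened depth takes the front of the frontmost sample, a single stray low-alpha fragment near the camera hijacks the depth value for the whole pixel.

```python
def naive_flatten_depth(samples):
    """samples: list of (front, alpha) tuples; returns the frontmost depth.

    Hypothetical flatten rule, used only to illustrate the failure mode;
    not a claim about Nuke's real DeepToImage behavior.
    """
    samples = sorted(samples, key=lambda s: s[0])
    return samples[0][0] if samples else float("inf")

clean = [(25.0, 1.0)]                 # one solid sample
dirty = [(10.0, 0.02), (25.0, 1.0)]  # spurious low-alpha sample in front

print(naive_flatten_depth(clean))  # 25.0
print(naive_flatten_depth(dirty))  # 10.0 <- depth hijacked by the 2% alpha fragment
```

If that intuition is right, any downstream node reading the flattened depth inherits the error.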
What strategy would you recommend?
Thanks for your support.