It seems z-depth is not calculated the same way in V-Ray (for Maya) vs Nuke.
V-Ray-generated z-depth below:
The Maya setup:
Problem: The same setup rebuilt in Nuke renders the plane as a single solid gray tone.
Explanation from The Foundry's devs:
ScanlineRender works like OpenGL and, for a flat polygon, respects the point of view when calculating depth, as you've mentioned. It's possible that V-Ray might have an option to change the Z-depth generation, which could help. Another possibility could be to generate, at shading time, a Z-buffer compatible with Nuke in an AOV channel. The following formula should do the job:

aov.z = 1 / P.z

where P is the sample point in camera coordinates.
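
To make that suggestion concrete, below is a minimal sketch of doing the conversion in post instead: it takes a V-Ray z-depth pass, which (as the renders above suggest) stores the radial camera-to-sample distance, and re-encodes it as Nuke's 1/z measured along the viewing axis. Everything here is an assumption of mine, not shipped V-Ray or Nuke API: the function name, the ideal pinhole-camera model, square pixels, and a principal point at the image centre.

import numpy as np

def radial_to_nuke_depth(radial, focal_mm, h_aperture_mm):
    """Convert a radial z-depth pass (camera-to-sample distance, as the
    V-Ray renders above appear to store it) into Nuke's ScanlineRender
    convention: 1 / z, with z measured along the camera's viewing axis.

    radial        : (height, width) array of camera-to-sample distances
    focal_mm      : focal length in mm
    h_aperture_mm : horizontal filmback width in mm
    """
    height, width = radial.shape
    mm_per_px = h_aperture_mm / width  # assumes square pixels

    # Filmback coordinates of each pixel centre, measured from the
    # principal point (assumed to sit at the image centre).
    u = (np.arange(width) - (width - 1) / 2.0) * mm_per_px
    v = (np.arange(height) - (height - 1) / 2.0) * mm_per_px
    uu, vv = np.meshgrid(u, v)

    # Cosine of the angle between the viewing axis and the ray through
    # each pixel of an ideal pinhole camera.
    cos_theta = focal_mm / np.sqrt(focal_mm**2 + uu**2 + vv**2)

    # Projecting the radial distance onto the camera axis gives planar
    # depth; Nuke then encodes it as 1 / z (the aov.z = 1 / P.z above).
    z_planar = radial * cos_theta
    return 1.0 / z_planar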
Now, my question – would it be possible to implement a slightly different z-depth render element in V-Ray that would generate a z-depth matching a Nuke 3D comp?

Right now, we are unable to get geometry in Nuke's 3D space to line up with a z-sliced V-Ray render.
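
And a quick numeric check of why a flat, camera-facing plane reads as a gradient in one convention but a single tone in the other (same hypothetical pinhole assumptions as the sketch above):

import numpy as np

# A camera-facing plane 100 units down the viewing axis, seen through a
# hypothetical 35mm lens on a 36mm filmback, sampled on a tiny 4x4 grid.
focal, aperture, w, h = 35.0, 36.0, 4, 4
mm_per_px = aperture / w
u = (np.arange(w) - (w - 1) / 2.0) * mm_per_px
v = (np.arange(h) - (h - 1) / 2.0) * mm_per_px
uu, vv = np.meshgrid(u, v)
cos_theta = focal / np.sqrt(focal**2 + uu**2 + vv**2)

radial = 100.0 / cos_theta         # V-Ray-style radial depth: varies per pixel
nuke = 1.0 / (radial * cos_theta)  # Nuke-style 1/z: constant
print(radial.round(2))             # gradient, growing toward the corners
print(nuke)                        # one flat value (0.01), the "solid gray"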