It blends the result of several different projection nodes based on the distance from the projected camera. Optionally, it can also take as input a z-depth image for each camera to take occlusion into account. If you are interested, I can post a sample scene.
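The mechanism described above can be sketched in plain Python. This is only an illustrative sketch of distance-weighted blending with an optional z-depth occlusion test, not the actual V-Ray implementation; the function name, weighting formula, and parameters are assumptions for illustration.

```python
import math

def blend_projections(point, cameras, samples, zdepths=None, eps=1e-6):
    """Blend the color samples that several projection cameras cast onto
    a surface point, weighting each camera by inverse distance (closer
    cameras dominate). If per-camera z-depth values are given, a camera
    is skipped when the point lies beyond its recorded depth, i.e. the
    point is occluded from that camera's view.

    point   -- (x, y, z) surface point being shaded
    cameras -- list of (x, y, z) camera positions
    samples -- list of (r, g, b) colors, one per camera
    zdepths -- optional list of recorded z-depth values, one per camera
    """
    total_w = 0.0
    color = [0.0, 0.0, 0.0]
    for i, cam in enumerate(cameras):
        dist = math.dist(point, cam)
        # Occlusion test: the z-depth image says something sits between
        # this camera and the point, so its projection is discarded.
        if zdepths is not None and dist > zdepths[i] + eps:
            continue
        w = 1.0 / (dist + eps)  # inverse-distance weight (assumed falloff)
        total_w += w
        for c in range(3):
            color[c] += w * samples[i][c]
    if total_w == 0.0:
        return (0.0, 0.0, 0.0)  # no camera sees this point
    return tuple(c / total_w for c in color)
```

For example, a point one unit from a "red" camera and three units from a "blue" camera blends to roughly 75% red; supplying a z-depth that occludes the red camera leaves pure blue.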
This does sound useful. It's hard to imagine how it works without seeing it set up. We may experiment with the node, but a sample scene and any thoughts on its practical purpose would be good. We can imagine using this to project multiple views of the same setting onto the same geo, using different cameras. Does that sound right?
I have used it:
- I rendered a lot of spherical cameras.
- Then I added all the rendered images into this node and projected all the renders into my scene.
All the projections were blended. A big camera mapping, with fast renders.
The scene has two layers: the master layer uses the VRayMultiProjection node to render the geometry, while the layer1 layer is set up to produce the spherical projections (you need to render each of the three spherical cameras in turn).
The "zdepth offset" parameter can be used to deal with artifacts that sometimes appear when blending the different projections.
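The forum post doesn't spell out the exact test, but one plausible reading of that parameter is a bias added to the depth comparison, so that points sitting almost exactly at the recorded depth (a common source of speckle artifacts from z-depth precision) still count as visible. A minimal sketch under that assumption, with a hypothetical `visible` helper:

```python
def visible(dist_to_camera, recorded_zdepth, zdepth_offset=0.01):
    """Depth test with a bias: the point counts as visible from a camera
    when its distance does not exceed the recorded z-depth plus the
    offset. A small positive offset masks self-occlusion speckle caused
    by precision differences between the z-depth image and the geometry.
    """
    return dist_to_camera <= recorded_zdepth + zdepth_offset
```

With a zero offset, a point at distance 1.0005 against a recorded depth of 1.0 would be wrongly rejected; a small offset lets it pass while still rejecting genuinely occluded points.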