Unreal-like depth fade parameter for opacity (V-Ray GPU)
This would be really interesting to have directly in 3D!
I use a similar approach, but in post. If V-Ray allowed us to use the Z-depth pass inside the material tree, that effect would be possible.
It's like taking the Z-depth of the scene behind the plane and using it as a mask on the plane itself.
Andrew used a kinda similar approach here: https://www.youtube.com/watch?v=FivFZhcQFi4
Exactly! As I wrote in the first message, the fog transition (in the material) works in global space (as far as I can see), but it doesn't work if we apply a texture. So Z-depth in a material slot is the solution, but how? Maybe an OSL shader?
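For reference, the Unreal DepthFade node being discussed boils down to a simple formula: take the scene depth behind the surface, subtract the depth of the surface pixel itself, and normalize by a fade distance. A minimal Python sketch of that math (the function name and sample values are illustrative, not any V-Ray or Unreal API):

```python
def depth_fade(scene_depth, pixel_depth, fade_distance):
    """Unreal-style depth fade: opacity is 0 where the surface
    touches geometry behind it and ramps up to 1 once the gap
    exceeds fade_distance."""
    t = (scene_depth - pixel_depth) / fade_distance
    return max(0.0, min(1.0, t))  # saturate to [0, 1]

# Fog card at depth 10, mountain behind it at depth 12,
# fading over 5 units -> partially faded edge.
opacity = depth_fade(scene_depth=12.0, pixel_depth=10.0, fade_distance=5.0)
```

The missing piece inside a material, as the thread notes, is access to `scene_depth` (the Z-depth behind the shaded surface) at shading time.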
Why not just use regular, proper volumes? Is it rendering that slowly? A mix of fog and aerial perspective, I guess, if we're talking mountains. How fast does it need to be?
If we're talking about uniform dense fog without any wisps or "structures", then yes, it's better to combine volume fog and aerial perspective. But I often add some wispy clouds in Photoshop; they aren't physically correct, but they improve the artistic look of the renders. If I could do that directly in 3D, it would be great.
I've attached several examples. In my last work I tried to make dense but slightly non-uniform fog with a volume grid and noise textures; it looks nice, but it rendered too slowly. https://www.artstation.com/artwork/VdXPDb
I was thinking of VRayEnvironmentFog (no experience with the grid). VolumeScatterMtl isn't ready on GPU yet, but if I remember correctly it should be faster for simple fog.
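For the uniform-fog case, a homogeneous volume's effect reduces to Beer-Lambert attenuation along the view ray, which is cheap to evaluate and is roughly what an environment fog computes for a constant-density medium. A hedged sketch of that falloff (the function names and constants are illustrative):

```python
import math

def fog_transmittance(distance, density):
    """Beer-Lambert law: fraction of light that survives a path
    of the given length through a homogeneous medium."""
    return math.exp(-density * distance)

def fog_blend(surface_color, fog_color, distance, density):
    """Blend a surface color toward the fog color with distance,
    the basic aerial-perspective falloff."""
    t = fog_transmittance(distance, density)
    return tuple(t * s + (1.0 - t) * f
                 for s, f in zip(surface_color, fog_color))
```

The wispy, non-uniform look the poster wants is exactly what this simple model cannot give; that needs either a volume grid with noise (slow, as noted) or the depth-masked card trick from earlier in the thread.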
It is not trivial at all for a raytracer.
Rasterizers have Z before the final render starts (so they can use it for a number of things); raytracers can only get to it as they trace.
It would need something like a separate render (either from behind the plane, or without it) to reproduce the effect on the plane correctly.
My memory of late is playing tricks on me, so take this with a pinch of salt, but I recall some render engine that was able to render a second scene viewpoint and then remap it onto, e.g., a TV screen in the scene that was visible from the main render's viewpoint.
I can't recall if this was a realtime shader (highly likely, which makes the info useless, as we're back to square one) or a raytracer.
Any help figuring out what that was would be appreciated.
If they're just planar cards, though, you're likely much better off doing it in Nuke (or Photoshop, albeit a bit more laboriously): export your depth and fiddle with it.
You'll have to fake shadows and such, but at least in Nuke that can be done (with the help of pPos and similar AOVs).
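The post route suggested here comes down to remapping the rendered Z-depth AOV into a 0..1 mask around the card's distance and using it to fade the card in comp. A minimal pure-Python sketch of that remap (thresholds and names are illustrative, not Nuke nodes):

```python
def depth_to_mask(zdepth, near, far):
    """Remap a Z-depth sample into a 0..1 fade mask: 0 at 'near',
    1 at 'far', linear in between (the usual grade-on-depth trick)."""
    t = (zdepth - near) / (far - near)
    return max(0.0, min(1.0, t))

# Example: four samples from a Z pass, fading between 5 and 15 units.
samples = [4.0, 10.0, 15.0, 30.0]
mask = [depth_to_mask(z, near=5.0, far=15.0) for z in samples]
# -> [0.0, 0.5, 1.0, 1.0]
```

In Nuke the same thing is typically done by grading the depth channel and piping it into the card's alpha; the faked shadows mentioned above would then be painted or projected separately.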