Have you tried V-Ray (RT) GPU?
Originally posted by BBB3 View Post
What would be ideal for me is feature parity rather than full compatibility.
Originally posted by thanulee View Post
That's what I meant, Jez. I don't mind 1:1; if I start a project in RT, I'll stay there. But I do mind not being able to use the same features I have in Adv.
Jez
------------------------------------
3DS Max 2023.3.4 | V-Ray 6.10.08 | Phoenix FD 4.40.00 | PD Player 64 1.0.7.32 | Forest Pack Pro 8.2.2 | RailClone 6.1.3
Windows 11 Pro 22H2 | NVidia Drivers 535.98 (Game Drivers)
Asus X299 Sage (Bios 4001), i9-7980xe, 128Gb, 1TB m.2 OS, 2 x NVidia RTX 3090 FE
---- Updated 06/09/23 -------
Thanks guys for the feedback so far!
There are quite a few things we hadn't realized were important to so many of you (for example, better displacement).
Just wanted to answer some of the questions quickly.
V-Ray GPU has supported instancing for many years now. When using hybrid rendering, don't forget that the CPU has to both feed the GPUs and render. If you have many GPUs and a CPU without many cores, all of the CPU cores may end up busy feeding the GPUs.
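The feeding bottleneck described above can be sketched as a toy model (this is not V-Ray code; the function name, the one-feeder-per-GPU assumption, and the core counts are all illustrative):

```python
# Toy sketch: why hybrid rendering needs spare CPU cores.
# Assume each GPU is driven by one dedicated CPU "feeder" thread;
# whatever cores are left over can render buckets themselves.

def hybrid_core_split(cpu_cores, num_gpus, feeders_per_gpu=1):
    """Return (cores busy feeding GPUs, cores left to render on the CPU)."""
    feeding = min(cpu_cores, num_gpus * feeders_per_gpu)
    rendering = cpu_cores - feeding
    return feeding, rendering

# 4-core CPU driving 4 GPUs: every core is busy feeding, none render.
print(hybrid_core_split(4, 4))   # -> (4, 0)
# 16-core CPU driving 2 GPUs: 14 cores are free to render buckets.
print(hybrid_core_split(16, 2))  # -> (2, 14)
```

Under this (simplified) model, hybrid mode only pays off when the core count comfortably exceeds the number of GPUs being fed.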
For the next release we have added support for edges tex, vrscans and fog.
One question: Given your insistence that V-Ray GPU and V-Ray Adv. are two different renderers and that you don't intend for V-Ray GPU to converge towards Adv.
...
Are you saying this is the wrong way to think about it?
Originally posted by BBB3 View Post
...2D displacement that feel more like a step back from Adv and for which there is no obvious workaround. What would be ideal for me is feature parity rather than full compatibility. If that can be achieved, I can see myself using RT as my default renderer and get vastly faster images while occasionally switching to Adv when I start work on a scene I know won't fit on my card (which for me is anything that relies a lot on displacement).
If you can't fit the scene on the GPU, you can use the on-demand mip-mapped textures (or resized textures in Active Shade mode). If this and the other common scene memory optimizations can't do it, you can always render the scene using the CPU as a CUDA device (most likely way faster than offloading geometry to system memory).
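As a rough back-of-the-envelope illustration of why on-demand mip-mapping saves GPU memory (a generic sketch, not V-Ray's actual implementation; the function names are made up): a distant or small object only needs a low mip level, not the full-resolution texture.

```python
# Generic illustration of mip-map memory savings (not V-Ray internals).

def texture_bytes(width, height, bytes_per_texel=4):
    """Memory for the full-resolution texture alone."""
    return width * height * bytes_per_texel

def mip_bytes(width, height, level, bytes_per_texel=4):
    """Memory for mip `level` alone (level 0 = full resolution)."""
    w = max(1, width >> level)
    h = max(1, height >> level)
    return w * h * bytes_per_texel

full = texture_bytes(4096, 4096)     # 4K RGBA texture: 64 MiB
small = mip_bytes(4096, 4096, 4)     # 256x256 mip: 0.25 MiB
print(full // 2**20, small / 2**20)  # -> 64 0.25
```

If the renderer can stream in only the mip levels a frame actually samples, textures like this cost a fraction of their full-resolution footprint, which is exactly what makes large scenes fit on the card.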
Best,
Blago

V-Ray fan. Looking busy around GPUs... RTX ON
Originally posted by savage309 View Post
Just wanted to check how many of you have tried V-Ray (RT) GPU in V-Ray 3.5 or later?...
I will typically use all installed GPUs for rendering. If I need more interactivity, I will play with the settings a bit and if that doesn't work, I will designate one card for display only until final rendering.
I'm still using three old-school GTX 580s. Only 3 GB each, but so far the majority of my work fits. The Fermi architecture was extremely computing-oriented, and the cards' memory does not have to hold your operating system and loaded applications.
A significant amount of my work is corporate consulting, where I teach digital artists how to use 3DS Max and V-Ray. I cannot tell you how much easier and faster teaching is with real-time rendering! You explain a rendering concept and the student watches it happen in real time. Absolutely priceless.
I'm always stoked when new features are added to RT/GPU!
-Alan
Originally posted by savage309 View Post
For the next release we have added support for edges tex, vrscans and fog.
Originally posted by savage309 View Post
If you can't fit the scene on the GPU you can use the on-demand mip-mapped textures (or resized textures in Active Shade mode). If this and any other common scene memory optimizations can't do it, you always can render the scene using the CPU as CUDA device (it is most likely way faster than offloading geometry to system memory).
Great to hear that glossy fresnel and 2D displacement are coming, especially if displacement turns out to be as fast and memory-efficient as on the CPU.
Check my blog
Originally posted by BBB3 View Post
I can totally deal with some of the differences that are here to stay, for instance the way shaders have to be approached slightly differently in order to achieve the same look. I'm having more trouble with things like the lack of glossy fresnel or 2D displacement that feel more like a step back from Adv and for which there is no obvious workaround. What would be ideal for me is feature parity rather than full compatibility. If that can be achieved, I can see myself using RT as my default renderer and get vastly faster images while occasionally switching to Adv when I start work on a scene I know won't fit on my card (which for me is anything that relies a lot on displacement). Starting a scene in RT to finish it in Adv. doesn't look like the way to go.
And the other huge advantage of GPU rendering is the ability to scale up a render farm much more easily and cheaply. I think in 12 months (or hopefully even less), when the next generation of GTX cards comes out, they will probably have 24 GB of VRAM, which will pretty much solve the memory issues for larger scenes.
Last edited by jironomo; 06-09-2017, 08:18 PM.
I never used V-Ray RT before because it lacks features found in V-Ray Advanced. It's a completely rewritten version of V-Ray. I wonder how it compares to Redshift?
It would be cool if V-Ray Advanced could implement hybrid rendering. To have the GPU do stuff would be cool. I mean, I have a GTX 1080 Ti just sitting here during CPU renders. Can't it do something, like shoot rays or calculate something for the CPU?
Last edited by stevejjd; 07-09-2017, 10:33 PM.
Originally posted by stevejjd View Post
It would be cool if V-Ray Advanced could implement hybrid rendering. To have the GPU do stuff would be cool. I mean, I have a GTX 1080 Ti just sitting here during CPU renders. Can't it do something, like shoot rays or calculate something for the CPU?
Originally posted by stevejjd View Post
I wonder how it compares to Redshift?
My earlier response is on the first page of this thread. I just wanted to share my experience with the current project I started this morning. It worked rather well with RT, until I used IPR (at least I think that's what triggered the problem; launched from the Play button in the frame buffer). Then all of a sudden the frame buffer was resized, which isn't a problem when doing IPR, but I can no longer change the frame buffer size under "Common". It sits at 640 x 480, ignoring everything I put in it. Most of the settings under RT also disappeared, and strangest of all: I can no longer go back to V-Ray Adv. as a renderer. It's gone from the list! It doesn't matter if I restart Max and the computer; as soon as I open that scene again, there is no way to set V-Ray Adv. as the renderer. I'm stuck with RT without settings and with no way to change the frame buffer size.
I solved it the first time by merging everything into a new scene, where I tried V-Ray RT again (since I really want it to work). But after the same thing happened again, I can't really see any reason to use V-Ray RT. It's just too buggy and lacks too many features, especially if I can't even revert back to V-Ray Adv.
Originally posted by Undersky View Post
Redshift is how you write a GPU renderer. Vray RT is not =)
Like, is it the bug that you mention, plus the fact that it doesn't support everything from Adv and doesn't match its look? This is important, because no GPU raytracer is currently comparable with the feature set of V-Ray Adv, and of all GPU raytracers V-Ray GPU is by far the closest you can get.
And it has many features that Redshift doesn't, doesn't clamp all the shading (and thus looks better), and with 3.5 and later is much faster on many scenes...
Best,
Blago

V-Ray fan. Looking busy around GPUs... RTX ON