Have you tried V-Ray (RT) GPU?


  • #31
I can totally deal with some of the differences that are here to stay, for instance the way shaders have to be approached slightly differently in order to achieve the same look. I'm having more trouble with things like the lack of glossy fresnel or 2D displacement that feel more like a step back from Adv and for which there is no obvious workaround. What would be ideal for me is feature parity rather than full compatibility. If that can be achieved, I can see myself using RT as my default renderer and getting vastly faster images while occasionally switching to Adv when I start work on a scene I know won't fit on my card (which for me is anything that relies a lot on displacement). Starting a scene in RT to finish it in Adv. doesn't look like the way to go.
    Check my blog



    • #32
      Originally posted by BBB3 View Post
      What would be ideal for me is feature parity rather than full compatibility.
That's what I meant, Jez. I don't mind the results not being 1:1; if I start a project in RT, I'll stay there. But I do mind not being able to use the same features I have in Adv.
      www.yellimages.com



      • #33
        Originally posted by thanulee View Post

That's what I meant, Jez. I don't mind the results not being 1:1; if I start a project in RT, I'll stay there. But I do mind not being able to use the same features I have in Adv.
From what I remember reading, because they are different render engines they will always differ, and some things simply cannot/will not be coded into RT - I'm sure the good people at Chaos will chime in to correct me
        Jez

        ------------------------------------
        3DS Max 2023.3.4 | V-Ray 6.10.08 | Phoenix FD 4.40.00 | PD Player 64 1.0.7.32 | Forest Pack Pro 8.2.2 | RailClone 6.1.3
        Windows 11 Pro 22H2 | NVidia Drivers 535.98 (Game Drivers)

        Asus X299 Sage (Bios 4001), i9-7980xe, 128Gb, 1TB m.2 OS, 2 x NVidia RTX 3090 FE
        ---- Updated 06/09/23 -------



        • #34
          Glossy Fresnel will be there, just so you know... 2D displacement is also a possibility.

          Best regards,
          Vlado
          I only act like I know everything, Rogers.



          • #35
            Thanks guys for the feedback so far!
There are quite a few things that we hadn't realized are important to so many of you (for example, better displacement).

Just wanted to quickly answer some of the questions.
V-Ray GPU has supported instancing for many years now. When using hybrid rendering, don't forget that the CPU has to both feed the GPUs and render. If you have many GPUs and a CPU with few cores, the CPU cores might be busy just feeding the GPUs.
            For the next release we have added support for edges tex, vrscans and fog.


            One question: Given your insistence that V-Ray GPU and V-Ray Adv. are two different renderers and that you don't intend for V-Ray GPU to converge towards Adv.
            ...
            Are you saying this is the wrong way to think about it?
            Absolutely! With that being said, V-Ray Adv and V-Ray GPU will continue to share the same UI and mostly the same features for the time being. But they are different engines and the goal is not to have 1:1 results.

2D displacement that feel more like a step back from Adv and for which there is no obvious workaround. What would be ideal for me is feature parity rather than full compatibility. If that can be achieved, I can see myself using RT as my default renderer and getting vastly faster images while occasionally switching to Adv when I start work on a scene I know won't fit on my card (which for me is anything that relies a lot on displacement).
            Feature compatibility as much as possible is already happening, but it is unlikely to ever happen for all V-Ray Adv features. V-Ray GPU already has more features in that regard compared to every other GPU raytracer (interactive denoiser, rect light directionality, stochastic flakes, clipping, progressive adaptive image sampling, etc).

If you can't fit the scene on the GPU, you can use the on-demand mip-mapped textures (or resized textures in ActiveShade mode). If this and other common scene memory optimizations can't do it, you can always render the scene using the CPU as a CUDA device (it is most likely way faster than offloading geometry to system memory).

            Best,
            Blago.
            V-Ray fan.
            Looking busy around GPUs ...
            RTX ON
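
A quick way to see the point above about the CPU feeding the GPUs in hybrid mode: cores spent preparing work for the GPUs no longer render on the CPU side. The sketch below is only a toy model with made-up rates (not V-Ray's actual scheduler or any measured numbers), but it shows why a few-core CPU driving several GPUs gains little from hybrid rendering.

Code:

# Toy estimate of hybrid-rendering throughput when some CPU cores
# have to feed the GPUs. All rates are illustrative, not measured.

def hybrid_throughput(cpu_cores, gpus,
                      cpu_rate_per_core=1.0,    # samples/s rendered by one CPU core
                      gpu_rate=20.0,            # samples/s rendered by one GPU
                      feed_cores_per_gpu=0.5):  # CPU cores consumed feeding one GPU
    """Return (total samples/s, CPU cores left for rendering)."""
    cores_feeding = min(cpu_cores, gpus * feed_cores_per_gpu)
    cores_rendering = cpu_cores - cores_feeding
    total = cores_rendering * cpu_rate_per_core + gpus * gpu_rate
    return total, cores_rendering

# A 4-core CPU driving 4 GPUs loses half its cores just feeding them:
print(hybrid_throughput(cpu_cores=4, gpus=4))   # (82.0, 2.0)
# A 16-core CPU driving the same 4 GPUs barely notices the feeding cost:
print(hybrid_throughput(cpu_cores=16, gpus=4))  # (94.0, 14.0)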



            • #36
If instancing has been supported for years in Max, can we have this thread on the Maya site?
              e: info@adriandenne.com
              w: www.adriandenne.com



              • #37
                Originally posted by francomanko View Post
If instancing has been supported for years in Max, can we have this thread on the Maya site?
                It is supported in Maya as well.

                Best,
                Blago.
                V-Ray fan.
                Looking busy around GPUs ...
                RTX ON



                • #38
                  .....(ignore)
Last edited by francomanko; 05-09-2017, 06:06 PM. Reason: didn't want to hijack with Maya stuff
                  e: info@adriandenne.com
                  w: www.adriandenne.com



                  • #39
                    Originally posted by savage309 View Post
                    Just wanted to check how many of you have tried V-Ray (RT) GPU in V-Ray 3.5 or later?...
I use it constantly, especially since hybrid rendering was introduced. Not only for previewing lighting, materials, composition, etc., but also for final production rendering. The only time I do not use it for final production is when the scene is too big for my GPU memory. And if projects that need more GPU memory than I currently have become commonplace here, I will simply buy more GPU cards with more memory.

                    I will typically use all installed GPUs for rendering. If I need more interactivity, I will play with the settings a bit and if that doesn't work, I will designate one card for display only until final rendering.

I'm still using three old-school GTX 580s. Only 3 GB each, but so far the majority of my work fits. The Fermi architecture was extremely compute-oriented, and the cards' memory does not have to hold your operating system and loaded applications.

                    A significant amount of my work is corporate consulting, where I teach digital artists how to use 3DS Max and V-Ray. I cannot tell you how much easier and faster teaching folks is with real-time rendering! It has changed everything and everything is faster and more efficient. You explain a rendering concept and the student watches it happen in real-time. Absolutely Priceless.

                    I'm always stoked when new features are added to RT/GPU!

                    -Alan
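
For anyone wanting the "one card for display only" setup Alan describes: V-Ray ships its own device-selection tool, but the generic CUDA mechanism is the CUDA_VISIBLE_DEVICES environment variable, which hides GPUs from a CUDA process. The snippet below is only an illustration of that mechanism; the device indices and the render command are hypothetical placeholders, not part of any V-Ray API.

Code:

import os
import subprocess

# Keep one GPU free for the display by hiding it from a CUDA process via the
# standard CUDA_VISIBLE_DEVICES environment variable. Device indices and the
# launched command below are hypothetical placeholders.

env = os.environ.copy()
env["CUDA_VISIBLE_DEVICES"] = "1,2"   # GPU 0 drives the monitors; render on GPUs 1 and 2

# Any CUDA-based job launched with this environment only sees GPUs 1 and 2.
subprocess.run(["./launch_render_job.sh", "scene.max"], env=env)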



                    • #40
                      Originally posted by savage309 View Post
                      For the next release we have added support for edges tex, vrscans and fog.
                      Fog and edge tex are great news!

                      Originally posted by savage309 View Post
If you can't fit the scene on the GPU, you can use the on-demand mip-mapped textures (or resized textures in ActiveShade mode). If this and other common scene memory optimizations can't do it, you can always render the scene using the CPU as a CUDA device (it is most likely way faster than offloading geometry to system memory).
                      So far, textures aren't a problem for me. Even my heaviest scenes rarely take more than 3GB for textures and 3-4GB for geometry. It is displacement that normally kills them.

Great to hear that glossy fresnel and 2D displacement are coming, especially if they turn out to be as fast and memory-efficient as on the CPU.

                      Check my blog
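
Rough arithmetic shows why displacement, rather than textures, is usually what blows the budget: pre-tessellating geometry for 3D displacement multiplies the triangle count by roughly 4 per subdivision level, and even at a hypothetical ~100 bytes per final triangle (positions, normals, UVs, BVH overhead; not V-Ray's actual per-triangle footprint) the total shoots past a 3-4 GB card very quickly. A minimal sketch under those assumptions:

Code:

# Back-of-the-envelope memory cost of pre-tessellated 3D displacement.
# Simplified model: each subdivision level splits every triangle into 4,
# and bytes_per_triangle is a hypothetical round figure, not V-Ray's
# actual per-triangle cost.

def displacement_memory_gb(base_triangles, subdiv_levels, bytes_per_triangle=100):
    tessellated = base_triangles * 4 ** subdiv_levels
    return tessellated * bytes_per_triangle / 2**30

# 2 million base triangles with 3 levels of subdivision for displacement:
print(f"{displacement_memory_gb(2_000_000, 3):.1f} GB")   # ~11.9 GB
# The same mesh with no displacement tessellation:
print(f"{displacement_memory_gb(2_000_000, 0):.2f} GB")   # ~0.19 GB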



                      • #41
                        Originally posted by BBB3 View Post
I can totally deal with some of the differences that are here to stay, for instance the way shaders have to be approached slightly differently in order to achieve the same look. I'm having more trouble with things like the lack of glossy fresnel or 2D displacement that feel more like a step back from Adv and for which there is no obvious workaround. What would be ideal for me is feature parity rather than full compatibility. If that can be achieved, I can see myself using RT as my default renderer and getting vastly faster images while occasionally switching to Adv when I start work on a scene I know won't fit on my card (which for me is anything that relies a lot on displacement). Starting a scene in RT to finish it in Adv. doesn't look like the way to go.
Yep, this is exactly how we feel. Feature parity is the most important thing for us. RT and Adv don't need to produce exactly the same final image, but if RT can get all the same features as Adv then we'll definitely use RT, because it is so much faster (for preview renders and production).

And the other huge advantage of GPU rendering is the ability to scale up a render farm much more easily and cheaply. And I think in 12 months (or hopefully even less), when the next generation of GTX cards comes out, they will probably have 24 GB of VRAM... which will pretty much solve the memory issues for larger scenes.

                        Last edited by jironomo; 06-09-2017, 08:18 PM.



                        • #42
I never used V-Ray RT before because it lacks features found in V-Ray Advanced. It's a completely rewritten version of V-Ray. I wonder how it compares to Redshift?

It would be cool if V-Ray Advanced could implement hybrid rendering and have the GPU do some of the work. I mean, I have a GTX 1080 Ti just sitting here during CPU renders. Can't it do something, like shoot rays or some shit? Or calculate something for the CPU?
                          Last edited by stevejjd; 07-09-2017, 10:33 PM.



                          • #43
                            Originally posted by stevejjd View Post
It would be cool if V-Ray Advanced could implement hybrid rendering and have the GPU do some of the work. I mean, I have a GTX 1080 Ti just sitting here during CPU renders. Can't it do something, like shoot rays or some shit? Or calculate something for the CPU?
This would be great. I imagine it's not possible yet due to RT limitations, but if the GPU could take over even some part of rendering/scene preparation, everyone would get extra render power totally for free.



                            • #44
                              Originally posted by stevejjd View Post
                              I wonder how it compares to redshift?
                              Not even close. Redshift is how you write a GPU renderer. Vray RT is not =)

My earlier response is on the first page of this thread. I just wanted to share my experience with the current project I started this morning. It worked rather well with RT, until I used IPR (at least I think that's what triggered the problem... launched from the Play button in the frame buffer). Then all of a sudden the frame buffer was resized, which isn't a problem when doing IPR, but I could no longer change the frame buffer size under "Common". It sits at 640 x 480, ignoring everything I put in it. Most of the settings under RT also disappeared, and strangest of all: I can no longer go back to V-Ray Adv as a renderer. It's gone from the list! It doesn't matter if I restart Max and the computer... as soon as I open that scene again, there is no way to set V-Ray Adv as the renderer. I'm stuck with RT without settings and with no way to change the frame buffer size.

I solved it the first time by merging everything into a new scene, where I tried V-Ray RT again (since I really want it to work). But after the same thing happened again, I can't really see any reason to use V-Ray RT. It's just too buggy and lacks too many features, especially if I can't even revert back to V-Ray Adv.



                              • #45
                                Originally posted by Undersky View Post
                                Redshift is how you write a GPU renderer. Vray RT is not =)
Are the reasons you think that already mentioned in the thread? I.e., are all of your troubles already mentioned here? I want to be sure that we gather as much feedback as possible.
Is it the bug you mention, plus the fact that it doesn't support everything from Adv and doesn't match its look? This is important, because no GPU raytracer currently comes close to the feature set of V-Ray Adv, and of all GPU raytracers V-Ray GPU is by far the closest you can get.
It also has many features that Redshift doesn't, it does not clamp all the shading (so it looks better), and with 3.5 and later it is much faster on many scenes...


                                Best,
                                Blago.
                                V-Ray fan.
                                Looking busy around GPUs ...
                                RTX ON

