
Am I the only person that has never managed to get VRay GPU to work?


  • Am I the only person that has never managed to get VRay GPU to work?

    I feel like everyone is in on some kind of joke here.

    I don't think I've ever managed to get a render out of VRay GPU. All I ever get is a memory error saying "could not allocate device buffer *lots of numbers*" and then it fails.

    I assumed (perhaps wrongly) that VRay GPU could utilise system memory as well as GPU?
    Check out my (rarely updated) blog @ http://macviz.blogspot.co.uk/

    www.robertslimbrick.com

    Cache nothing. Brute force everything.

  • #2
    All I use is V-Ray GPU these days and it works flawlessly. Mind you, I have a 3090, so memory issues are less of a worry, but when I do run into memory limits, turning on "on-demand mip-mapping" works well.

    If you run it in CUDA, you can enable "out-of-core memory" and it will pull from system RAM. I haven't needed to go that far, but the option is there.
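    For a rough sense of why out-of-core is a last resort rather than a free memory upgrade: system RAM is reached over the PCIe bus, which is far slower than on-board VRAM. The figures below are my own ballpark assumptions, not official V-Ray numbers:

```python
# Ballpark bandwidth figures (assumed, not V-Ray documentation values).
PCIE4_X16_GBPS = 32      # approx. one-way PCIe 4.0 x16 bandwidth, GB/s
RTX3090_VRAM_GBPS = 936  # approx. RTX 3090 GDDR6X bandwidth, GB/s

slowdown = RTX3090_VRAM_GBPS / PCIE4_X16_GBPS
print(f"Out-of-core reads are roughly {slowdown:.0f}x slower than VRAM reads")
```

    So out-of-core keeps the render going, but anything that spills to system RAM is read more than an order of magnitude more slowly.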



    • #3
      Just tried "out of core memory" and it still didn't work.

      I come back to try this every 6 months or so, using a different project each time in the hope that it'll work. It fails every single time.

      The thing is, if I ever did get it working I can't imagine I'd trust it in a production environment.
      Check out my (rarely updated) blog @ http://macviz.blogspot.co.uk/

      www.robertslimbrick.com

      Cache nothing. Brute force everything.



      • #4
        What graphics card do you have?



        • #5
          Originally posted by Macker View Post
          All I ever get is a memory error saying "could not allocate device buffer *lots of numbers*" and then it fails.
          Hi Macker

          As spencerrayfitch said, you are running out of GPU memory; your geometry, your textures, or both are too heavy.
          Try on-demand maps, or resizing textures to 512, it might help in your case.

          Out-of-core geometry is still a work in progress; the GPU developers are working on it now. Out-of-core textures are also on our roadmap.
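          To see why resizing textures helps so much, here is a quick back-of-the-envelope estimate (my own sketch, not a V-Ray tool; it assumes uncompressed RGBA8 and that a full mip chain adds roughly one third on top of the base level):

```python
def texture_vram_mb(width, height, channels=4, bytes_per_channel=1, mips=True):
    """Rough VRAM footprint of one uncompressed texture, in MB."""
    size = width * height * channels * bytes_per_channel
    if mips:
        size = size * 4 // 3  # full mip chain adds ~1/3 on top of the base level
    return size / (1024 * 1024)

print(f"4K texture:  {texture_vram_mb(4096, 4096):.1f} MB")
print(f"512 texture: {texture_vram_mb(512, 512):.2f} MB")
```

          A single 4K map comes out around 85 MB with mips, versus just over 1 MB at 512, so a scene with dozens of 4K maps can easily exhaust a smaller card.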

          On another note, there are many places using V-Ray GPU in production, 2 examples from the community here and here.
          As for my own experience, I spent 6 years in the high-end print industry as a visualizer and concept artist. I used V-Ray GPU at Renntech and Carlex, and it helped massively.

          If you have questions on the GPU workflow let me know.

          Best,
          Muhammed
          Muhammed Hamed
          V-Ray GPU product specialist


          chaos.com



          • #6
            I've been using V-Ray GPU for about 4 years in production: stills, animation, VR, 360, VFX, etc., and it's rock solid. It doesn't have the full functionality of regular V-Ray, but the time saved with rendering is incredible.

            I would suggest a GPU with 12GB as a minimum. If you're getting out-of-core errors, turn on Show Stats in the VFB to see where the VRAM is being used.

            Displacement, texture map size, volumetrics, and mesh sizes all need to be watched and refined to keep the demand on VRAM down.
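            On NVIDIA cards you can also watch VRAM from outside the VFB with `nvidia-smi`. A small sketch that polls it and parses the CSV output (assumes an NVIDIA driver is installed; the parser itself runs anywhere):

```python
import subprocess

QUERY = ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

def parse_vram(csv_line):
    """Parse one 'used, total' CSV line (values in MiB) from nvidia-smi."""
    used, total = (int(v) for v in csv_line.split(","))
    return {"used_mib": used, "total_mib": total, "pct": 100.0 * used / total}

def current_vram():
    """One reading per installed GPU; requires nvidia-smi on the PATH."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
    return [parse_vram(line) for line in out.stdout.strip().splitlines()]

# Parser demo on a captured sample line (runs without a GPU):
print(parse_vram("10240, 24576"))
```

            Running `current_vram()` in a loop during a render makes it easy to see which scene change pushes you over the card's limit.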



            • #7
              I've been using Vray GPU a lot lately (mainly interiors and product visualization) and I've not had any major issues with it. Admittedly, I'm using an RTX 3090 with 24GB of VRAM. The error messages you're getting lead me to believe your scene assets don't fit in your GPU memory, as others have already noted. If that's the case, you would either have to optimize your scene (reduce poly count, reduce texture size, use more proxies etc.) or you would have to upgrade to a GPU with larger VRAM capacity. In VFB's "Stats" tab you can check how much memory your scenes are using when rendering.
              Max 2023.2.2 + Vray 6 Update 2.1 ( 6.20.06 )
              AMD Ryzen 7950X 16-core | 64GB DDR5 RAM 6400 Mbps | MSI GeForce RTX 3090 Suprim X 24GB (rendering) | GeForce GTX 1080 Ti FE 11GB (display) | GPU Driver 546.01 | NVMe SSD Samsung 980 Pro 1TB | Win 10 Pro x64 22H2



              • #8
                Hi Macker;

                I've been using GPU rendering for a few years now and I'm very happy. It's fast and stable; only some CPU features are missing. At the moment I use a 3090 + 2x 2080 Ti. If you only own one powerful card and its memory isn't enough, install a cheap second card to drive the display, so you can keep your main card free for rendering.
                www.simulacrum.de ... visualization for designer and architects



                • #9
                  Fired it up for the first time in years today and it immediately crashed Max. I'll try it again in another 3 or 4 years...
                  James Burrell www.objektiv-j.com
                  Visit my Patreon patreon.com/JamesBurrell



                  • #10
                    You should post some numbers here. What does a typical scene look like? What are your hardware specs?
                    https://linktr.ee/cg_oglu
                    Ryzen 5950, Geforce 3060, 128GB ram



                    • #11
                      Originally posted by Pixelcon View Post
                      Fired it up for the first time in years today and it immediately crashed Max. I'll try it again in another 3 or 4 years...
                      That may well keep happening, no matter how long you'll wait.

                      Some care is needed in setting up GPU projects, as there are inherent, unavoidable limitations, and we never made a secret of the fact that one should not expect the GPU engine to run (well, or at all) on CPU projects ("V-Ray GPU is a separate render engine").
                      It's also quite dependent on the setup one uses, from card model to driver version.
                      Lastly, three to four years is generally the timeframe for a kernel overhaul (read: a near-complete rewrite), as in that period a number of technologies (hardware and software) will have changed, and the engine needs to keep up.
                      So you may end up getting hit by teething issues if your timing is unlucky.

                      It's never going to be as easy to use, or as flexible, as a CPU engine, no matter how one cuts it, as it depends on a very different coding paradigm and on a number of ancillaries out of our control (e.g. buggy drivers).
                      If one wants to use the GPU engine, the best approach is to commit to getting a render done: start by getting one's PC in order on the drivers' side, then do the basic scene setup, which should conform to the guidelines mentioned here, and then stick with the approach through whatever issues arise, as it will inevitably require readjusting experience built with other engines.

                      The payback for doing it right is exceptional speed, of course, which has to be weighed against the time spent re-learning the craft, possibly quite sizeable initially.

                      If however you feel the issue is with the GPU engine itself, please substantiate it with a few more details, so the devs can look into it.
                      Lele
                      Trouble Stirrer in RnD @ Chaos
                      ----------------------
                      emanuele.lecchi@chaos.com

                      Disclaimer:
                      The views and opinions expressed here are my own and do not represent those of Chaos Group, unless otherwise stated.



                      • #12
                        Yes I understand one cannot expect to just throw a scene built for CPU at the GPU and expect it to work. Thanks for the information you've provided. I will indeed try to re-build a scene during some spare time to see just how useful it might be in my case.
                        James Burrell www.objektiv-j.com
                        Visit my Patreon patreon.com/JamesBurrell



                        • #13
                          Thanks for the input, Lele. I will add that drivers are very important for V-Ray 5, update 2: you will need the 496 Game Ready drivers for best performance, and they also solve many issues.
                          You will find the latest recommended driver at the top of the GPU page here

                          Best,
                          Muhammed
                          Muhammed Hamed
                          V-Ray GPU product specialist


                          chaos.com



                          • #14
                            It would be great if the V-Ray plugin showed a direct notification when a new recommended driver is available.
                            www.simulacrum.de ... visualization for designer and architects



                            • #15
                              Originally posted by Muhammed_Hamed View Post
                              Thanks for the input, Lele. I will add that drivers are very important for V-Ray 5, update 2: you will need the 496 Game Ready drivers for best performance, and they also solve many issues.
                              You will find the latest recommended driver at the top of the GPU page here

                              Best,
                              Muhammed
                              Why the Game Ready driver? I thought the Studio drivers were built specifically for V-Ray?
                              https://linktr.ee/cg_oglu
                              Ryzen 5950, Geforce 3060, 128GB ram
