
V-ray 3.5501 RT/GPU render issue


  • V-ray 3.5501 RT/GPU render issue

    Guys, I am asking the pros for support. I am currently testing V-Ray RT GPU and I'm not sure what I am doing wrong. When rendering with the GTX 980 Ti, the scene takes a very long time to load, renders very slowly, sometimes pauses mid-process, and at times doesn't render at all. The max file size is only 29.964 MB.

    Troubleshooting
    Resized the textures with the V-Ray texture resize option
    Resized the HDRI to about 3K
    Converted meshes such as the sofa and bed to V-Ray proxies
    Swapped the video cards between different PCIe slots on the motherboard

    Questions:
    Should I just convert the textures in Photoshop? (What texture size is recommended for a 2K-4K render?)
    Should I upgrade the video card?
    Or should I just use the CPU only?
    Would overclocking the GPU help?

    I have done a single-car rendering; the speed and load time were OK. Just wanted to put that info out there as well.
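A quick way to sanity-check whether the textures are straining the 980 Ti's 6 GB is to estimate their uncompressed VRAM footprint, since GPU renderers generally upload bitmaps uncompressed. A rough generic sketch (not V-Ray-specific; the one-third mip-chain overhead is an assumption):

```python
def texture_vram_mib(width, height, channels=4, bytes_per_channel=1, mipmaps=True):
    """Rough uncompressed VRAM footprint of one texture, in MiB.

    GPU renderers typically store bitmaps uncompressed; a full mip
    chain adds roughly one third on top of the base level.
    """
    size = width * height * channels * bytes_per_channel
    if mipmaps:
        size = size * 4 // 3
    return size / (1024 * 1024)

# A single 4K RGBA texture already costs roughly 85 MiB:
print(round(texture_vram_mib(4096, 4096), 1))  # 85.3
```

A few dozen 4K maps plus geometry can exhaust a 6 GB card long before the .max file size suggests any trouble, which is why resizing textures matters more than the scene file size.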


    Render engine: V-Ray nightly 3.5501
    3D program: 3ds Max 2017

    Computer specs
    Motherboard: Z10PE-D16 WS
    CPU: 2 Xeon E5-2683 V3 - (water cool)
    Video card: Nvidia GTX 980 Ti (6 GB VRAM) - for RT GPU rendering
    Video card: Nvidia Quadro 6000 (6 GB VRAM) - for viewport performance
    RAM: 2 x 16 GB Samsung = 32 GB
    Drive: SAMSUNG 850 EVO 2.5" 250GB SATA
    Operating system: Windows 10 Pro

    Thank you all in advance!
    Thank you,
    Garth

  • #2
    Hmm... this sounds strange. Can you post a screenshot of a scene affected by this problem so we can see how complex it is? Did you render with the default render settings? Did you choose CUDA as the engine? What happens if you apply a clay material to everything? Did you try different drivers? I use V-Ray 3.5 Beta 1 with a single GTX 980 (non-Ti) quite often and have not noticed any problems, even with more complex scenes, as long as they fit in memory. I've rendered cars and some moderately complex interiors and exteriors without issues. Does this slowness still appear in a very simple scene, such as a teapot with a single V-Ray light?
    Max 2023.2.2 + Vray 6 Update 2.1 ( 6.20.06 )
    AMD Ryzen 7950X 16-core | 64GB DDR5 RAM 6400 Mbps | MSI GeForce RTX 3090 Suprim X 24GB (rendering) | GeForce GTX 1080 Ti FE 11GB (display) | GPU Driver 546.01 | NVMe SSD Samsung 980 Pro 1TB | Win 10 Pro x64 22H2



    • #3
      Originally posted by garth100
      The scene takes a very long time to load, renders very slowly, sometimes pauses mid-process, and at times doesn't render at all. The max file size is only 29.964 MB.
      Would it be possible to send us that scene to support@chaosgroup.com?
      Please include a link to this thread, and let us know whether other scenes have the same issue or this is the only one.
      Is there any difference in the export times if you disable texture resizing, or if you disable the textures completely?

      Originally posted by garth100
      Should I just convert the textures in Photoshop? (What texture size is recommended for a 2K-4K render?)
      You can do that, of course; it should save some export time, but we can't predict how much faster it would be, so it might be a good idea to test it.

      Originally posted by garth100
      Should I upgrade the video card?
      Faster cards will certainly help reduce render times, but it would be better to investigate the scene first, since there might be a bug in the software.
      Since you already have two cards, a GTX and a Quadro - did you try rendering the scene with both of them?

      Originally posted by garth100
      Or should I just use the CPU only?
      It is a good idea to test how this scene renders on the CPU. Check the export and render times and compare them with those from the GPU.

      Originally posted by garth100
      Would overclocking the GPU help?
      It depends on how stable the GPU is after overclocking. I don't have much overclocking experience and don't know how much faster the card could get, but I suppose every percent gained would help reduce render times.
      Svetlozar Draganov | Senior Manager 3D Support | contact us
      Chaos & Enscape & Cylindo are now one!



      • #4
        Thank you, guys.
        I will do some more testing and upload the file for further testing later today.
        Thank you all!
        Thank you,
        Garth



        • #5
          Keep testing.

          I have completely moved to GPU rendering because of the magnitude of render speed I get over the CPU. Sorry, I don't have an idea about your specific issue, but what I do is isolate things: test render in simple scenes, take objects out, put them into a clean file, and test. GPU rendering is a delicate dance on eggshells, more so than using V-Ray Adv. It requires really knowing everything that is going on and becoming a master at it (not that I am). If you don't understand something, take the time to figure out what it is; you will be glad you did.

          Use ActiveShade for real-time work and scene building, do production renders with low noise settings (like 0.1 or 0.05), and region render the areas you need to fix. Try rebuilding some things with your textures, test the render and load times, and try to figure out where the problem is.

          The GPU engine, I believe, approaches things from a totally different angle than the CPU engine: to the best of my knowledge, textures are baked in some cases and handled differently.

          You are on the right track, once you get the knowledge down for your area, you can get things done very quickly. I hope this helps.

          One other thing: I found I have to approach building a scene from the GPU rendering perspective from the very beginning, because there are things I cannot use in GPU mode or have to do differently than in CPU rendering. I have found I cannot take a scene built and rendered with V-Ray Adv and expect it to work in RT GPU CUDA mode. That is why building a scene from scratch while rendering along the way, saving, reloading, etc., all helps you become a master (or at least your own master) at GPU rendering.

          I haven't used V-Ray proxies; my HDRIs are 10k x 5k and load and render fine; my largest max file is 150 MB and it loads fine. Overclocking my GPUs (2x 980 Ti Classifieds with custom-flashed BIOSes) gained me about 20% speed (I actually got about 25%, but the temps and fans were running high, so I backed off a bit).
          Last edited by biochemical_animations; 11-01-2017, 07:42 AM.



          • #6
            Totally understood. I will try to record and upload a video of my progress.

            I really think GPU is the way to go, due to the speed and quality. I overclocked the GPU last night from 1200 MHz to 1500 MHz. I tried rendering a simple car scene on the GPU; the render time was under a minute (crazy fast). More updates to come, though.
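For what it's worth, a core overclock bounds the possible gain: render speedup scales at most linearly with the clock, and only for compute-bound scenes (memory-bound scenes gain less). A trivial sketch of that upper bound:

```python
def max_clock_speedup(base_mhz, oc_mhz):
    """Upper-bound speedup from a core overclock, assuming the
    workload is fully compute-bound (memory-bound scenes gain less)."""
    return oc_mhz / base_mhz

print(max_clock_speedup(1200, 1500))  # 1.25, i.e. at most ~25% faster
```

That lines up with the roughly 20-25% gain reported earlier in the thread for overclocked 980 Tis.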

            Thank you!!
            Thank you,
            Garth



            • #7
              My apologies for the delay. I have made some modifications to the scene. The scene did render, but I find it slow. (Just a test.) The same scene loaded and rendered fast with FStorm, so maybe I am missing a step at this point. I did discover that the GPU renders simple scenes very fast; I guess GPU rendering is not for everybody. Please see the attached images and video.

              Resolved issues:
              Unsupported materials: CG-Source MultiTexture map and Color Correction

              Additional corrections:
              Resized the HDRI to 2K
              Removed unnecessary maps
              Resized the materials; the total map/texture size is 81.3 MB
              Attached Files
              Last edited by garth100; 16-01-2017, 01:15 PM.
              Thank you,
              Garth



              • #8
                Thank you,
                Garth



                • #9
                  Have you tried the TriPlanar texture? I've had to put an unsupported BerconMaps procedural inside it to get it to (sort of) work, and it worked well enough. However, if I went too crazy with the layers, it stopped working.



                  • #10
                    Originally posted by biochemical_animations
                    Have you tried the TriPlanar texture? I've had to put an unsupported BerconMaps procedural inside it to get it to (sort of) work, and it worked well enough. However, if I went too crazy with the layers, it stopped working.
                    The TriPlanar texture on the GPU does not support procedural textures.
                    It works with the Bercon map because we try to bake unsupported procedurals to bitmaps.
                    Best,
                    Blago.
                    V-Ray fan.
                    Looking busy around GPUs ...
                    RTX ON



                    • #11
                      Currently I am running Max 2017, and I don't think BerconMaps is compatible with Max 2017. I will remove the TriPlanar and try again.

                      Question: if V-Ray RT detects unsupported materials/textures, will it still attempt to render, and in turn use up a lot of memory trying to calculate the scene?
                      Thank you,
                      Garth



                      • #12
                        Originally posted by garth100
                        Question: if V-Ray RT detects unsupported materials/textures, will it still attempt to render, and in turn use up a lot of memory trying to calculate the scene?

                        If V-Ray GPU detects an unsupported texture, it prints a warning about that in the log and tries to bake the texture to a bitmap.

                        Best,
                        Blago.
                        V-Ray fan.
                        Looking busy around GPUs ...
                        RTX ON
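The baking behaviour described above can be illustrated with a toy sketch (not V-Ray code; the checker function is a hypothetical stand-in for an unsupported procedural): the renderer samples the procedural over a UV grid and keeps the resulting pixels as an ordinary bitmap it can upload to the GPU.

```python
def checker(u, v, tiles=8):
    """Hypothetical procedural texture: an 8 x 8 checker in UV space."""
    return 255 * ((int(u * tiles) + int(v * tiles)) % 2)

def bake_to_bitmap(procedural, size=64):
    """Sample a procedural at each texel centre of a size x size grid,
    producing raw 8-bit rows that could be uploaded as a bitmap."""
    return [[procedural((x + 0.5) / size, (y + 0.5) / size)
             for x in range(size)]
            for y in range(size)]

pixels = bake_to_bitmap(checker)
```

The trade-off is that the bitmap is a fixed-resolution snapshot, so very fine procedural detail can be lost, which matches the earlier report that the Bercon map only "sort of" worked.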



                        • #13
                          Got it. Thanks!
                          Thank you,
                          Garth

