GPU render slower and different from CPU render

  • GPU render slower and different from CPU render

    Hi all,

    I recently started testing GPU rendering, so pardon my questions; they come from a rookie.

    My test scene is all interior lighting. I have only rectangular V-Ray lights in the ceiling to illuminate everything. I know it is only GI, so it is hard to render and takes a long time. I decided to do a test with V-Ray RT in production mode since it should usually be faster. To my surprise it took three times longer, it is darker, the lights in the ceiling look very different, and the materials look different too.

    With this scene downloaded from the web, however, the rendering time dropped from 45 min to 7 min! Huge increase in performance.
    http://www.archvizscenes.com/content...rner-for-vray/

    Any advice? My guess is that since V-Ray RT uses brute force, it is slower in interior scenes, and you only notice its true power in exterior/sky scenes without a lot of bounces? Is it hard to beat the times of IR+LC?

    Thanks all

    CPU render:
    [attachment: CPU.jpg]

    GPU render:
    [attachment: GPU.jpg]
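
    In case it helps with troubleshooting, a minimal pymxs sketch like the one below can confirm which production renderer is actually assigned before each timing run (just a sketch; the exact V-Ray class name it reports depends on the installed version):

    # Minimal sketch: print the class of the assigned production renderer (run inside 3ds Max).
    # Assumes the pymxs module is available (3ds Max 2017+); V-Ray class names vary by version.
    from pymxs import runtime as rt

    print("Production renderer:", rt.classOf(rt.renderers.current))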

  • #2
    Hi,

    I can shoot in the dark with some questions, but it can take a while to get to the bottom of it that way. Can you send me the scene? peter.matanov@chaosgroup.com
    If it was that easy, it would have already been done

    Peter Matanov
    Chaos

    • #3
      Slizer,
      I cannot share the scene unfortunately; it would get me in trouble. Can you please share your thoughts anyway? I will try to troubleshoot following your advice. All lights in the scene are V-Ray rectangular lights with 8 subdivs. Materials are mostly standard V-Ray materials with just diffuse + bump layers. I am using a 3ds Max free camera, not a V-Ray physical camera. Here are the other settings used:


      [attachments: 1.png, 2.png]

      Thank you!

      • #4
        Since your CPU render is full of splotches and your GPU render is fairly pristine, I wouldn't call it a fair comparison in rendering times. Crank up the biasing in your CPU rendering until it is as clean as your GPU render and then let's see how they compare. And by the way, what are you using as a GPU versus your CPU processor?

        Also, in your GPU rendering, are you using the Light Cache/Brute Force option?

        -Alan

        • #5
          Originally posted by Alan Iglesias View Post
          Since your CPU render is full of splotches and your GPU render is fairly pristine, I wouldn't call it a fair comparison in rendering times. Crank up the biasing in your CPU rendering until it is as clean as your GPU render and then let's see how they compare. And by the way, what are you using as a GPU versus your CPU processor?

          Also, in your GPU rendering, are you using the Light Cache/Brute Force option?

          -Alan
          Like Alan said, a fair comparison can be done only if you are using the same settings for both CPU and GPU renders - BF/LC vs BF/LC or BF/BF vs BF/BF, with the same noise threshold.
          If it was that easy, it would have already been done

          Peter Matanov
          Chaos

          • #6
            Thank you Alan and Slizer for your responses. Fair enough. As I mentioned, in the first test I just switched the rendering engine to GPU without tweaking any values. Now that you brought it to my attention, I saw that the default noise threshold was 0.001 for the GPU and 0.01 for the CPU. This was indeed the reason why it took longer. I ran a new test with the exact same settings for BF+LC, and the result is double the speed for the GPU!
            I am using a GTX 1080 vs a Xeon E5-1650 v4.
            There is still some difference in the shadows, but maybe ambient occlusion was disabled for the GPU. I am pretty sure that was the cause, but I didn't check.
            Here are the results:

            CPU vs GPU:
            [attachments: 1-cpu.jpg, 2-gpu.jpg]
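
            For anyone repeating the comparison, the relevant values can also be checked from the MAXScript listener with a pymxs sketch like this one; the property names differ between the CPU and GPU builds, so it just dumps whatever the assigned renderer exposes rather than hard-coding them:

            # Rough sketch: list every exposed setting of the assigned renderer so the CPU and GPU
            # noise thresholds / GI engines can be matched before timing anything.
            from pymxs import runtime as rt

            r = rt.renderers.current
            rt.showProperties(r)  # prints each property name and current value to the listener

            # Reading one value once the real name is known (hypothetical property name):
            # print(rt.getProperty(r, "noise_threshold"))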

            • #7
              Yes, ambient occlusion is not supported on GPU.

              Those black squares above the servers and the one to the right of the door worry me; can you give some info on what is there geometry-wise?
              If it was that easy, it would have already been done

              Peter Matanov
              Chaos

              • #8
                Originally posted by slizer View Post
                Yes, ambient occlusion is not supported on GPU.

                Those black squares above the servers and the one to the right of the door worry me; can you give some info on what is there geometry-wise?
                It is not very clean geometry. The models were exported from Revit to 3ds Max as DWG files. The square near the door should be an electrical outlet, simplified as a box. The other ones are standard meshes coming from Revit as DWGs. Maybe the normals are flipped, or there are other anomalies. I'm pretty sure it is not connected to V-Ray whatsoever : )

                Thanks again for your feedback Slizer.
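
                If it does turn out to be flipped normals, a rough pymxs sketch along these lines could unify them on the suspect boxes (the selection is just a placeholder; you would pick the DWG-imported meshes first):

                # Rough sketch: unify normals on the selected imported meshes (run inside 3ds Max).
                # Select the suspect black boxes first; nothing here is specific to V-Ray.
                from pymxs import runtime as rt

                for obj in rt.selection:
                    rt.addModifier(obj, rt.Normalmodifier(unify=True))  # Normal modifier with Unify on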

                • #9
                  ...This was indeed the reason why it took longer. I ran a new test with the exact same settings for BF+LC, and the result is double the speed for the GPU!
                  Nice!

                  Now add more GPUs! And don't forget that the Denoiser works great with RT. Makes interiors even faster!

                  Have fun,

                  -Alan
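
                  If you'd rather script it than click through the Render Elements tab, something like this should add the denoiser element (a sketch only; it assumes the element class is exposed as VRayDenoiser in your build):

                  # Sketch: add a V-Ray Denoiser render element via pymxs (run inside 3ds Max).
                  # Assumes the element class is exposed as VRayDenoiser in the installed V-Ray build.
                  from pymxs import runtime as rt

                  mgr = rt.maxOps.GetCurRenderElementMgr()  # current render element manager
                  mgr.AddRenderElement(rt.VRayDenoiser())   # added with defaults; tweak strength afterwards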

                  • #10
                    Originally posted by Alan Iglesias View Post
                    Nice!

                    Now add more GPUs! And don't forget that the Denoiser works great with RT. Makes interiors even faster!

                    Have fun,

                    -Alan
                    Thanks for the support, Alan! Out of curiosity (it is not my intention to do so), can you use two different series of GPUs, like a 1080 + 980 Ti?
                    Peace...
                    Last edited by sheehan_partners_nm; 02-11-2016, 04:47 PM.

                    • #11
                      Originally posted by sheehan_partners_nm View Post
                      Thanks for the support, Alan! Out of curiosity (it is not my intention to do so), can you use two different series of GPUs, like a 1080 + 980 Ti?
                      Peace...
                      If the GPUs are the same architecture (Fermi, Kepler, Maxwell, etc.), then I would certainly think so, but y'know, I have not tried mixing architectures myself. I'm thinking other folks are pulling it off, but hey - is anyone else out there with actual experience who could give this person a hand?

                      Peace right back at you,

                      -Alan
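
                      For what it's worth, a quick sanity check that the driver sees both cards before pointing RT at them (just a sketch; it assumes the nvidia-smi tool that ships with the NVIDIA driver is on the PATH):

                      # Sketch: list the CUDA-capable GPUs the NVIDIA driver reports.
                      import subprocess

                      print(subprocess.run(["nvidia-smi", "-L"], capture_output=True, text=True).stdout)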

                      • #12
                        There shouldn't be any problem mixing those two.
                        If it was that easy, it would have already been done

                        Peter Matanov
                        Chaos
