GPU/CPU comparison

  • GPU/CPU comparison

    Hello,

    I thought I would take my new cards for a spin. I rendered a simple indoor scene in 4K: HDRI dome, portal lights and IES pot lights.

    i7-6800K 3.4 GHz - 10h 16m 47.2s
    2 x GTX 1080 8GB - 3h 50m 00.0s (roughly)

    I used the default render settings in both cases with a noise threshold of 0.005 (V-Ray 3.50.04, Max 2017, Windows 10).

    A few things I noticed (GPU left, CPU right):

    The stainless steel shader looks very different, particularly in the reflections, which seem noisier. The GPU render is cooler and appears quite a bit brighter, and the IES shadows are also blurrier on the GPU.

    Also, I thought images like this would render much faster on two GTX 1080s. Am I doing something wrong?

    I know the two renderers are meant to be different. What are the reasons for these differences?
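
    For reference, a rough sketch (Python, purely illustrative arithmetic rather than any V-Ray benchmark) of the speedup those two timings work out to:

        # Speedup implied by the timings above (illustration only).
        cpu_seconds = 10 * 3600 + 16 * 60 + 47.2  # i7-6800K: 10h 16m 47.2s
        gpu_seconds = 3 * 3600 + 50 * 60          # 2 x GTX 1080: ~3h 50m

        print(f"GPU render is roughly {cpu_seconds / gpu_seconds:.1f}x faster")  # ~2.7x
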
    Attached Files

  • #2
    Is it possible to provide me with that scene (you can send it to vlado@chaosgroup.com)? Then I'll be able to tell you more.

    Best regards,
    Vlado
    I only act like I know everything, Rogers.



    • #3
      Originally posted by vlado View Post
      Is it possible to provide me with that scene (you can send it to vlado@chaosgroup.com)? Then I'll be able to tell you more.

      Best regards,
      Vlado
      I have sent the Google Drive link to your email. Thank you.



      • #4
        Any luck with this file?



        • #5
          I forwarded it to our QA guys to check it out, will let you know as soon as there is some news.

          Best regards,
          Vlado
          I only act like I know everything, Rogers.



          • #6
            Hi,

            To start, I want to point out that V-Ray CPU and V-Ray GPU are two different engines, and as much as we want them to produce the same output, in some setups that will not be possible.

            In your scene the biggest difference between the two renders is in the lighting. The cause is the light portals, which are calculated very differently on the GPU and the CPU. Setting them to "Simple" will bring the results closer, but they will never be identical. There is also a small bug in the textured VRayLightMtl which also contributes to the difference in the lighting.

            V-Ray GPU handles textures a little differently than the CPU engine; that's why the bump in the steel shader looks different. If you want it to look the same and still render with "Full size textures", you can resize the texture manually to something smaller (e.g. 256x256).
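
            If you would rather script that resize than do it by hand, here is a minimal sketch (Python with Pillow, which is an assumption on my part; the file names are just placeholders):

                from PIL import Image  # Pillow is assumed to be installed

                # Hypothetical bump texture path; substitute the real file.
                img = Image.open("steel_bump.jpg")

                # Downscale to 256x256 as suggested above and save a copy,
                # leaving the original full-size texture untouched.
                img.resize((256, 256), Image.LANCZOS).save("steel_bump_256.jpg")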
            If it was that easy, it would have already been done

            Peter Matanov
            Chaos



            • #7
              Well, there go my hopes for CPU/GPU every other frame when rendering animations. I guess I could cross-dissolve the two halves of the sequence. Conceptually, would it be possible, say in 10 years, to render every other pixel CPU/GPU?
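
              (If anyone tries the cross-dissolve idea, a minimal sketch of blending an overlapping range of frames, assuming Python with Pillow and hypothetical frame paths, might look like this:)

                  from PIL import Image  # Pillow assumed

                  # Hypothetical overlap: both engines render frames 100-119,
                  # and we fade from the CPU frames to the GPU frames.
                  frames = list(range(100, 120))
                  for i, n in enumerate(frames):
                      cpu = Image.open(f"cpu/frame_{n:04d}.png")
                      gpu = Image.open(f"gpu/frame_{n:04d}.png")
                      alpha = i / (len(frames) - 1)  # 0.0 -> 1.0 across the overlap
                      Image.blend(cpu, gpu, alpha).save(f"mix/frame_{n:04d}.png")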



              • #8
                Originally posted by iancamarillo View Post
                Well, there go my hopes for CPU/GPU every other frame when rendering animations. I guess I could cross-dissolve the two halves of the sequence. Conceptually, would it be possible, say in 10 years, to render every other pixel CPU/GPU?
                Currently, mixing CPU and GPU rendering for animation will definitely not work.
                However, as long as you are okay with having the CPU render match the GPU one (and not the other way around), I think it will be available much (much) sooner.

                Best,
                Blago.
                V-Ray fan.
                Looking busy around GPUs ...
                RTX ON



                • #9
                  Cool, glad to hear that.



                  • #10
                    Originally posted by slizer View Post
                    Hi,

                    To start, I want to point out that V-Ray CPU and V-Ray GPU are two different engines, and as much as we want them to produce the same output, in some setups that will not be possible.

                    In your scene the biggest difference between the two renders is in the lighting. The cause is the light portals, which are calculated very differently on the GPU and the CPU. Setting them to "Simple" will bring the results closer, but they will never be identical. There is also a small bug in the textured VRayLightMtl which also contributes to the difference in the lighting.

                    V-Ray GPU handles textures a little differently than the CPU engine; that's why the bump in the steel shader looks different. If you want it to look the same and still render with "Full size textures", you can resize the texture manually to something smaller (e.g. 256x256).
                    Great, thanks for the clarification.

                    In the interest of faster look development and scene setup, we are looking to upgrade our workstations. But if we purchase multiple GPUs, our CPU render servers would become obsolete, as the final frames would render differently.

                    If this is the case, then it makes sense to invest in faster CPUs for our workstations.



                    • #11
                      Not only that, but GPUs burn out way faster than CPUs do. My machine is 9 years old and still running great on the original CPUs and heatsinks. I've been through 3 burned-out cards and had to RMA another, simply because graphics cards don't seem to last very long at all. (Yes, I checked my rails and power usage, and even replaced a power supply just to be sure.)

                      GPUs just don't last very long.



                      • #12
                        Originally posted by Deflaminis View Post
                        Not only that, but GPUs burn out way faster than CPUs do. My machine is 9 years old and still running great on the original CPUs and heatsinks. I've been through 3 burned-out cards and had to RMA another, simply because graphics cards don't seem to last very long at all. (Yes, I checked my rails and power usage, and even replaced a power supply just to be sure.)

                        GPUs just don't last very long.
                        Well, that's not very reassuring; I wasn't aware they burned out.



                        • #13
                          Maybe my machine is just cursed, even though I tested everything I could think of and had Supermicro come out and look at the box; their opinion was that everything was fine. I put in some extra cooling and everything.

                          My wife is also in 3D, and we've had to replace 2 of her cards in the past 5 years. Granted, they are all cheap $300 GTX cards, but still... it seems a pain. My processors have been going strong all this time.

                          I know there are some cool advantages to GPU rendering, but dollar for dollar I'm just not seeing the benefit over CPU rendering. This is just my experience; to each their own. I don't think those GPUs are really meant to run at 100% for days at a time, no matter what the manufacturers say.



                          • #14
                            Originally posted by Warifus View Post
                            Great, thanks for the clarification.

                            In the interest of faster look development and scene setup, we are looking to upgrade our workstations. But if we purchase multiple GPUs, our CPU render servers would become obsolete, as the final frames would render differently.

                            If this is the case, then it makes sense to invest in faster CPUs for our workstations.
                            We are already working in the direction of keeping CPUs useful if users decide to switch to GPUs.

                            Originally posted by Deflaminis View Post
                            Not only that, but GPUs burn out way faster than CPUs do. My machine is 9 years old and still running great on the original CPUs and heatsinks. I've been through 3 burned-out cards and had to RMA another, simply because graphics cards don't seem to last very long at all. (Yes, I checked my rails and power usage, and even replaced a power supply just to be sure.)

                            GPUs just don't last very long.
                            Maybe a faulty power supply?
                            If it was that easy, it would have already been done

                            Peter Matanov
                            Chaos



                            • #15
                              That's what I thought too, but I think the problem is that the card fans are driven by the GPU core temperature and not by the memory temperature. Extended usage seems to just burn the cards out. Maybe I'm cursed, but the power supply was the first thing I ruled out.

                              EDIT: My wife and I have the same base machine, so perhaps it could be a design flaw.
                              Last edited by Deflaminis; 28-03-2017, 01:07 AM.
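
                              (For anyone who wants to keep an eye on card temperatures during long renders, here is a minimal sketch, assuming an NVIDIA card with nvidia-smi on the PATH; note that GeForce cards of this era generally don't report a memory temperature, which is part of the problem described above:)

                                  import subprocess, time

                                  # Poll the GPU core temperature every 30 seconds via nvidia-smi.
                                  while True:
                                      temp = subprocess.run(
                                          ["nvidia-smi", "--query-gpu=temperature.gpu",
                                           "--format=csv,noheader"],
                                          capture_output=True, text=True, check=True,
                                      ).stdout.strip()
                                      print(f"GPU core temperature(s): {temp} C")
                                      time.sleep(30)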

