Different machines, same GPU, different GPU results?


  • Different machines, same GPU, different GPU results?

    I’m curious as to what is causing the different GPU rendering results. I’m running a bunch of tests on various GPUs (3090, 3080s, 2080s, SLI, etc.):

    1. I assembled a newer computer, my first AMD in over 12 years. It’s an AM4 5900X on a new motherboard that supports PCIe 4.0, with 32 GB of RAM (newer RAM; I forget which kind).
    2. My older computer is from 2018 with top-of-the-line Intel hardware, PCIe 3.0 max, and 64 GB of RAM. Still very good hardware.
    3. Both run the exact same OS and NVIDIA Studio drivers.
    4. The older computer with an EVGA 2080 Ti Black (12 GB onboard) scores around 1200; the newer computer hits 1361 with the same card.

    I understand the newer machine has different and faster underlying hardware, but the 2080 supports PCIe 3.0 at most, and I would think that since most of the processing occurs on the GPU itself, the speeds should be near identical. The only thing I can think of is that the newer computer is less bottlenecked somewhere? Other than read/write speed to system RAM and a newer NVMe drive, what could cause this? I didn’t think those made that much of a difference. I have yet to test the higher-end GPUs, which can take advantage of PCIe 4.0.
    LunarStudio Architectural Renderings
    HDRSource HDR & sIBL Libraries
    Lunarlog - LunarStudio and HDRSource Blog

  • #2
    Originally posted by jujubee View Post
    4. The older computer with an EVGA 2080 Ti Black (12 GB onboard) scores around 1200; the newer computer hits 1361 with the same card.
    Hi jujubee

    This is fairly normal and expected.

    The main factor here is the different CPUs: the 5900X is able to boost to nearly 5 GHz on one or two cores, and the 5900X and 5950X are exceptional CPUs in single-threaded performance, which directly impacts your GPU scores.
    Good single-threaded performance actually affects every benchmark on the system, from opening 3ds Max to things like gaming and NVMe drive performance. 3ds Max itself will be much faster as well, since many of the modifiers are single-threaded.

    Another factor is system memory: Ryzen CPUs will offer even better performance if you have a good, fast memory kit. CL14 memory offers exceptional performance but can be expensive.
    This doesn't happen with Intel CPUs; you can run the most expensive memory kit on Intel and barely notice a difference.

    The difference in scores doesn't have to do with PCIe bandwidth; V-Ray GPU in the Benchmark tool will perform the same on PCIe Gen 3 and Gen 4.
    PCIe Gen 3 x4, x8, and x16 are all fine, and PCIe Gen 4 x4, x8, and x16 will all perform the same in the benchmark tool. With a heavy scene in V-Ray GPU you will start noticing a difference in the time it takes to load the scene into GPU memory, but the rendering itself will be mostly the same.
    You can verify this by going into your BIOS and setting your PCIe link to Gen 3; you will need a good and probably expensive board for this to be an accurate test.
    You can also force a specific link, like Gen 4 x4 or Gen 4 x8.
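
    If you'd rather not touch the BIOS, you can also check what link each card is actually negotiating with the nvidia-smi tool that ships with the driver, with something like:

    nvidia-smi --query-gpu=index,name,pcie.link.gen.current,pcie.link.gen.max,pcie.link.width.current,pcie.link.width.max --format=csv

    Keep in mind that cards drop to a lower link state at idle, so read the values while a render is running.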

    You need to keep in mind that PCIe bandwidth becomes very important for Chaos Vantage, especially if you use multiple cards. You will need PCIe Gen 4 there to avoid any bottleneck.

    Cooling could also be part of the difference in scores if you install the GPUs in different cases, but that is a different topic.

    I have access to all of the GPUs you listed, so if you notice something odd in your testing, let me know and I can double-check the results.

    Best,
    Muhammed
    Muhammed Hamed
    V-Ray GPU product specialist


    chaos.com



    • #3
      So many different factors. I didn't realize that system RAM and the processor still had such a major impact. Thank you for explaining.
      On AM4, do you see much difference between CL14 and CL16 when it comes to performance, or is that hardly noticeable? I have CL16 and am wondering now if it is better to upgrade.

      As you said, I'm noticing my benchmarks are mostly linear, but I've only been selecting V-Ray GPU since a benchmark doesn't exist for Vantage.


      So here's my big question when it comes to Vantage, which maybe you can answer since I will probably sell some cards. Am I better off with:

      1. The older Intel Z390 i9-9900 (LGA 1151) with a 3090 and a 3080 (or another 3090) running at PCIe 3.0, with PC4-17000 memory. I realize the 3080 bottlenecks the memory of the 3090, but I suppose the 3080 could always be disabled should a scene require it.

      2. The newer AM4 board with a 5900X and a 3090 at PCIe 4.0, with CL16 DDR4-3600. I believe if I try to add a card to the additional PCIe slot it runs down to x4 and reduces the available lanes, which I think is where Intel boards come out ahead.



      • #4
        Originally posted by jujubee View Post
        do you see much difference between CL14 and CL16 when it comes to performance, or is that hardly noticeable? I have CL16 and am wondering now if it is better to upgrade.
        3600 at CL16 is great; no need to upgrade unless you find a deal on CL14 memory, maybe. Those kits are quite hard to find.
        CL14 3200 will be around 7 to 10% faster or so.
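
        As a rough back-of-envelope check (ignoring Infinity Fabric clocks and subtimings), the first-word latency of the two kits lands in the same ballpark, which is part of why 3600 CL16 already holds up well:

        first-word latency (ns) ≈ 2000 × CL / data rate (MT/s)
        DDR4-3600 CL16: 2000 × 16 / 3600 ≈ 8.9 ns
        DDR4-3200 CL14: 2000 × 14 / 3200 ≈ 8.75 ns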


        Originally posted by jujubee View Post
        So here's my big question when it comes to Vantage, which maybe you can answer since I will probably sell some cards
        Second option for sure!
        The 5900X is much, much better than a 9900; the single-threaded performance will make a big difference. AM4 overall is a much better and more future-proof platform.
        And PCIe Gen 4 alone is more than enough reason; PCIe Gen 3 x8 doesn't cut it for Vantage.

        If you can get a second GPU for your monitors and viewport performance, so you keep the 3090 for rendering, that will be helpful.
        3ds Max and Windows will take 2-3 GB of memory on the card connected to your monitors.
        This could be a 3060, for example; it will be more than enough for viewport performance.


        Originally posted by jujubee View Post
        I believe if I try to add a card to the additional PCIe slot it runs down to x4 and reduces the available lanes, which I think is where Intel boards come out ahead.
        Nope, you can add your 3090 to the first PCIe slot; it will run at x16 PCIe Gen 4.
        Then add the second GPU to the third PCIe slot (where you will connect your monitors); it will run at x8 PCIe Gen 4 (this bandwidth comes from the X570 chipset). This is much better than what Intel can do; I have this setup right now with 2x 3090s.
        One motherboard I recommend for this is the ASUS WS X570 PRO ACE, because the third slot runs at x8.

        This really depends on the board; most X570 boards will have the third PCIe slot running at x4 PCIe Gen 4, and this bandwidth is shared with other devices (so make sure you install only one M.2 NVMe drive).

        If you use the first and second PCIe slots, then you will run at x8/x8 PCIe Gen 4. This is fine as well, but the issue then is thermals.

        Best,
        Muhammed



        • #5
          Originally posted by Muhammed_Hamed View Post
          The 5900X is much, much better than a 9900; the single-threaded performance will make a big difference. AM4 overall is a much better and more future-proof platform.
          And PCIe Gen 4 alone is more than enough reason; PCIe Gen 3 x8 doesn't cut it for Vantage.
          I had a 3090 and a 2080 in the AM4 board and it ran decently.
          The minute I tried a 3080 and 3080 in the same machine, it wouldn't boot.
          The PSU is 1200W which should've been enough, although I just learned (after a massive crash and a chat with EVGA tech) that it's wiser to use three separate PCIe power cables for the 3090 instead of 2. So maybe I can get it working again.

          I still had better benches (V-Ray Bench) from my older i9 board, as I was able to use a 3090 and a 3080 in the same PC.

          Originally posted by Muhammed_Hamed View Post
          One motherboard I recommend for this is the ASUS WS X570 PRO ACE, because the third slot runs at x8.
          I wish I had known this before making my purchase. Does that also clock down with a secondary NVMe?
          I had trouble finding new cards and was able to get cards through buying motherboard combos...



          • #6
            Originally posted by jujubee View Post
            The minute I tried a 3080 and 3080 in the same machine, it wouldn't boot.
            The PSU is 1200W which should've been enough
            Did you get any errors on the debug LEDs when it didn't boot? FYI, this is not expected to happen; pretty much any decent AM4 B550/X570 board is able to house 2x 3090s, so it could be something basic to solve.
            1200W is more than enough for 2x 3090s.


            Originally posted by jujubee View Post
            that it's wiser to use three separate PCIe power cables for the 3090 instead of 2
            As long as you are using the power supply's native cables, you should be all good. Usually one 8-pin cable from the power supply splits into 2x 8-pins; that is totally fine and shouldn't cause any issues.
            Some 3090s require 3x 8-pins, like the FTW3 or Kingpin cards from EVGA. In that case you can use one cable that splits into 2x 8-pins, then another 8-pin cable for the third connection.
            Cheaper 3090s use only 2x 8-pins, and it is safe to use one native 8-pin cable that splits into 2x 8-pins for those.

            But don't use your own splitter cables after the native power supply cable; this is where you would get crashes and instability.


            Originally posted by jujubee View Post
            Does that also clock down with a secondary NVMe?
            Yes, the third slot on these AM4 boards gets its bandwidth from the X570 chipset (not directly from the CPU),
            so that bandwidth is shared with other devices. In the case of the X570 PRO ACE it is x8 Gen 4 shared, instead of the x4 Gen 4 you get on most other boards.

            Best,
            Muhammed



            • #7
              Originally posted by Muhammed_Hamed View Post
              Did you get any errors on the debug LEDs when it didn't boot? FYI, this is not expected to happen; pretty much any decent AM4 B550/X570 board is able to house 2x 3090s, so it could be something basic to solve.
              1200W is more than enough for 2x 3090s.
              I miswrote; it was a 3090 and 3080 that wouldn’t even boot. But I have used random PCIe cables that likely came from different sources in many of my computers. I honestly didn’t think that made much of a difference at first. No LED warnings.

              Originally posted by Muhammed_Hamed View Post
              But don't use your own splitter cables after the native power supply cable; this is where you would get crashes and instability.
              I am/was running into issues with the EVGA 3090 that has 3x 8-pins. I updated the firmware on the card and changed drivers; it seemed better for a little while until a hard crash 2 days ago. Then I replaced the PSU with a new 1300W unit and ran separate native cables to each of the PCIe power connectors. I also spent extra time cleaning out the radiators and the case, then installed a newer Studio driver. Now it’s running much quieter, which is surprising, as I thought most of the noise was from the cards and not the PSU.

              No crashes as of yet, but each time I have had a hard/severe crash, it has been when I expanded a VRayMtl preview from the 3ds Max Compact Material Editor while running Vantage. I notice my card fans ramping up very heavily, and it’s a sign that the system will quickly go down. As long as I leave the compact V-Ray material editor alone and don’t maximize a preview, it stays stable. Even processing heavy scenes with Vantage appears to run stable.

              I’m keeping a close eye on this. There could also be an issue with the card itself, according to EVGA.




              • #8
                Originally posted by jujubee View Post
                But I have used random PCIe cables that likely came from different sources in many of my computers.
                It is not good practice and may lead to dead parts; the cables are not interchangeable between brands, or even between different models from the same brand.

                About the instability: stress test this card with Kombustor, FurMark, or Heaven for a few hours. You will know if there are issues; these workloads push the GPU harder than GPU rendering. Keep an eye on the VRAM and GPU chip temperatures.
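
                While the stress test runs, you can log the temperatures, memory use, and power draw with the nvidia-smi tool that ships with the driver, for example:

                nvidia-smi --query-gpu=timestamp,temperature.gpu,memory.used,power.draw,fan.speed --format=csv -l 5 > gpu_log.csv

                If the card is unstable, the crash point and the temperatures leading up to it will show up in that log.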

                Best,
                Muhammed



                • #9
                  I really had no idea. In theory, that makes complete sense. I've had so many cables and PSUs - at least 25 computers and I never encountered issues until this software ramped up my 3090. I must have donated over 100 extra PCIe cables last year and now I'm a little upset lol...
                  I did replace the 1500W PSU with a newer 1300W PSU and used the cables that came with it. I made sure that every connector was not using any splitter (all separate) and so far everything has been stable since then.
                  But to be fair, I'm too busy to run stress tests at the moment; I will try when I have free time.

                  I also notice that my GPU fans ramp up when I have a material preview enlarged/open and this is when it happened, as well as when more complex scenes are processing in Vantage in the background. My workaround so I can continue is to pause Vantage processing, and I also don't leave the material previews in Max open for too long now. Maybe that helps, maybe it doesn't.

                  I really appreciate all of your AWESOME help, btw - Vlado and Peter obviously picked an awesome person for the team. I thought Vlado might be one of the smartest people on the planet when I met him, but I'm thinking you're pretty knowledgeable too.



                  • #10
                    Originally posted by jujubee View Post
                    at least 25 computers and I never encountered issues until this software ramped up my 3090
                    Yeah, this happens more on the high-end GPUs - 3090s or 3080 Tis, especially something like the EVGA FTW, which comes with a higher TDP and higher out-of-the-box specs than most 3090s.
                    So it would push 2 of the 3 8-pin connectors to the limit.


                    Originally posted by jujubee View Post
                    I must have donated over 100 extra PCIe cables last year and now I'm a little upset lol
                    As long as they are not used on high-end GPUs, I think they might work fine, but to be safe, swapping should be avoided.
                    SATA cables, on the other hand, are a different story; swapping SATA cables = guaranteed dead parts.


                    Originally posted by jujubee View Post
                    I made sure that every connector was not using any splitter (all separate) and so far everything has been stable since then.
                    Perfect!

                    Originally posted by jujubee View Post
                    I also notice that my GPU fans ramp up when I have a material preview enlarged/open and this is when it happened
                    3ds Max material thumbnails are not rendered on the GPU, so this is not expected to happen. It is probably something different; I will keep an eye on that.
                    You can get full control over the fans in apps like Afterburner or Fan Control; you can even set the speed for each of the 3 fans separately if need be.
                    Or you can set a fan curve for all fans at the noise level you like, which should be enough. GPU fans can be very loud; maybe use 60% fan speed as the maximum value in your curve.

                    Thanks for the kind words, it means a lot
                    I used to take commissions building high-end rendering machines and servers before Covid, and it is still something that I love doing whenever I have time.

                    Best,
                    Muhammed



                    • #11
                      Originally posted by Muhammed_Hamed View Post
                      As long as they are not used on high-end GPUs, I think they might work fine...
                      I had been running two EVGA RTX 2080 FTW cards on the same motherboard without any issues, but then again I was using Octane. I do realize that these latest cards are beasts.

                      Originally posted by Muhammed_Hamed View Post
                      You can get full control over the fans in apps like Afterburner or Fan Control; you can even set the speed for each of the 3 fans separately if need be.
                      I completely reinstalled the OS and was trying to keep everything super clean, but I ended up installing EVGA Precision - mostly because it allowed me to update the card BIOS.

                      Originally posted by Muhammed_Hamed View Post
                      3ds Max material thumbnails are not rendered on the GPU, so this is not expected to happen.
                      I'm not quite sure why it's doing this - whether it's coming from V-Ray in Max or from Vantage running - but I notice my fans go from silent to whirring whenever I enlarge a material from the Compact Material Editor to take a closer look.
                      It actually makes me nervous, so I'm constantly opening them quickly and closing them, because that's when the crashes have mostly occurred.
                      I don't quite understand why, because when it's rendering in Vantage the fans are generally quieter, unless it's a massive scene.

