motherboards for multiple gpu setups

  • motherboards for multiple gpu setups

    So with all these new cards that are bigger than 2 slots, are mobos keeping up and increasing spacing between their slots? Something to accommodate 4 cards with 4 x16 slots.

  • #2
    To answer my own question: yes, it looks like they are adding spacing. For example, this one from Asus: https://rog.asus.com/us/motherboards...tion-10-model/



    • #3
      Originally posted by s_gru View Post
      so with all these new cards that are bigger than 2 slots, are mobos keeping up and increasing spacing between their slots? something to accommodate 4 cards with 4 x16 slots.
      Hey s_gru,

      Nothing has really changed about motherboards in recent years. Yes, GPUs are getting wider; most are over 3 slots now.
      The only noticeable difference is that on some Z690 motherboards the spacing between the first and second PCIe slots is now 3 slots instead of 2.

      That alone won't help much; for multiple GPUs you will need riser cables or water cooling. I will write a post about this on our blog soon.

      Here is my current setup with 4x 3090s, with enough spacing between them. Max temperature is 60 degrees, with nearly silent operation.

      You can easily find these mining racks on the web, I'm using x16 PCIe Gen 4 riser cables.

      If you have questions about this setup let me know as well

      Originally posted by s_gru View Post
      to answer my own question, yes it looks like they are adding spacing. for example this one from asus https://rog.asus.com/us/motherboards...tion-10-model/
      No, that is an older motherboard with 2-slot spacing between the PCIe slots. It could barely take 2x 3090s.

      Best,

      Muhammed
      Muhammed Hamed
      V-Ray GPU product specialist


      chaos.com



      • #4
        Would love to hear more about your setup Muhammed!



        • #5
          Hey yeoldewolfe

          Here is some information on my build:

          [Image: IMG_0804.jpg]

          Specs:

          CPU Threadripper 3990X 64C
          Memory 256 GB 3600 TeamGroup Xtream (running at 3200 for stability)
          CPU Cooler Noctua NH-U12S (with an extra fan)
          Motherboard Asus TRX40 Zenith Extreme 2 Alpha
          Video cards 4x 3090s
          Power Supply EVGA SuperNova 2000 Watts
          Cooling fans 3x 140 MM Noctua NF A14 Chromax + 6x 120 MM Noctua Redux
          Drives 2x Samsung 980 Pro 2 TB
          Generic Mining Rack

          Goals for this build:
          • Run 4 GPUs at optimal performance, which means avoiding thermal throttling.
          • Run all 4 GPUs with enough spacing between them, so the cards can breathe and run with minimum noise. Riser cables (PCIe extensions) are needed for that.
          • Keep noise reasonable for a work environment; GPU fans are very loud at 3500 RPM, which should be avoided.
          • No water cooling for this build, for multiple reasons, mainly time constraints and the fact that none of the equipment and components are available where I'm located now.

          Mounting rack:

          I'm using a custom mining rack; I spent 40 Euros on this one. You will find similar racks online on the cheap, like this one here.
          These are quite customizable and fairly easy to work with. A rack like this can hold up to 11 GPUs per machine, or more, provided you have enough PCIe slots on the motherboard.

          [Image: Screenshot_84.jpg]

          Riser Cables:

          I'm using x16 PCIe Gen 4 riser cables; they are shielded and good quality. The GPUs get the same bandwidth and performance as if mounted directly on the board. I used these riser cables: 2 of them are 20 cm and 2 are 30 cm. You will need longer ones if you use 8 GPUs, of course, maybe 50 cm.

          Motherboard:

          I'm using the Asus Zenith Extreme 2 Alpha TRX40; it is an excellent board with many useful features.
          It has 4 PCIe Gen 4 slots and runs the GPUs in this configuration: x16, x8, x8, x8.
          Keep in mind this is PCIe Gen 4, which is more than enough bandwidth for GPU rendering on any high-end GPUs.
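          As a rough sanity check on that bandwidth claim, here is a back-of-envelope sketch. PCIe Gen 4 signals at 16 GT/s per lane with 128b/130b encoding; these are theoretical maxima, and real-world throughput lands a bit lower.

```python
# Theoretical PCIe Gen 4 bandwidth per link width.
# 16 GT/s per lane with 128b/130b encoding -> usable bits/s per lane.
def pcie_gen4_gb_per_s(lanes: int) -> float:
    per_lane_bits = 16e9 * 128 / 130     # effective bits per second per lane
    return lanes * per_lane_bits / 8 / 1e9  # convert to GB/s

for lanes in (16, 8, 4):
    print(f"x{lanes}: ~{pcie_gen4_gb_per_s(lanes):.1f} GB/s")
# x16: ~31.5 GB/s, x8: ~15.8 GB/s, x4: ~7.9 GB/s
```

          So even the x8 slots on this board offer roughly 15.8 GB/s each, which is why the x16/x8/x8/x8 split is not a bottleneck for GPU rendering workloads.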

          In the BIOS, Above 4G Decoding needs to be enabled for the machine to boot with more than 2 GPUs.
          Resizable BAR is set to Auto (Enabled).
          PCIE_3 is set to x8 bandwidth manually; by default it runs at x4 and enables the second M.2 NVMe slot, so running at x8 I lose access to that specific M.2 slot.

          I have set memory to 3200 MHz with custom timings and voltages. This is common when all 8 slots are occupied on this platform; 3600 was not possible, but I got 3200 stable at CL17, and I'm very happy with that.

          Cooling the CPU:

          I went with the Noctua NH-U12S air cooler; it is by far the best out there for 3rd gen Threadripper CPUs. It covers the whole socket, and I added an extra exhaust fan to the cooler (only one fan comes stock, which will run just fine).
          I get 60 degrees max temperature at full load, which is great, and it's nearly silent, with plenty of room for overclocking.

          Airflow:

          It is important to note that an open rack like this one is different from a case in terms of airflow. It is easy to direct air in a case, meaning there is a path for air to flow; all you need is good fans.
          This is a lot harder on an open rack, where you cannot direct air easily. Airflow is important for many components: VRMs, GPUs, chipset, etc.
          I used 7 Noctua fans. One is directed at the CPU cooler; it takes care of the VRMs and makes sure the intake fan on the cooler shroud gets fresh air. Having an air cooler helps cool the VRMs around the CPU socket.
          I have 3 fans directed at the cards on one side of the rack, and 3 fans on the opposite side as exhaust. This makes a big difference; all GPUs get access to fresh air.

          GPU setup:

          I'm using 4x 3090 GPUs: 2 of them are Gigabyte Gaming OC, one is an MSI SuprimX, and one is an MSI GamingX Trio.
          I bought all the GPUs last year, hence they are all different flavors of 3090; they work very well in this configuration.
          I can say that the SuprimX is the best GPU of this bunch for many reasons: it pushes the hot air up through the side of the card, which is great! This GPU has the best cooling of them all, and its VRAM runs at the lowest temperature, around 70 degrees at full load. The backplate of the card is used for cooling the VRAM; it has thermal pads underneath that sit directly on the VRAM.
          The Gigabyte cards are the complete opposite: the VRAM cooling is horrible. It gets to 100 degrees easily; I don't think the backplate touches the thermal pads well.

          Another note is that the 2 MSI cards use 3x 8-pin connectors, while the 2 Gigabyte cards use 2x 8-pin, as they have a lower power limit of 350 Watts.
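          For anyone sizing a PSU for a similar build, here is a quick budget sketch. The 350 W Gigabyte power limit is stated above; the ~400 W figure for the MSI cards and the platform overhead are my assumptions for illustration, not measured numbers.

```python
# Rough power budget for a 4x 3090 build on a 2000 W PSU.
# The 350 W Gigabyte limit is from the post; the MSI (~400 W)
# and platform-overhead figures are assumptions.
components = {
    "2x Gigabyte 3090 @ 350 W": 2 * 350,
    "2x MSI 3090 @ ~400 W (assumed)": 2 * 400,
    "Threadripper 3990X (280 W TDP)": 280,
    "Board, RAM, drives, fans (~120 W assumed)": 120,
}
total_w = sum(components.values())
psu_w = 2000
print(f"Estimated peak draw: {total_w} W ({total_w / psu_w:.0%} of PSU)")
# Estimated peak draw: 1900 W (95% of PSU)
```

          Under these assumptions the build sits right at the edge of a 2000 W unit, which is why power limits on the cards matter so much here.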

          [Image: IMG_0617.jpg]

          Temperatures:

          The GPUs run at 60 degrees at full load, and the machine is near silent. I have AC in this room directed at the side of the rack, which helps a bit.
          The CPU runs at 65 degrees at full load, with fans at less than 1000 RPM. Plenty of room for overclocking.
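          If you want to keep an eye on temperatures on a rig like this, `nvidia-smi` can dump them as CSV, which is easy to parse. A small sketch below; the sample output is made up for illustration, not captured from this machine.

```python
import csv
import io

# On the machine itself the query would be:
#   nvidia-smi --query-gpu=index,temperature.gpu,power.draw,fan.speed \
#              --format=csv,noheader
# The sample below is illustrative, not from this build.
sample = """\
0, 60, 348.2 W, 55 %
1, 58, 351.0 W, 53 %
2, 61, 402.7 W, 57 %
3, 59, 398.1 W, 54 %
"""

rows = [[field.strip() for field in row]
        for row in csv.reader(io.StringIO(sample))]
hottest = max(rows, key=lambda r: int(r[1]))  # column 1 is temperature
print(f"Hottest GPU: index {hottest[0]} at {hottest[1]} C")
# Hottest GPU: index 2 at 61 C
```

          Running that query in a loop (or via `nvidia-smi -l`) is a simple way to confirm none of the cards creep up once they are boxed in by neighbors.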

          [Image: Screenshot_81.jpg]

          I hope this helps, let me know if you have questions

          Best,
          Muhammed
          Last edited by Muhammed_Hamed; 14-10-2022, 01:11 AM.



          • #6
            This needs to be stickied somewhere! Such a wealth of information and good advice. Thank you so much for putting this together.

            You've given me a lot to think about. I originally planned to get a closed-case system with 2 3090 Tis alongside the Threadripper 5965WX, and then get one of these: https://www.h3platform.com/product-detail/resources/23
            This seems to be much more feasible cost-wise.

            Are you using NVlink on them or no? I'm still hoping that the 4090 might be able to do the pooling through the PCIe, but time is running out.

            Thank you so much again!



            • #7
              nice thanks man!



              • #8
                Originally posted by yeoldewolfe View Post
                You've given me a lot to think about. Originally planned to get a closed-case system with 2 3090 Tis alongside the Threadripper 5965WX and then get one of these https://www.h3platform.com/product-detail/resources/23
                This seems to be much more feasible cost-wise.
                It looks fine, but other than the price, it will run the GPUs very loud: the cards will be stacked and the fans will run at high RPM.

                Originally posted by yeoldewolfe View Post
                Are you using NVlink on them or no? I'm still hoping that the 4090 might be able to do the pooling through the PCIe, but time is running out.
                4090s will not be capable of pooling GPU memory; Nvidia didn't mention anything about pooling through PCIe.
                You will need to stick to Ampere if you need NVLink.
                And yes, I do use NVLink myself, but not on this machine.

                Thank you for the kind words. I plan to write a blog post about running multiple GPUs, and could probably add that to the GPU docs at some point. We will see.

                Best,
                Muhammed



                • #9
                  Thank you for the answers. I think I'm going to stick with the 3090s. Just one final question, if I may: do you happen to know how much power you're drawing when using the 4 GPUs?

                  Thank you again, and can't wait for the blog post!



                  • #10
                    could you do a separate external power supply for these?



                    • #11
                      Originally posted by s_gru View Post
                      could you do a separate external power supply for these?
                      Yes, use a 24-pin splitter so that your motherboard communicates with both power supplies and turns them on/off together.
                      Something like this.

                      Without this splitter, you would have to switch the second power supply on/off manually.

                      Best,
                      Muhammed

