
Ghetto build


  • #16
    a few risers here to look at...

    https://www.moddiy.com/categories/Ca...7%7D-Adapters/



    • #17
      Originally posted by super gnu View Post
that open rig appears to have either PCIe 1x or PCIe 4x connectors on the risers.. you'd want at least PCIe 8x
Thanks Super Gnu - I wasn't really paying attention to that detail at this stage, but that's a very important point for me to remember, so thanks for that. Isn't it actually x16 I should be looking for?

I like the form and materials of that one, though - of course it wouldn't be too hard at all to build one myself. But I would rather have it going upwards, like a tower case, so that I could put it under a desk / table.

      Hmmmm..... maybe I will build myself one.....

      Jez

      ------------------------------------
      3DS Max 2023.3.4 | V-Ray 6.10.08 | Phoenix FD 4.40.00 | PD Player 64 1.0.7.32 | Forest Pack Pro 8.2.2 | RailClone 6.1.3
      Windows 11 Pro 22H2 | NVidia Drivers 535.98 (Game Drivers)

      Asus X299 Sage (Bios 4001), i9-7980xe, 128Gb, 1TB m.2 OS, 2 x NVidia RTX 3090 FE
      ---- Updated 06/09/23 -------



      • #18
yeah, 16x is ideal, but IIRC there is almost no detectable difference when running RT whether your cards are in 8x or 16x slots, and you can happily plug an 8x (or 1x, 4x) card into a 16x slot, just like you can plug a 16x card into an 8x slot.. it just hangs out of the back.. PCIe is good like that.



        • #19
Actually, just realised I can't go with any 1080ti's on the mobo, as even one would block a slot, so it'll have to be 6 x 1080ti and a single-slot gpu on the mobo...
          Jez



          • #20
            why not 7x gtx1080 all on risers? you have 4 on shorter risers and 3 on longer ones..

            you might have an issue with the ribbon cable of the riser obscuring the air intake on the side of the lower gpus, i dunno without seeing it all laid out.



            • #21
              Originally posted by JezUK View Post

              Hmmm, I'm not so sure about that (though I have seen a lot of videos since I posted last night and yes, people are using 2 PSU's so maybe it'd be the right thing to do, I do have another spare lying around ).....

              But my Corsair Link tells me my 4 x 1080ti's under full load pull something like 850w out of the wall and a little less into the system (circa 830w). From memory, when I just had 1 x 1080ti it was around 350w under full load. So I believe adding another 3 would take me around the 1350w mark.

I think it'd be under 1500w by a sufficient margin, but I could watch it carefully during first tests and assess it from there.

Either way, I'm committed to going for this, as my tests on my first ever GPU project last week showed that using RT Production my renders were taking just 6 minutes, versus 25 minutes when tested under Adv CPU.

              So I totally concur with biochemical_animation that "gpu rendering is the next wave of the future"
              consider a kill-a-watt meter...$20 from amazon to test wattage pull from the outlet...test on full gpu production test render to get a measured value, then add cards and psu's as you see fit
              https://www.amazon.com/dp/B00009MDBU...a-306572288073

btw, I totally love having the immediate feedback using active shade when creating materials and setting up lighting and camera effects. The production test renders at 1600x1200 take 2-5 minutes and 4k takes 30-45 minutes...a must-have for animations
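The wattage figures quoted above (one 1080ti ~350w at the wall, four ~850w, so seven ~1350w) fit a simple linear model. A quick sketch, where the baseline and per-card numbers are assumptions fitted to those Corsair Link readings rather than any official spec:

```python
# Rough wall-draw estimate fitted to the readings quoted in the thread:
# ~350 W with one 1080 Ti under full load, ~850 W with four.
# base_system_w and per_card_w are fitted assumptions, not measured values.
def estimated_wall_watts(n_cards: int, base_system_w: int = 200, per_card_w: int = 165) -> int:
    return base_system_w + n_cards * per_card_w

print(estimated_wall_watts(1))  # 365 W, near the ~350 W single-card reading
print(estimated_wall_watts(4))  # 860 W, near the ~850 W four-card reading
print(estimated_wall_watts(7))  # 1355 W, close to the ~1350 W estimate above
```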



              • #22
                Originally posted by JezUK View Post

I think you're probably right - I think PSUs tend to be at their most efficient at circa 50% of max load, so it'd make sense to have two PSUs.

Would one 1500w for the 7 x 1080ti's and one 750w for the MB and Proc do the trick?

I'm currently looking up frames for rigs (I don't have the facility to make my own) - here is the kind of rig I'm seeing (they all seem 'mining' related);

                http://www.ebay.co.uk/itm/Open-Air-M...3D322559966792


I run mine at 50-75% on each psu and they run fine...I'd say 75% is tops...plus if you use hybrid rendering, plan for the CPU's power consumption added to the GPUs'.
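The 50-75% loading guideline above can be sketched as a quick check. The load and capacity figures below are hypothetical, just illustrating the rule against the 1500w + 750w split proposed earlier:

```python
# Check whether a PSU sits in the 50-75% loading band suggested above.
def psu_load_fraction(load_w: float, capacity_w: float) -> float:
    return load_w / capacity_w

def psu_load_ok(load_w: float, capacity_w: float, lo: float = 0.50, hi: float = 0.75) -> bool:
    return lo <= psu_load_fraction(load_w, capacity_w) <= hi

# Hypothetical split: ~1100 W of GPUs on the 1500 W unit,
# ~450 W of MB + CPU on the 750 W unit.
print(psu_load_ok(1100, 1500))  # True (~73%, right at the top of the band)
print(psu_load_ok(450, 750))    # True (60%)
```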



                • #23
The thing about PCIe is the generation (1, 2, or 3) and the number of lanes (1x, 4x, 8x, or 16x).

google the speeds of these and read up on that...that is for data transfer. So when the gpu gets a scene, it loads it onto the gpu...that's the data speed...that depends on how big your scene is. Once it has the scene, it does the rest to render it...less data transfer at that point...

even a gen 1 1x lane is like 240 MB/sec

                  here is a good read...I had to read it a few times

                  http://www.tested.com/tech/457440-th...d-thunderbolt/
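For reference, the per-lane numbers behind that "generation and lanes" point can be computed from the line rate and encoding overhead (gen 1/2 use 8b/10b encoding, gen 3 uses 128b/130b). A minimal sketch:

```python
# Approximate one-direction PCIe bandwidth in MB/s from generation and lane count.
GT_PER_SEC = {1: 2.5, 2: 5.0, 3: 8.0}        # line rate, gigatransfers/s per lane
ENCODING   = {1: 8/10, 2: 8/10, 3: 128/130}  # usable fraction after encoding

def pcie_mb_per_sec(gen: int, lanes: int) -> float:
    gbit = GT_PER_SEC[gen] * ENCODING[gen] * lanes  # effective Gbit/s
    return gbit * 1000 / 8                          # -> MB/s

print(pcie_mb_per_sec(1, 1))   # 250.0 MB/s - the "like 240 MB/sec" figure above
print(pcie_mb_per_sec(3, 16))  # ~15754 MB/s for a gen 3 16x slot
```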



                  • #24
                    Originally posted by super gnu View Post
                    why not 7x gtx1080 all on risers? you have 4 on shorter risers and 3 on longer ones..

                    you might have an issue with the ribbon cable of the riser obscuring the air intake on the side of the lower gpus, i dunno without seeing it all laid out.
I think because, by the time my brain had gone around a few times, I was thinking about balancing the PSUs - not putting all 7 GPU cards on one PSU with just the CPU and peripherals on the 2nd PSU, but splitting 5+2, or 6+1, etc.


                    Jez



                    • #25
                      you can still have the gpus powered by the motherboard psu even if they are on risers, and vice versa..



                      • #26
                        Originally posted by super gnu View Post
                        you can still have the gpus powered by the motherboard psu even if they are on risers, and vice versa..
Thanks gnu, I'm going to do that. In fact, I'm designing my frame as we 'speak' - I've decided to put it together myself and just start by moving the 4 cards I currently have off and away from the mobo using risers (I'll continue running off my existing single 1500w psu, as I know that all currently works safely).

                        At least then there'll be some nice space and better airflow between my gpus.

                        Then as my knowledge and experience kicks in, I can add more GPUs and the extra PSU when needed.

                        Risers should be coming tomorrow - I've ordered 4 of these..... https://www.overclockers.co.uk/kolin...cb-006-kk.html

There are aluminium tubing companies online where you can order tubes and have them cut to size - the tube connectors are provided as well - my design is going to be based on something like this;
                        Last edited by JezUK; 06-07-2017, 06:02 AM.
                        Jez



                        • #27
                          Originally posted by biochemical_animations View Post
                          consider a kill-a-watt meter...$20 from amazon to test wattage pull from the outlet...test on full gpu production test render to get a measured value, then add cards and psu's as you see fit
                          https://www.amazon.com/dp/B00009MDBU...a-306572288073
                          Thanks Bio,

                          I will get one of those - though I do also have Corsair Link Command hardware which reports the power in and out from my Corsair PSU. But I think you are right, for the small outlay, it'd be best to buy that little bit of kit to confirm what the software is claiming.....

                          Originally posted by biochemical_animations View Post
btw, I totally love having the immediate feedback using active shade when creating materials and setting up lighting and camera effects. The production test renders at 1600x1200 take 2-5 minutes and 4k takes 30-45 minutes...a must-have for animations

Yes, I couldn't agree with you more, I love that too, and it's hard for me now to contemplate going back to CPU rendering (the last project I did for a client was the first time I did GPU rendering in production, and renders were taking 3-6 minutes. The same renders, tested once the project was completed using my 5960x, took around 20-25 mins each...)
                          Last edited by JezUK; 06-07-2017, 06:08 AM.
                          Jez



                          • #28
                            Originally posted by biochemical_animations View Post
                            So, I've had a bit of fun the last few weeks on my new ws build that I thought I'd share.

Power was a bit of a challenge. The key, I believe from what I read and tested, is to isolate the power of the graphics cards not plugged into the MB. I have two cards in the MB running off a main evga g2 1000w psu powering the MB and CPUs. I went with powered pcie riser ribbons 8x-16x that are also powered by the main g2 psu. An extra evga g2 1000w psu powers 3 cards, and a 750w psu powers 2 more cards.

The extra psu's are always on via the tester tool (basically just a jumper), then I turn on the main psu to power up the system. I got psus that have an eco mode smart fan, so the fans only turn on when they need to. Measured via a kill-a-watt meter, the extra psu's pull only 15 watts when they're on and the system is off...so I leave them on all the time.
                            Hi Bio,

                            Please, I have a question.

Have I understood your powering up procedure correctly - you power on the 5 GPUs via their two PSU's FIRST (they are connected via the Kill-a-Watt meter), and once that is done, you then power the computer (main MB and other 2 GPU's) via the on button of the computer (and you do that last)?

                            I am trying to understand the startup procedures when using more than one PSU.

When you say "just a jumper", are you saying there is a button on the kill-a-watt meter that acts as your on/off button for those 2 PSU's?

                            Many thanks.

                            Jez



                            • #29
                              Originally posted by JezUK View Post

                              Hi Bio,

                              Please, I have a question.

Have I understood your powering up procedure correctly - you power on the 5 GPUs via their two PSU's FIRST (they are connected via the Kill-a-Watt meter), and once that is done, you then power the computer (main MB and other 2 GPU's) via the on button of the computer (and you do that last)?

                              I am trying to understand the startup procedures when using more than one PSU.

When you say "just a jumper", are you saying there is a button on the kill-a-watt meter that acts as your on/off button for those 2 PSU's?

                              Many thanks.
                              Hey Jez.

Yes, I power the 5 GPUs via their psu's first, then power the main mb psu (and other 2 gpu's) last to power everything up. All gpu's need to be powered before the mb bios turns on so they can all be detected by the bios. If you do it any differently, the bios might turn on first and not see some gpus, and those gpus will not be detected by the bios and/or windows.

My power supplies came with a free self-test tool that allows one to test the psu before connecting it to the computer, to see if it works or is dead.

                              https://linustechtips.com/main/topic...-power-supply/

...it goes on the mb connector and is simply a jumper wire connecting a couple of prongs on the connector, telling the psu to turn on and supply power. Once the self-test tool is connected, you switch the psu on, and the psu turns on (supplying power to the rails / cables). I leave my tester tool plugged into my psu all the time and leave my psu's switched on all the time. That way I don't have to switch on the 2 psus that power the 5 gpus each time I turn the computer on.

                              The gpus don't pull power when the mb psu is turned off...basically everything is off, just like the main psu, when the computer is off.

The kill-a-watt meter is currently not plugged into anything. I only plug it in when I need to take a wattage/amperage reading, then I leave it unplugged. I only used the meter briefly to see how much wattage the two psu's that power the 5 gpus were using when the computer was off and the psu's were switched on with the self-test tool plugged in...it was like 75 watts or something really low, so I just leave those psu's on all the time with the self-test tool plugged into each.
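For what leaving those extra PSUs switched on adds up to, the standby draw converts to annual energy use directly. A quick sketch (75 W is the rough kill-a-watt figure mentioned above; 15 W is the other reading quoted earlier in the thread):

```python
# Convert a constant standby draw in watts to energy per year in kWh.
def standby_kwh_per_year(watts: float) -> float:
    return watts * 24 * 365 / 1000

print(standby_kwh_per_year(75))  # 657.0 kWh/year at the ~75 W reading
print(standby_kwh_per_year(15))  # 131.4 kWh/year at the ~15 W reading
```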

                              Let me know if you still have any questions...hopefully this helped.
                              Last edited by biochemical_animations; 10-07-2017, 02:14 PM. Reason: typo



                              • #30
                                Originally posted by JezUK View Post
                                I am trying to understand the startup procedures when using more than one PSU.
I had an "oh crap" moment when initially plugging things in...I should probably explain a bit more. My risers are powered, and I think yours are too, via a molex or sata power connector.

I first powered my risers using the 2 psus that power my 5 gpus, and when I went to turn on the computer, these 2 psus turned off...like something went wrong.

                                Luckily, my psu's had an overload protection and shut off before something got fried.

After they cooled down for a while, I powered the risers with the psu that powers the mb, and this all worked. So all my risers are powered by the psu that powers the mb. Also, the 2 gpus that are plugged directly into the mb (not via riser) are powered by the mb psu.

The 5 gpus are powered by the 2 extra psu's...but not their risers. No riser is powered by these 2 psu's. All risers are powered by the mb psu.

This was my experience, which I could not find anything on the web about...hopefully it helps.
