homemade small RENDER FARM for DR - what is the best solution?

  • homemade small RENDER FARM for DR - what is the best solution?

    Hi guys,

    I am about to buy a new computer: an i7-3930K with Windows 7 (more details are not that relevant here, I guess). On top of that, I want to set up a cheap, small homemade render farm. I expect roughly 90% of the use to be distributed rendering (still images) and 10% Backburner (animations).

    My budget is about 3000 euro.

    So... any tips on the best hardware (motherboard, RAM, etc.)? I thought about two or three i7-3930K machines, each with a cheap hard drive, a motherboard, and decent DDR3 RAM (32 GB).

    Anyway, I would really appreciate your advice if you have any experience with this kind of setup!
    Thanks!
    i-9 7980XE at stock, G. Skill RipjawsV 64GB RED, MSI GeForce GTX 1080 Ti GAMING X 11GB, http://fractalmind.eu

  • #2
    Heya,

    You can get 3x i7-3930K. I got myself this case http://4.bp.blogspot.com/__x_DcUJmEV.../OPERA-350.jpg with an ASRock Extreme4-M motherboard and 32 GB of RAM. I also got a tiny 400-450 W PSU with it (not ATX format... some small-form-factor one). On top of that an H80 cooler, and it is all nicely wrapped up.

    You can stack them on top of each other, and they are nicely self-contained.

    The only modding I had to do was drilling out all the stuff on the outside of the case so that I could put my parts where I wanted them.

    The rig cost me around £850 six months ago.

    You should be able to build three of them and stack them on top of each other.
    CGI - Freelancer - Available for work

    www.dariuszmakowski.com - come and look

    • #3
      I would go with much more powerful power supplies. In constant use a power supply can easily lose 30% of its output after one year. I wouldn't use less than 750 W. We have burned out 550 W power supplies in our render farm; nothing over 650 W has burned out. Spring for 80 Plus Gold efficiency if you really plan on using your farm. This will save you on electricity later (I never did the math to see if it would actually save you money, but it will reduce your carbon footprint, assuming it doesn't take more energy to MAKE the more efficient supplies). Note these are all ATX power supplies. We use 4U servers with ATX power supplies because they are easy to replace, easy to find, and last longer than smaller PSUs in smaller cases. Yes, this takes up more rack space, but we have the physical space.
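
      To put rough numbers on the headroom idea, here is a sketch in Python; the wattages and the 50% margin are illustrative assumptions (only the 30% yearly derating comes from our experience above):

          # PSU sizing sketch for one overclocked render node (illustrative numbers).
          cpu_w = 200    # assumed draw of an overclocked i7-3930K (130 W TDP at stock)
          gpu_w = 30     # basic display card
          other_w = 60   # board, RAM, drives, fans

          load_w = cpu_w + gpu_w + other_w   # ~290 W steady rendering load

          def after_one_year(rated_w):
              """Assume ~30% of rated capacity is lost after a year of constant use."""
              return rated_w * 0.70

          for rated in (450, 550, 650, 750):
              ok = after_one_year(rated) >= load_w * 1.5   # keep ~50% headroom
              print(f"{rated} W PSU -> ~{after_one_year(rated):.0f} W after a year, enough headroom: {ok}")

      With these numbers only the 650 W and 750 W units keep enough margin after a year, which matches what we have seen fail.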

      We have been using the Sabertooth boards from Asus. They have been reliable (as have the P6T Deluxe boards). These machines have seen a lot of use. They are overclocked and they run hot. In other words, they are abused. The only things that ever failed were power supplies (550 W Thermaltakes, to be precise; we burned three or four of those in a year).

      Cooling is critical, and it will also be one of your biggest expenses power-wise. If this is a home setup I would consider putting the machines in the basement, but ventilation is important. They will generate a lot of heat when rendering.
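
      Practically all of the electrical draw ends up as heat in the room, and 1 W is about 3.412 BTU/h, so you can size the ventilation from the wall draw. A quick sketch (the per-node draw is an assumption):

          # Heat output sketch: nearly all power drawn becomes heat in the room.
          nodes = 3
          watts_per_node = 300             # assumed wall draw per node while rendering
          total_w = nodes * watts_per_node
          btu_per_hour = total_w * 3.412   # 1 W = 3.412 BTU/h
          print(f"{total_w} W -> about {btu_per_hour:.0f} BTU/h of heat to remove")
          # ~900 W is roughly a small space heater running flat out.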

      I would use Windows 8, not 7. Why? I hate 8 as much as most people do, but it will be supported longer. Microsoft learned their lesson about not making a profit with XP: they supported it WAY too long, and I predict they will start retiring OSes much faster. Why take the time to install and set up an OS that is already obsolete? Autodesk just broke XP64 for 2014 SP1.

      The newer OS will last much longer, and with render nodes you want to keep them as long as you can (until their power usage no longer justifies using them as chips become more efficient). We had to upgrade some XP64 machines to work with 2014 SP1. They were many years old, but still powerful (these were former workstations that had migrated into the farm). The point is that the hardware may still be in your farm by the time the OS is no longer supported by Microsoft or Autodesk, so it is definitely best to go with the latest OS now. Once you get past the "Why the hell does it work like that!?" and "Where the hell is feature X?!?" moments, Windows 8 is actually pretty nice. And 8.1 is coming soon as a free upgrade to at least bring a Start button back (not sure if there will be a Start MENU, which is what is really needed).

      If possible, make all the machines the same. A homogeneous farm is very helpful when things break; it makes it easy to diagnose problems by swapping parts out.

      We keep a spare power supply on hand at all times, and I would suggest you do the same. These are the things that fail...

      Consider Western Digital Green drives for their lower power usage. With three machines it may not be a big deal, but once you have 10, 20, or 100 nodes it becomes one.
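
      The arithmetic is easy to check (a sketch; the per-drive saving and the electricity price are assumptions):

          # Power-saving sketch for low-power drives across a farm.
          saving_w_per_drive = 4    # assumed saving vs a typical 7200 rpm drive
          eur_per_kwh = 0.20        # assumed electricity price
          hours_per_year = 24 * 365

          for nodes in (3, 10, 20, 100):
              kwh = nodes * saving_w_per_drive * hours_per_year / 1000
              print(f"{nodes:>3} nodes: {kwh:.0f} kWh/year, about {kwh * eur_per_kwh:.0f} EUR")

      At three nodes that is pocket change, but at 100 nodes it is hundreds of euros a year from the drives alone.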

      Good luck!

      • #4
        Hi Guys,

        @Joelaff
        Great to hear tips from somebody with so much experience!
        Actually, I was pretty set on Windows 7, as that is what I am running now on my two i7-2600K machines, and I want them to be part of my mini-farm soon along with three i7-3930K nodes.
        I need to think it over before making a decision.

        @DADAL
        Thanks for tips!
        How many nodes do you use in your farm? Do you use it for DR?


        Here is my current set:

        CPU: Core i7-3930K
        MB: MSI X79A-GD45 (8D) Intel X79 LGA 2011
        RAM: Kingston HyperX 2x8GB DDR3-1600 Dual Channel Kit Non-ECC CL10 Red KHX16C10B1RK2/16X
        GPU: VGAMISNVD0196
        HDD: WD Green WD10EZRX 1TB SATA III 64MB
        COOLER: Noctua NH-D14 SE2011
        PSU: OCZ ModXStream Pro OCZ600MXSP-EU 600W

        What do you think?
        i-9 7980XE at stock, G. Skill RipjawsV 64GB RED, MSI GeForce GTX 1080 Ti GAMING X 11GB, http://fractalmind.eu

        • #5
          Get 32 GB of RAM - it costs as much as 16 GB in some cases.

          Get a 120 GB SSD - it will cost about as much as your 1 TB HDD.
          Don't get a VGA card; well, get one to install the OS, then unplug it.
          Cooler - maybe water? It's more convenient and easy...
          PSU is up to you.
          What case?

          The good thing about HTPC-format cases is that you can stack them on top of each other. I could send you some pics of my rig if you want.

          I have around 3 slaves... but they are of different grades and ages, so you know.

          I only need a slave for rendering, and that usually takes 2 hours a day, so it doesn't run 24/7...

          Thanks, bye.
          CGI - Freelancer - Available for work

          www.dariuszmakowski.com - come and look

          • #6
            Again, I would think carefully about deploying Windows 7, as it is already obsolete. MS will come along with DirectX 12 (or whatever the next thing is) only for Windows 8, and then Autodesk won't support 7... This may not be next year, but it will be sooner than you think.

            8 behaves pretty much like 7. You just have to figure out how to get to the desktop, which is pretty easy. Then you can do almost everything from there. I never use any of the app junk, or the charms, etc.

            BTW, we render on Vista, 7, and 8 (it was XP before), and everything cuts together fine. There are no differences in output.
            Last edited by Joelaff; 10-07-2013, 11:29 AM.

            • #7
              DADAL, thanks again.

              Your advice about the VGA card is just priceless, and the same goes for the HDD.

              Regarding the case: I have plenty of room at my office. Actually, I can use one empty room for my small farm, so there is no need to stack the nodes on top of each other. I mean, if the CPUs are going to be overclocked, then it is better to separate the nodes because of the heat. So I thought about some cheap midi-tower: Cyclone X-73-KKN1 Pure Black USB 3.0, SSD Ready.

              Cooler: maybe I am getting into the world of fantasy, but is it possible to use one super-powerful water-cooling system for all 3 nodes instead of three separate ones (to cut some costs)?
              i-9 7980XE at stock, G. Skill RipjawsV 64GB RED, MSI GeForce GTX 1080 Ti GAMING X 11GB, http://fractalmind.eu

              • #8
                Joelaff, thanks for the reply again!

                Do you use DR as well? If there is no problem rendering with some nodes on W7 and the others on W8... then I think I can go for Windows 8.
                i-9 7980XE at stock, G. Skill RipjawsV 64GB RED, MSI GeForce GTX 1080 Ti GAMING X 11GB, http://fractalmind.eu

                • #9
                  We use DR as well, but mainly with V-Ray. I have not seen any issues, though come to think of it, I am not sure any of the DR machines are on Windows 8.
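
                  (For reference, V-Ray for 3ds Max keeps its DR server list in a vray_dr.cfg file, which you normally edit through the V-Ray distributed rendering settings dialog rather than by hand. A minimal sketch of the server lines, assuming hypothetical node IPs and the default DR port 20204; the exact format can vary by V-Ray version, and each node must be running the V-Ray DR spawner:)

                      192.168.1.101 1 port=20204
                      192.168.1.102 1 port=20204
                      192.168.1.103 1 port=20204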

                  A warning about the single big water cooler: it is a great idea from an efficiency standpoint, and I have considered it many times... but you are setting yourself up for failure, because you have taken the redundancy of a render farm and introduced a single point of failure that leaves the whole farm dead in the water. The beauty of a farm is that if one or two machines start behaving flaky in the middle of a job, it's no big deal... but if they all rely on the same water cooler and that fails!!!

                  Bad news.

                  • #10
                    Yes, this is something that I also thought about with one water cooler... it can be a disaster when it fails.
                    i-9 7980XE at stock, G. Skill RipjawsV 64GB RED, MSI GeForce GTX 1080 Ti GAMING X 11GB, http://fractalmind.eu

                    • #11
                      You have one water cooler per machine. It's a closed loop, built by the manufacturer; you don't put water in or anything. It's faster to mount and replace than a fan-based cooler. And the liquid is not plain water but a special coolant that doesn't conduct electricity, so if there is a spill nothing will be damaged...

                      Thanks, bye.
                      CGI - Freelancer - Available for work

                      www.dariuszmakowski.com - come and look

                      • #12
                        One cooler per machine works fine, if you want to spend that much. It doesn't buy you anything in terms of server-room cooling, though, unless you route the water from each node to an outside radiator. It may let you overclock each node more (generating even more heat for the server room).

                        We have been using H100 coolers and the like, which are the factory-sealed "water" coolers. They work well. Ideal, I guess, would be for each machine to have its own loop that goes to an outside radiator. There are rear doors for server racks that have water systems in them; they capture a lot of the heat coming out of the back of the machines and route it to an outside radiator. Those are still pretty pricey, though.

                        Three machines are not too much of a problem, but once you get over 20, cooling will become a really big headache. I would love to go full immersion like a Cray!
