Small Render Setup: 2 or 3 workstations vs 1 big server

  • #1

    Looking at a new render setup and appreciate any advice. I came up with these scenarios (based on DIY component build cost and published Vray scores):
    Option | Configuration | Cost (ea.) | Vsamples (ea.) | Qty. | Cost (total) | Vsamples (total)
    ------ | ------------- | ---------- | -------------- | ---- | ------------ | ----------------
    A | Server: dual AMD Epyc 9755, 128-core, 1.124TB RAM | $35,000 | 354,400 | 1 | $35,000 | 354,400
    B | Workstation: AMD Threadripper Pro 7995WX, 96-core, 256GB RAM | $17,000 | 188,200 | 2 | $34,000 | 376,400
    C | Workstation: AMD Threadripper 7980X, 64-core, 256GB RAM | $11,000 | 130,200 | 3 | $33,000 | 390,600

    Here's how I see it:

    Option A
    • Most power efficient / lowest power bill over time
    • Least administration (just one node)
    • Highest scaling density (3X more power in a rack)
    • Highest up-front cost
    • Lowest aggregate performance
    • No redundancy (one node goes down and I'm incapacitated)
    • Dedicated system / not practical for swinging to workstation use

    Option B
    • Falls somewhere between A and C, but closer to C, just costlier

    Option C
    • Least up-front cost
    • Highest aggregate performance
    • Most redundancy (three nodes, so can limp by on two if one goes down)
    • Easier self-support (can transplant parts from working system to troubleshoot hardware issues)
    • Flexibility of potential to be used as workstations part-time
    • Less power efficient / higher power bill over time
    • More administration (multiple nodes vs one big node)
    • Low scaling density (takes 3X physical space to achieve same performance as Option A)
    Did I miss something? I'm leaning towards Option B or C because of the redundancy and flexibility, as well as the fact that I can add performance more granularly. For my uses, I'd actually be more comfortable doing a double order of one of these options to get the render times I'd like.

    PS: I know the RAM isn't equal - I could go to 512GB RAM in the workstations for ~$1600 more each and I probably would. 256 is enough for what I've been doing, but nice to have some cushion. The larger amount in the Epyc server is just to fully populate all the channels for performance and seems like the thing to do with so many cores.
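    For anyone weighing similar options, the table's numbers boil down to a quick price/performance check. A minimal sketch using only the figures quoted above (Vsamples are V-Ray benchmark samples, so higher is better):

    ```python
    # Price/performance comparison from the build costs and published
    # V-Ray benchmark scores quoted in the table above.
    options = {
        "A (1x dual Epyc 9755)": {"cost": 35000, "vsamples": 354400},
        "B (2x TR Pro 7995WX)":  {"cost": 34000, "vsamples": 376400},
        "C (3x TR 7980X)":       {"cost": 33000, "vsamples": 390600},
    }

    for name, o in options.items():
        # Dollars per 1,000 benchmark samples: lower is better value.
        dollars_per_ksample = o["cost"] / (o["vsamples"] / 1000)
        print(f"{name}: ${dollars_per_ksample:.2f} per 1,000 Vsamples")
    ```

    By this metric Option C is the cheapest per unit of render throughput and Option A the most expensive, which matches the aggregate-performance bullets above.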



  • #2
    I would go with Option B or C, preferably Option B.
    There are many 7995WX CPUs out there, so it will be easy to find solutions for any potential issue on the software or hardware side. Plus, all components for this setup are consumer components.
    With a dual-socket system you will most likely run into issues that are not easy to solve, considering how rare that setup is. We don't have anything like it here in the Sofia office, while we have multiple Threadripper workstations. Also, most of its components are server grade.

    Best,
    Muhammed
    Muhammed Hamed
    V-Ray GPU product specialist


    chaos.com



    • #3
      Thanks Muhammed. I think I am going that direction.

      My biggest decision now is how much memory per node. The scenes I've been working on have maxed out around 128GB memory usage. That said, I'm heading into some 4K film projects that will be more ambitious and I just don't know how much more memory those scenes are going to need. I'll definitely go with at least 256GB per node, but am considering 512GB - I just don't know if that'll end up being a waste.

      I know that's very vague, but I wonder if anyone knows any comparables out there (i.e. what big studios consider enough for their demanding scenes) or anecdotes to help guide me on that...



      • #4
        I would say 256 GB will suffice (it needs to be set up in an eight-channel configuration, so at least 8 memory sticks per machine, otherwise you will lose a lot of performance). If you run into memory limitations, there should be room for scene optimization.

        For reference, Accenture rendered this sequence from Avatar: The Last Airbender using nodes with 128 GB and 256 GB of system memory.
        They used the 32-core Threadripper 3970X and V-Ray for Houdini, at around 30 minutes per frame in 4K. The city shot took around 200 GB of system memory.

        It would be interesting to see anything that takes more than 256 GB of system memory

        Best,
        Muhammed



        • #5
          I agree with what Muhammed said. Our main workstations are 64-core Threadrippers with 256 GB of RAM, and in 99% of cases RAM usage never really goes above 200 GB. If it does, you have to ask yourself a different question, like "what did I do wrong here?" Unless you are rendering a truly epic asset, which is rarely the case. After upgrading our farm so that every machine has a minimum of 128 GB and half the farm has 256 GB, most if not all of our projects never run out of RAM.

          About Epyc CPUs: I personally know one person who bought a system like that, and it turned out that Houdini, for example, had an internal core limitation of 380 threads (don't quote me on the actual number), which was below what dual Epycs could provide. There were many other stability issues with that system as well.

          I have a third-gen 64-core Threadripper and my colleague has a 64-core Threadripper Pro, and after working on them for 5 years they are still performing really, really well, to the point where I don't see the need to drop another $20k on a new system. The only thing I have upgraded is the RAM. Be aware that Threadripper needs higher-clocked RAM for best performance; I have 8x 32 GB sticks that run at 3200 MHz natively without overclocking, and if you can get RAM that runs at 4000 MHz, even better. I have also upgraded the video card about 3 times in the last 5 years. I would suggest not buying Quadros or other high-end cards, as they cost 3x more while offering very little back compared to GeForce. The only beef we have with the 4090 is that it only has 24 GB of VRAM; we often run out of that during Houdini sims, and we wish they offered a 48 GB version!
          Dmitry Vinnik
          Silhouette Images Inc.
          ShowReel:
          https://www.youtube.com/watch?v=qxSJlvSwAhA
          https://www.linkedin.com/in/dmitry-v...-identity-name



          • #6
            Also, on the other things you mentioned: redundancy is an interesting topic. What Epyc systems are designed to do is run virtual machines; that's their sole purpose. They are meant to sit in a rack running 10-20 VMs, in which case it's no problem if one VM breaks. But if you are looking for budget redundancy, don't rule out buying a smaller, less powerful but still capable system. I have an AMD Ryzen 9 system with 128 GB of RAM; it cost me about $6,000, and it's a good farm contributor but can also be jumped on if anything else fails.

            As for lower-end CPUs in the main workstation, though, my mindset is that your primary work tool has to be the best of the best, which lets you do the work as fast as possible. User time is much more expensive and much more valuable than machine time. If a fast machine lets me get the job done 3x faster, I can do more work, or work less for the same pay.

            We don't really care about power costs; they are what they are, part of doing business. If you are dropping $35k on workstations, the small savings on power are minimal compared to the savings in user time (as I said above).
            Dmitry Vinnik
            Silhouette Images Inc.



            • #7

              The only beef we have with 4090 is that they only have 24GB of Vram, and we often run out of that during houdini sims and we wish they offered some that were 48GB!
              Morbid Angel: it will most likely be 32 GB with the 5090, judging from the recent leaks. It will be great to have 32 GB now; we have been stuck at 24 GB forever.

              Best,
              Muhammed



              • #8
                Muhammed, that's great to hear. We tend to run sims in Houdini on the GPU and easily run out of VRAM. I'm afraid that even 32 GB, although a welcome increase, is not enough for us; then again, I don't know if even 48 GB would be. I don't know how VRAM usage compares to regular RAM when things like fire/smoke sims are running on it.
                Dmitry Vinnik
                Silhouette Images Inc.



                • #9
                  Thanks to all for your responses. It seems multiple Threadripper Pro 64- or 96-core workstations with 256GB of memory each is the likely path for me (perhaps a mixture of core counts, as some applications fare worse on >64-core CPUs). They should be more stable and better supported than Epyc, and they can swing over to workstation use during the daytime if needed. Multiple systems do add some administration overhead (though that's much the same as soon as you go beyond one node), but they are also more redundant when a single node goes down.

                  I've been trying to get on board GPU rendering instead of CPU for production, but every time I foray into that it seems I hit a snag or major drawback. Everything I do is cinematic/photoreal, so I'm not confident yet that GPU will do what we need.



                  • #10
                    I just bought two new 4th-gen Threadripper 7890x 64-core systems. I have to say that just running the V-Ray benchmark on this vs. the 64-core Threadripper 3990X yields almost double the performance, from 63,000 samples to 120,000 samples. In a real-world scenario the machine renders a frame in 12 minutes while the 3990X takes 20 minutes. And this is not even the 96-core, which is just so impressive!
                    Dmitry Vinnik
                    Silhouette Images Inc.



                    • #11
                      Originally posted by Morbid Angel View Post
                      I just bought two new 4th-gen Threadripper 7890x 64-core systems. I have to say that just running the V-Ray benchmark on this vs. the 64-core Threadripper 3990X yields almost double the performance, from 63,000 samples to 120,000 samples. In a real-world scenario the machine renders a frame in 12 minutes while the 3990X takes 20 minutes. And this is not even the 96-core, which is just so impressive!
                      You mean 7980X, right? Thanks for this info; it's exactly relevant to me, as I am also upgrading from a 3995WX, which scores about 60K V-Ray marks.



                      • #12
                        Sorry it was a typo, yes that's the one!
                        Dmitry Vinnik
                        Silhouette Images Inc.



                        • #13
                          I've been running a 7995WX with 512GB since it came out. I almost never use all that RAM; I have exceeded 256GB, but very rarely. The machine is fantastic for V-Ray.

                          We render mainly with custom-built in-house 7950X and 9950X nodes. They do take up more space, but everything is off the shelf, and they are cheaper overall, even with the extra V-Ray licenses. When one is having issues it's not missed much, since it is such a small portion of the overall farm; not so with fewer, server-level nodes. They do use more power than the Epycs, but again, they're still far cheaper over their lifetime. Not as good for the planet, I suppose. These nodes also have much higher per-core performance than Epyc, which makes them good for simulations and for things like Nuke, which is very old, poorly threaded code.



                          • #14
                            Originally posted by Joelaff View Post
                            Been running 7995wx with 512GB since it came out. I almost never use all that RAM. I have exceeded 256GB, but very rarely. The machine is fantastic for VRay.
                            Thanks for that info. I commonly have scenes using 200+GB, but to my knowledge I've never gone over 256. That said, I might go 512 just so I don't have to worry about it if I get a heavier scene going.

                            I'm trying to decide between going 64-core or 96-core (or a mix). By chance, do you have a V-Ray score with the latest version on the 7995WX system? Do you use Windows or Linux for renders? Any problems with some applications not liking >64 cores in Windows?



                            • #15
                              I haven’t run the benchmarks recently. Some apps will only use 64 threads (which applies to the 64-core/128-thread machines as well).

                              Simulations vary quite a bit; some perform better with fewer, faster threads. It depends on the software and the particular use. I have seen some sims run faster on 7950X machines; it really just depends on the scene.

                              Where the 7995wx really shines is in render feedback for IPR, and hard core rendering. It’s great for lighting and shading.

                              Running Windows, since Max is Windows-only. I do some rendering with Standalone on Linux as well, which I much prefer for a host of reasons; I wish I could use Linux for everything. Standalone is a pain in the butt with Max, though, compared to simply rendering in Max.

                              You’ll need a beefy UPS, especially if you want to run PBO. With PBO, the 7995WX draws close to 1200W from the wall when rendering.
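                              To put a rough number on UPS sizing for a node like that, here's a back-of-the-envelope sketch. The 0.9 power factor and 80% load ceiling are my assumptions, not figures from this thread; check your UPS vendor's sizing guidance:

                              ```python
                              # Rough UPS sizing for one node drawing ~1200 W at the wall under PBO.
                              # Assumptions (not from the thread): 0.9 power factor typical of
                              # modern PSUs, and keeping the UPS at no more than 80% of rated load.
                              watts_at_wall = 1200
                              power_factor = 0.9        # assumed
                              max_load_fraction = 0.8   # assumed headroom

                              # VA rating = real watts / power factor, then add load headroom.
                              required_va = watts_at_wall / power_factor / max_load_fraction
                              print(f"Minimum UPS rating: ~{required_va:.0f} VA per node")
                              ```

                              So a single 7995WX render node at full tilt would want something in the ~1700 VA class under these assumptions, before counting monitors or other gear on the same UPS.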
