
Hardware Testing


  • Hardware Testing

    Hi All,

    I wanted to start a conversation about hardware. I know it's a broad subject, but I'm always curious how other users' systems are set up, and maybe we can talk about networking issues or ways to best develop networks of machines. Just anything about hardware for rendering.

    I'll start it off by talking about my system and network, and by saying that I'm not trained in any kind of IT and I'm sure I'm doing some things wrong. So, my work rig is an i7 6950x with 64gb of ram and dual 1080ti's, hooked up to a network of machines around the house (my wife loves it) over a few Netgear 10/100Mbps switches. The network comes to roughly 88 threads overall and is expanding.

    Now I'm looking at new hardware, and this is mainly what I'm interested in: I was wondering if anybody had any benchmarks for the i9 7900x so far. I searched the benchmark page and couldn't find anything. I see some EPYC results on the benchmark page, but I'm curious how the new i9 7900x holds up compared to the i7 6950x, and whether anybody who has one could run the benchmark.

    Anyway, share your systems and share your thoughts on hardware!


  • #2
    Hi Ted.

    It seems that you have a render farm in your house; it looks awesome! I don't think you need to upgrade to the i9 7900x: the core speed of the i9 7900x and the i7 6950x is almost the same unless you overclock it, in which case it could be 1.2x faster than the i7 6950x. If you're looking for a real upgrade, you could go to the Core i9-7980XE: 18 cores/36 threads. Cheers!
    Last edited by Mousa_SA; 11-07-2017, 01:04 PM.
    Regards,
    Mousa Abu Doush
    Architect | 3D Artist
    www.sketchuparchive.com



    • #3
      I completely agree! However, I'm actually building a computer for another team member rather than for myself.



      • #4
        Okay, I think you have made a good choice! The i9 7900x has a great price and great efficiency compared with the other processors.
        Regards,
        Mousa Abu Doush
        Architect | 3D Artist
        www.sketchuparchive.com



        • #5
          .... I'm not actually sure that's true. For various reasons, including the fact that the chipset was poorly designed (reports of VRMs overheating) AND the fact that the platform itself is confusing (quad channel? dual channel? yup and yup!), I would stay away from x299. Intel rushed to get this out to compete with Ryzen and Epyc, and it seems half baked. From a cost perspective, it doesn't make a ton of sense either, DEPENDING on what you're doing.

          The question is: how do you use the computers? Mostly CPU rendering? Other single-threaded tasks? GPU rendering? How many cards (probably enough pcie lanes no matter what, but worth considering)? It's important to take into account the holistic cost of the platform when configuring desktop computers (mobo, ram, 10gbe onboard or whether you need a card, etc.).

          Now, if you are configuring an actual render farm, then it's slightly different: with the licensing costs, you want to put as much compute power into a single rig as you can afford, so you aren't managing and licensing 24 nodes.
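          A quick back-of-the-envelope sketch of that tradeoff (the hardware and per-node license prices below are made-up placeholders, just to show the shape of the math):

          ```python
          # Rough farm cost comparison: many small nodes vs. a few dense ones.
          # All prices here are placeholder assumptions, not real quotes.
          LICENSE_PER_NODE = 350                   # assumed per-node license, USD
          SMALL = {"cost": 1200, "threads": 16}    # e.g. a budget desktop
          DENSE = {"cost": 7000, "threads": 112}   # e.g. a dual-socket server

          def farm_cost(node, target_threads):
              """Hardware plus licensing cost to reach a given thread count."""
              n_nodes = -(-target_threads // node["threads"])  # ceiling division
              return n_nodes, n_nodes * (node["cost"] + LICENSE_PER_NODE)

          for label, node in (("small nodes", SMALL), ("dense nodes", DENSE)):
              n, total = farm_cost(node, 224)
              print(f"{label}: {n} boxes, ${total:,} all-in")
          ```

          Fewer, denser boxes come out cheaper all-in once every box carries its own license (and its own management overhead).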

          btw, check out some benchmarks here



          • #6
            Good point about the build cost being high even if the CPU itself is competitive. For a CPU rendering node, dual-CPU server setups can be effective once you start counting software & licensing costs. I've been slowly acquiring Boxx RenderPros for my local farm (a big end-of-year expense for a good tax write-off). Unfortunately, I've not been able to run Swarm on them for whatever reason, and at some point I'll see if the latest updates fix my issues.



            • #7
              Yeah, the x299 mobos have been my biggest concern here. I'm looking at the x299 platform strictly from a workstation-upgrade point of view. Since it's at the beginning of its life, the chance that I can upgrade later is good, and I need a well-rounded system with the most expandability and ram possible, hence the i9 7900x. That said, I'm concerned about the gen 1 motherboards since things were rushed, so I'd hoped somebody had personal experience with the platform. This particular build may happen in the next couple of weeks, so I can't really wait on Threadripper, though that may be the next set of render-farm machines I'll be investing in.



              • #8
                To follow up on this, we have a pretty good collection of equipment in the studio:

                - Main render farm: dual Xeon 2690 v4 (14 physical cores each, running 2.6ghz) with 128gb ram (and a 1050ti for display and denoising). 3877 Cinebench, ~30s V-Ray bench
                - Main workstations: 6700k (4 physical cores around 4ghz) with 32gb ram and a GTX 970ti. ~750 Cinebench, 2m10s V-Ray bench

                At home I have a Ryzen 1800x that comes in around 1500 Cinebench and a 1m15s V-Ray bench.

                At roughly half the price of Intel's current offering, I just don't think there's a reason not to get an AMD setup right now for workstations (or small/cheap render farms). They are just so good, and they offer enough pcie lanes for everything you need. They won't offer anything XPoint/Optane related, but honestly, I just don't know what the point of that is outside server environments (as of right now).

                I've always been more tech- and operations-focused, so here are some things I've learned that may help people just getting started building rigs or farms:
                1. Keep it homogeneous. Unless you want to set up Deadline to do path conversion or are really good with Linux, keep everything on the same OS.
                2. Separate rendering from file storage from license servers. I have a dumb computer that acts as one fork of our local backup and also takes care of any license duty (and our MongoDB for Deadline). We have our computers rackmounted and our NAS loaded up with SSDs for storage. Basically, everything is a uni-tasker, built to do one specific job very fast.
                3. If you're just now starting out, invest in 10gbe. Asus and others are starting to release those Aquantia-based cards for around 100 bucks, switches are coming, and anything cat5e on up works for 10gbe (I believe; I'll have to check the spec. I know cat5e doesn't work over greater distances like cat6, but I believe it does work to some extent).
                4. Figure out what and how you're rendering; this applies to all render engines. GPU? Do your materials work in a GPU environment? Do your scenes fit in vram (see the sketch after this list)? If not, you're rendering on CPU. Some engines like CUDA, but for absolute compute power, Vega is and probably will be hard to beat (maybe Volta?). Your engine might do OK using OpenCL and running an AMD card.
                5. Don't overlook offloading to Rebus and others. Their plugin works well in SU, and vrscene files are mostly universal at this point, meaning you could render on almost any farm with a few workarounds.
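                On point 4, here's a minimal sketch of that vram sanity check; it assumes nvidia-smi is on the PATH, and the 9 GB scene footprint is just an example figure you'd estimate yourself:

                ```python
                import subprocess

                def free_vram_mib():
                    """Free VRAM per GPU, queried via nvidia-smi."""
                    out = subprocess.check_output(
                        ["nvidia-smi", "--query-gpu=memory.free",
                         "--format=csv,noheader,nounits"],
                        text=True,
                    )
                    return [int(v) for v in out.split()]

                SCENE_MIB = 9 * 1024  # example estimate of the scene's footprint

                for i, free in enumerate(free_vram_mib()):
                    fits = free >= SCENE_MIB
                    print(f"GPU {i}: {free} MiB free ->", "fits" if fits else "CPU fallback")
                ```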



                • #9
                  Originally posted by ted.vitale.jdk:
                  Yeah, the x299 mobos have been my biggest concern here. I'm looking at the x299 platform strictly from a workstation-upgrade point of view. Since it's at the beginning of its life, the chance that I can upgrade later is good, and I need a well-rounded system with the most expandability and ram possible, hence the i9 7900x. That said, I'm concerned about the gen 1 motherboards since things were rushed, so I'd hoped somebody had personal experience with the platform. This particular build may happen in the next couple of weeks, so I can't really wait on Threadripper, though that may be the next set of render-farm machines I'll be investing in.
                  I'm not sure. I'm not super familiar with Intel's HEDT lineup, but from what I recall, those sockets and chipsets are one and done (i.e. no future releases). Also, remember this is near the end of the Kaby/Skylake cycle (I think Coffee Lake is later this year?), so I honestly don't know if there is any life left in this platform. But hey, that's OK. With a fast tick-tock cycle now, and a workstation rig lifespan of 24-36 months, I think you buy once, and then completely replace mobo/cpu/ram in 24-36 months (or replace the whole machine and add the old one to your growing farm!)



                  • #10
                    delineator, I was hoping you would add to this thread! You really brought up a lot of great points! As of right now, the specific machine I'm looking to build is essentially for the head of the company (finally got him away from Mac hardware). He's got a really solid budget and no farm to work with at the moment, so this would be the start of the future farm and a long-term workstation. The most telling portion of your response is "those sockets and chipsets are one and done (i.e. no future releases). Also, remember this is near the end of the Kaby/Skylake cycle (I think Coffee Lake is later this year?)". I hadn't really thought about that; I was looking at possible upgrades to the future i9 18-core processors, but you've brought up a very important point about the next cycle. I was thinking specifically about socket type.

                    I'm definitely going to need to look into Rebus and Deadline. Right now I work remotely, and I'm going to start working on a small server cabinet to house my 3 machines making up the 88 cores. I'm converting all my boxes to 2-4u rackmount cases. When I had the builder wire up the house, I had him run cat6 throughout, so I get decent speeds too. My goal is to make it so that I can scale the office up quickly when I need to, though.

                    As far as the 10gbe NICs are concerned, I've thought about investing in a couple, but I wasn't sure if it was worth the investment. I always assumed they were more for enterprise machines, but I suppose the work we do justifies it. Anything special I need to do to get them up and running?

                    Again, thanks for the awesome point of view. I think I may rethink the new machine I'm building after this, and it sounds like I'm on the right track for the local farm.



                    • #11
                      Originally posted by ted.vitale.jdk:
                      delineator, I'm definitely going to need to look into Rebus and Deadline. Right now I work remotely, and I'm going to start working on a small server cabinet to house my 3 machines making up the 88 cores. I'm converting all my boxes to 2-4u rackmount cases. When I had the builder wire up the house, I had him run cat6 throughout, so I get decent speeds too. My goal is to make it so that I can scale the office up quickly when I need to, though.
                      4u is definitely great. We have been migrating in that direction as well, just for ergonomic reasons (it sucks having a ton of crappy HP or Dell cases about). Plus, power and network wiring is all right there, and much easier to control. Frankly, it doesn't have to be that expensive either. For anyone interested, we are running 2x 4U cases (see link) on this rolling rack (see link). Worked out fine. I even transferred my license/database box to a 4U as well, so everything is starting to match.
                      https://www.newegg.com/Product/Produ...82E16816215002
                      https://www.newegg.com/Product/Produ...82E16811165411


                      Regarding 10gbe, it's definitely helpful when transferring files (especially those exr and vrimg files at the tail end!). During job submission you're probably not bottlenecked, because it's a bunch of little files and info being sent, but saving back to a NAS or another location definitely helps. PCPer just did a brief review of the Asus cards (https://www.pcper.com/reviews/Genera...thernet-Masses). Basically, it's totally worth it if you're going machine directly to machine, because switches are still expensive. Off the top of my head, I think D-Link makes an 8-port 10gbe switch for about $700, which just might be worth it, but I'm hoping for something in the 4-port variety for around $300 (Asus has a 2-port for $200, I believe).

                      Of course, this is all assuming your storage system can take advantage of it. If you're still on spinning iron (and no RAID array), then it may not make much of a difference. However, even off 4TB WD Reds via Synology's hybrid RAID, I think I've seen 300MB/s (will have to confirm; this also has some flash-based cache).
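                      To put rough numbers on that, here's an idealized transfer-time sketch (the 2 GB file size is just an example, and the link rates are theoretical maxima):

                      ```python
                      # Time to move one big vrimg/EXR over various links (idealized).
                      FILE_GB = 2.0  # example file size, not from a real scene
                      rates_mb_s = {
                          "1GbE link (max ~125 MB/s)": 125,
                          "10GbE link (max ~1250 MB/s)": 1250,
                          "NAS array (observed ~300 MB/s)": 300,
                      }
                      for name, rate in rates_mb_s.items():
                          print(f"{name}: ~{FILE_GB * 1000 / rate:.0f} s per {FILE_GB:.0f} GB file")
                      ```

                      Even with the array as the bottleneck, ~300MB/s is more than double what a gigabit link can carry, so the 10gbe upgrade still pays off.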


                      From an operations standpoint, I would love to start sharing my experiences and the journey of getting Deadline and Rebus into our flows. I'm just not sure of the best venue or format for that, as it's quite a long story to tell (and ever evolving!)







                      • #12
                        OK, dumb question: how do you @ a user? ted.vitale.jdk Am I doing the internets right?

                        Anywho, you may want to hold out JUST a bit. Threadripper rips: https://www.pcper.com/reviews/Proces...erformance-999

                        At $1000, the 1950x is, well, just awesome. Honestly, our dual Xeon 2690 v4 rig cost well into $8,000 (fully assembled from Puget Systems). However, I would be willing to bet you could build out an equivalent Threadripper rig for probably closer to $3000 (ECC ram, etc. etc.). Staggering, just staggering.



                        (Attached image: cb15-2.jpg, Cinebench R15 results)
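                        Putting the thread's own numbers into a cost-per-point sketch (the 1950X Cinebench score is an approximation from launch reviews, and the $3000 build price is my estimate above, not a quote):

                        ```python
                        # Dollars per Cinebench R15 point, using figures from this thread.
                        rigs = {
                            "dual Xeon 2690 v4 (Puget build)": (8000, 3877),
                            "Threadripper 1950X build (est.)": (3000, 3000),  # score approximate
                        }
                        for name, (usd, cb15) in rigs.items():
                            print(f"{name}: ${usd / cb15:.2f} per CB15 point")
                        ```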



                        • #13
                          I literally just watched AMD's YouTube video about this! I think I would be better off buying a Ryzen gaming PC if we need something tomorrow as a stop-gap between now and August! Man, I hope they have enough to meet demand.



                          • #14
                            Also, you are interneting correctly, delineator



                            • #15
                              Not a bad idea. For $500, the 1800x Ryzen chip is probably the cheapest way to get to around 1500 Cinebench points. Plus, it takes ECC (if that's your bag), and you can find B350 motherboards hovering just around $100. If you're super frugal, you could build a beast of a render machine for not much more than $1000.

