AMD drops the mic. Releases a 32-core/64-thread Threadripper 2 CPU. GPU rendering still viable/justified?

  • #1

    So now that a 32-core / 64-thread CPU is confirmed and coming out in the next couple of months, is there still a justification for investing in GPU rendering? Especially given current GPU prices and availability, the fact that GPUs still don't support all CPU features, and that they are usually limited by their onboard RAM. I've read that the 16-core 1950X is on par with a GTX 1080 Ti in terms of rendering speed, so the 32-core Threadripper 2 should be roughly twice as powerful as a GTX 1080 Ti, with none of the downsides of GPU rendering. Is my reasoning correct?

    So with this trend, is there any reason to go the GPU route if one can get 2-3 of these 32-core CPUs instead? On top of that, rumor has it AMD will be releasing a 64-core / 128-thread CPU next year! That's insane if you ask me.
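
    As a rough sanity check of that reasoning (a minimal sketch; the linear core scaling and the reported 1950X / GTX 1080 Ti parity are assumptions taken from this thread, not measurements):

    ```python
    # Back-of-the-envelope: assumes path-tracing throughput scales ~linearly
    # with core count at similar clocks, and that the reported
    # "1950X ~ GTX 1080 Ti" parity holds.
    gtx_1080ti = 1.0                   # normalize: one 1080 Ti = 1.0 units of throughput
    tr_1950x   = 1.0                   # reported as roughly on par with a 1080 Ti
    tr2_32core = tr_1950x * (32 / 16)  # twice the cores -> ~2x, ignoring clock differences

    print(f"Threadripper 2 ~ {tr2_32core / gtx_1080ti:.1f}x a GTX 1080 Ti")
    # -> 2.0x, matching the estimate above; real scaling is usually a bit sub-linear.
    ```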
    Last edited by Alex_M; 08-06-2018, 03:36 PM.
    Aleksandar Mitov
    www.renarvisuals.com
    office@renarvisuals.com

    3ds Max 2023.2.2 + Vray 7 Hotfix 1
    AMD Ryzen 9 9950X 16-core
    96GB DDR5
    GeForce RTX 3090 24GB + GPU Driver 566.14

  • #2
    Your concerns are valid, imho. But consider it this way: the next Nvidia GPU will be much faster than the 1080 Ti, and some of today's higher-end cards are already a lot faster and have far more RAM (32 GB in some cases), though they are also way more expensive. I think it's going to be a sort of race now. I agree that the GPU is "promised" to give us a lot more than it currently does, and in fact I think it's overrated.

    I personally can't justify investing in a GPU farm because of some of the issues currently present, and it has been like that for the last 5 years. So I made the choice to stay with CPU, and so far that has proven reasonable.

    You also have to factor in that on larger scenes, no matter how fast your GPU is, some things just cannot be sped up: initializing the scene into memory, processes that are single-threaded even on the CPU, reading proxies and textures from the HDD, etc. So there may come a point where a scene takes 5 minutes per frame on the most powerful GPU setup, but 4 of those minutes are loading and only 1 is rendering. And if you have a 10,000-frame sequence, no matter how you spin it, you would still need a great number of those GPU nodes working on frames in parallel to get through the animation, even at 5 minutes per frame.
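
    To make that concrete, here is a toy calculation using the hypothetical numbers from the paragraph above (4 min load + 1 min render, 10,000 frames; the split is illustrative, not measured):

    ```python
    # Per-frame cost = fixed load time + render time. The load portion does
    # not shrink no matter how fast the GPU is.
    LOAD_MIN, RENDER_MIN, FRAMES = 4.0, 1.0, 10_000
    per_frame = LOAD_MIN + RENDER_MIN  # 5 min/frame regardless of GPU speed

    def wall_clock_hours(nodes: int) -> float:
        """Total wall-clock time with frames spread evenly across render nodes."""
        return FRAMES * per_frame / nodes / 60.0

    for nodes in (1, 10, 50, 100):
        print(f"{nodes:4d} nodes -> {wall_clock_hours(nodes):7.1f} hours")
    # 1 node: ~833 h; even 100 nodes still take ~8.3 h, because the per-frame
    # floor is set by loading, not rendering.
    ```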

    Lastly, I think V-Ray GPU is improving a lot, but any time I open it and try to render something there is always an issue of some sort. It's just too unpredictable at the moment, imho, to take on a tight-deadline project with so many unknown factors.
    Dmitry Vinnik
    Silhouette Images Inc.
    ShowReel:
    https://www.youtube.com/watch?v=qxSJlvSwAhA
    https://www.linkedin.com/in/dmitry-v...-identity-name



    • #3
      Originally posted by Morbid Angel
      Lastly, I think V-Ray GPU is improving a lot, but any time I open it and try to render something there is always an issue of some sort. It's just too unpredictable at the moment, imho, to take on a tight-deadline project with so many unknown factors.
      So true.
      So sad.



      mekene



      • #4
        Originally posted by theedge
        So true.
        So sad.

        As always: if you start with V-Ray GPU from the start, it will be fine; otherwise it is like switching engines mid-production.
        That said, I am pretty confident that the future of GPU rendering is bright.

        If you have scenes that load slowly, send them over and we will fix it.

        Best,
        Blago.
        V-Ray fan.
        Looking busy around GPUs ...
        RTX ON



        • #5
          What's most interesting to me is the direction the major players are taking.
          RenderMan is heading down the hybrid route with XPU, and maybe some deferred loading for textures. I'm still curious whether it simply conforms to the GPU result while using the CPU to emulate the GPU code paths.
          Arnold is pushing really hard on making sure the GPU result matches the CPU, so you can use the GPU on simpler scenes for look-dev and bring in the CPU for the final frames if desired.
          V-Ray's position is that the results between GPU and CPU won't be the same, period: depending on your scene, start with GPU or CPU and don't switch.
          I personally prefer GPU results that match the CPU's, so people have the option to switch, or to throw all the power in a PC at rendering, without worrying about feature limitations or the results looking different.
          always curious...



          • #6
            RenderMan XPU is just like V-Ray GPU: both can run on GPUs as well as CPUs, but the result will never match RenderMan (in our case, V-Ray GPU will never match V-Ray).
            Autodesk said that they will not support all the features in Arnold GPU. I am not sure how that squares with “it will look the same”.

            Best
            Blago.
            V-Ray fan.
            Looking busy around GPUs ...
            RTX ON



            • #7
              Originally posted by savage309

              If you have scenes that load slowly, send them over and we will fix it.

              Best,
              Blago.
              Blago, the issue for me personally is not having the ability to load token variables, which I use 99% of the time. So any scene I try with V-Ray GPU immediately fails to load textures, and that is it. I also use alSurface a lot and will have to wait until that is supported.
              Dmitry Vinnik
              Silhouette Images Inc.
              ShowReel:
              https://www.youtube.com/watch?v=qxSJlvSwAhA
              https://www.linkedin.com/in/dmitry-v...-identity-name



              • #8
                This is good to know. We are looking into those already.

                Best,
                Blago.
                V-Ray fan.
                Looking busy around GPUs ...
                RTX ON



                • #9
                  Originally posted by savage309
                  RenderMan XPU is just like V-Ray GPU: both can run on GPUs as well as CPUs, but the result will never match RenderMan (in our case, V-Ray GPU will never match V-Ray).
                  Autodesk said that they will not support all the features in Arnold GPU. I am not sure how that squares with “it will look the same”.

                  Best
                  Blago.
                  The impression I got after seeing their presentations is that the goal is for the results to match perceptually between GPU and CPU, not with pixel-to-pixel accuracy. Will they ever get there with complex scenes? I don't know. They only showed screenshots of single frames of character assets that look very close; maybe that's just the best-case scenario. I'd guess the reason both RMan and Arnold are pushing so hard on matching GPU and CPU results is that many big VFX houses aren't going to use GPU farms for final frames yet, so it's quite important that GPU-lookdev'd assets look very close to the CPU farm output, leaving less to worry about or troubleshoot downstream.
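
                  As an aside, here is one hypothetical way "perceptually match, not pixel-to-pixel" could be turned into an automated check; the function, metric, and tolerance below are my own illustration, not anything RenderMan or Arnold has published:

                  ```python
                  import numpy as np

                  def renders_match(cpu_img: np.ndarray, gpu_img: np.ndarray,
                                    rmse_tol: float = 0.01) -> bool:
                      """True if two renders (float images in [0, 1]) agree within an RMSE tolerance."""
                      rmse = float(np.sqrt(np.mean((cpu_img - gpu_img) ** 2)))
                      return rmse <= rmse_tol

                  # Small sampling-noise differences pass a check like this; an unsupported
                  # feature (e.g. a shader falling back to black on the GPU) fails it decisively.
                  ```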

                  Maybe Chaos is ahead of the game here: been there, done that, and they know how difficult it is to match GPU and CPU results, hence the suggestion to start a scene with either one and finish with it.

                  Last edited by jasonhuang1115; 09-06-2018, 10:04 PM.
                  always curious...



                  • #10
                    As I said already, RenderMan XPU (which is a CPU & GPU renderer) is not trying to match RenderMan (which is a CPU renderer).
                    It is trying to match only to the extent that V-Ray GPU (which is a CPU & GPU renderer) matches V-Ray (which is a CPU renderer).

                    Autodesk are saying that they will match Arnold GPU to Arnold, while at the same time they don't plan to have all the features of Arnold in Arnold GPU. I am not completely sure how the matching works when they don't support the same feature set.

                    Best,
                    Blago.
                    V-Ray fan.
                    Looking busy around GPUs ...
                    RTX ON



                    • #11
                      Blago, you are correct of course in saying that using GPU or CPU is like using different engines. I also think the logic of using GPU for preview (in Arnold's case) and CPU for production is flawed: how can you reasonably predict the final render time / look on the CPU when you are working on the GPU? Once the tokens are supported I would definitely love to try running some full production shots GPU-only, but for me it will take serious resources to build a GPU farm. Maybe in time.
                      Dmitry Vinnik
                      Silhouette Images Inc.
                      ShowReel:
                      https://www.youtube.com/watch?v=qxSJlvSwAhA
                      https://www.linkedin.com/in/dmitry-v...-identity-name



                      • #12
                        Maybe it's all a marketing trick to get users hyped up after all; apparently both are emphasizing a match between GPU and CPU. It would be interesting if they ended up telling users to forget about matching the two - then we'd know how far ahead V-Ray is. I agree that the two will probably never match, especially when it comes to lighting full production scenes. On the other hand, flawed or not, look-dev-ing smaller assets on the GPU and then confidently assembling those GPU-lookdev'd assets in a final lighting scene for CPU rendering is something I am still very curious to see unfold.

                        [Attachment: Rman CPU and GPU match.jpg]
                        [Attachment: Arnold CPU and GPU match.jpg]
                        always curious...



                        • #13
                          Leaked Cinebench results show AMD's Threadripper 2 2990WX 32-core/64-thread CPU scoring ~5100 cb in Cinebench R15, almost double the performance of last year's Threadripper 1950X 16-core/32-thread CPU. Expected price: $1800 USD.



                          Interesting slide below. Looks like the 2990WX is almost 50% faster in V-Ray than the 7980XE while costing the same. Source: https://videocardz.com/newz/more-sli...riefing-emerge
                          [Attachment: AMD-Ryzen-Threadripper-2000-3-1.jpg]
                          Last edited by Alex_M; 07-08-2018, 06:22 AM.
                          Aleksandar Mitov
                          www.renarvisuals.com
                          office@renarvisuals.com

                          3ds Max 2023.2.2 + Vray 7 Hotfix 1
                          AMD Ryzen 9 9950X 16-core
                          96GB DDR5
                          GeForce RTX 3090 24GB + GPU Driver 566.14



                          • #14
                            More info. It's looking like a nice chip.
                            https://www.anandtech.com/show/13123...up-to-32-cores



                            • #15
                              Benchmarks are out. Looks like the Threadripper 2990WX is a monster: exactly 2x faster than the 1950X, and 40-45% faster in V-Ray than the 18-core 7980XE (which costs the same).
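
                              A quick price/performance take on those numbers (a sketch using only the figures quoted in this thread; the 7980XE score is inferred from "40-45% faster", not measured here):

                              ```python
                              # Cinebench R15 points per dollar, using the thread's reported numbers.
                              cpus = {
                                  "Threadripper 2990WX (32C)": {"cb15": 5100, "usd": 1800},
                                  # Score inferred from the 2990WX being ~43% faster:
                                  "Core i9-7980XE (18C)":      {"cb15": 5100 / 1.43, "usd": 1999},
                              }

                              for name, d in cpus.items():
                                  print(f"{name:28s} {d['cb15'] / d['usd']:.2f} CB15 points per USD")
                              # ~2.83 vs ~1.78 points/USD: at roughly the same price, the 2990WX
                              # delivers about 60% more benchmark throughput per dollar.
                              ```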



                              Last edited by Alex_M; 13-08-2018, 04:32 PM.
                              Aleksandar Mitov
                              www.renarvisuals.com
                              office@renarvisuals.com

                              3ds Max 2023.2.2 + Vray 7 Hotfix 1
                              AMD Ryzen 9 9950X 16-core
                              96GB DDR5
                              GeForce RTX 3090 24GB + GPU Driver 566.14

