GPU and CPU rendering

  • #16
    It all depends on what your render needs are and what features you need.
    I'm a lot into GPU rendering as I'm convinced it will be a major player soonish, but it's best to try for yourself first. If you need any help testing a scene, I'll be glad to pass it through my GPU node so you'll see what you can get.

    In terms of comparing CPU vs GPU: if you compare both with BF/LC (brute force + light cache), GPU will always win on speed/$, but with IR (the irradiance map), CPU is way faster.
    One more reason we're all waiting for IR on GPU.
    Stan
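
    To put the speed/$ point in concrete terms, here is a back-of-the-envelope sketch in Python. Every render time and price below is a made-up placeholder, not a benchmark; substitute your own measured numbers (same scene, same BF/LC settings on both nodes):

    Code:
    # Rough speed-per-dollar comparison for CPU vs GPU render nodes.
    # All numbers are hypothetical placeholders -- plug in your own
    # measured frame times and hardware prices.
    nodes = {
        # name: (seconds per BF/LC test frame, hardware cost in USD)
        "dual-Xeon CPU node": (1200.0, 5000.0),
        "2x GTX 980 Ti GPU node": (400.0, 2500.0),
    }

    for name, (secs, cost) in nodes.items():
        frames_per_hour = 3600.0 / secs
        # Higher is better: frames per hour per $1000 of hardware.
        per_kusd = frames_per_hour / (cost / 1000.0)
        print(f"{name}: {frames_per_hour:.1f} frames/h, {per_kusd:.2f} frames/h per $1000")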

    • #17
      Originally posted by glorybound:
      I wouldn't base your decision on me, at all. I haven't invested much time in seeing what the issue is, which could very well be me.
      I know; I just asked a few nerd buddies of mine, and they say forget it for now and stick to the easiest solution of CPU and more cores. If it pays for itself, there's no need for the hassle.
      Architectural and Product Visualization at MITVIZ
      http://www.mitviz.com/
      http://mitviz.blogspot.com/
      http://www.flickr.com/photos/shawnmitford/

      i7 5960 @ 4 GHz, 64 GB RAM, GeForce GTX 970, GeForce RTX 2080 Ti x2

      • #18
        Originally posted by Sbrusse:
        It all depends on what your render needs are and what features you need.
        I'm a lot into GPU rendering as I'm convinced it will be a major player soonish, but it's best to try for yourself first. If you need any help testing a scene, I'll be glad to pass it through my GPU node so you'll see what you can get.

        In terms of comparing CPU vs GPU: if you compare both with BF/LC (brute force + light cache), GPU will always win on speed/$, but with IR (the irradiance map), CPU is way faster.
        One more reason we're all waiting for IR on GPU.
        Thanks buddy!! I'll take you up on that one unexpected day.
        Architectural and Product Visualization at MITVIZ
        http://www.mitviz.com/
        http://mitviz.blogspot.com/
        http://www.flickr.com/photos/shawnmitford/

        i7 5960 @ 4 GHz, 64 GB RAM, GeForce GTX 970, GeForce RTX 2080 Ti x2

        • #19
          Originally posted by mitviz:
          I know; I just asked a few nerd buddies of mine, and they say forget it for now and stick to the easiest solution of CPU and more cores. If it pays for itself, there's no need for the hassle.
          As you have seen, there is a wide range of opinions, experiences, and workflow issues regarding rendering with GPUs in V-Ray. Honestly, your best bet is to grab yourself a couple of 980TIs or Titans and at least learn to use RT/GPU as a tool for helping to quickly set up lighting, materials, camera composition, etc. Only then will you really have a good feel for whether or not RT/GPU is or will be a production rendering solution for your particular application(s).

          For me, it is both and more. I easily and quickly render all sorts of product visualization and ArchVis exterior jobs with RT/GPU, but still find myself using the CPU for ArchVis interiors. I nearly always use my GPUs to quickly set up scenes as noted above.

          I can also tell you that using RT/GPU is some of the best fun I've had in a 3D app since I started, way back in the early 80s! There is nothing like seeing lighting, materials, reflections, refractions, Global Illumination, etc. in near real-time. (especially for an old-timer!)

          I can also tell you that with the simple addition of Directional Light support and some simple noise reduction (coming soon!) I would have used my GPUs to render my last ArchVis interior job.

          Best of luck to you and I hope you get to use it extensively soon,

          -Alan

          • #20
            Originally posted by Alan Iglesias:
            As you have seen, there is a wide range of opinions, experiences, and workflow issues regarding rendering with GPUs in V-Ray. Honestly, your best bet is to grab yourself a couple of 980TIs or Titans and at least learn to use RT/GPU as a tool for helping to quickly set up lighting, materials, camera composition, etc. Only then will you really have a good feel for whether or not RT/GPU is or will be a production rendering solution for your particular application(s).

            For me, it is both and more. I easily and quickly render all sorts of product visualization and ArchVis exterior jobs with RT/GPU, but still find myself using the CPU for ArchVis interiors. I nearly always use my GPUs to quickly set up scenes as noted above.

            I can also tell you that using RT/GPU is some of the best fun I've had in a 3D app since I started, way back in the early 80s! There is nothing like seeing lighting, materials, reflections, refractions, Global Illumination, etc. in near real-time. (especially for an old-timer!)

            I can also tell you that with the simple addition of Directional Light support and some simple noise reduction (coming soon!) I would have used my GPUs to render my last ArchVis interior job.

            Best of luck to you and I hope you get to use it extensively soon,

            -Alan
            Actually, I am considering grabbing a more powerful graphics card; for now my card struggles with RT on most scenes, which is why I've stayed away from it until now. I saw the video Chaos Group posted today about CPU and GPU rendering, and it was informative, I must say. GPU rendering will require a different setup, which I am also looking at. I am very familiar with RT but haven't used it for a while; when I change this card I will start to look more into GPU rendering.
            Architectural and Product Visualization at MITVIZ
            http://www.mitviz.com/
            http://mitviz.blogspot.com/
            http://www.flickr.com/photos/shawnmitford/

            i7 5960 @ 4 GHz, 64 GB RAM, GeForce GTX 970, GeForce RTX 2080 Ti x2

            • #21
              We've tried 'on-and-off' with RT GPU. In fact, the more recent workstation purchases we made were spec'd with Titan cards, as we hoped to be making more use of RT. Because we have so many years' experience with the more typical CPU rendering, and have become reliant on a particular workflow and set of tools/plugins, it has been difficult to move over to RT in any meaningful way.

              With the best of intentions, we often start a project and say to each other: "Right, I'm going to get this one working with RT GPU!" Then we very soon hit a hurdle, such as a crash, RT not behaving as we'd expect, needing to use a BerconTile/Multitexture, or needing to plonk in a whole load of trees, and we resort to traditional CPU rendering.

              My opinion is that you have to be absolutely committed to making full use of RT GPU to see real benefits. Our whole farm would need top-end GFX cards (just as the machines have top-end CPUs right now). We would need to tackle head-on the lack of support for Bercon/Multitexture and find a viable alternative; a rough sketch of what those two maps do together follows below. We would have to be far more intelligent about our use of ForestPro and RailClone objects and not just chuck 'em everywhere 'cause it's so easy!
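
              For anyone who hasn't used those maps: BerconTile procedurally generates the tile/brick layout, and MultiTexture feeds each tile a randomly chosen bitmap. Here is a purely illustrative Python sketch of the combined idea; the real maps are far more configurable, and every name in the sketch is invented for the example:

              Code:
              import hashlib

              def tile_texture_index(u, v, tiles_u=4, tiles_v=8, num_textures=6, seed=42):
                  """Pick a stable pseudo-random texture index for the tile containing (u, v).

                  Toy version of BerconTile + MultiTexture: work out which tile of a
                  running-bond layout the shading point falls in, then hash the tile
                  coordinates to choose one of num_textures bitmaps."""
                  row = int(v * tiles_v)
                  # Offset every other row by half a tile, like brick coursing.
                  u_offset = 0.5 / tiles_u if row % 2 else 0.0
                  col = int((u + u_offset) * tiles_u)
                  # Hashing makes the choice look random but stay stable per tile.
                  digest = hashlib.sha256(f"{seed}:{col}:{row}".encode()).digest()
                  return digest[0] % num_textures

              # A few sample shading points and the bitmap index their tile gets:
              for u, v in [(0.1, 0.05), (0.35, 0.05), (0.1, 0.2), (0.9, 0.95)]:
                  print((u, v), "-> texture", tile_texture_index(u, v))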

              If we were starting this business again, we would perhaps build our systems and workflows to employ RT GPU rather than trying to shoehorn our existing pipeline and methods into it.

              (Secretly, I'm crossing my fingers that we soon won't need to make such a decision, and V-Ray will be coded to use an integrated combination of whatever hardware a machine has: the CPU for IR maps, non-standard plugins and scripts, with the GPU kicking in to help out with the things it is good at. Have I read somewhere that the denoiser might run nicely on a GPU alongside the CPU?)
              Kind Regards,
              Richard Birket
              ----------------------------------->
              http://www.blinkimage.com

              ----------------------------------->

              • #22
                Originally posted by tricky:
                With the best of intentions, we often start a project and say to each other: "Right, I'm going to get this one working with RT GPU!" Then we very soon hit a hurdle, such as a crash, RT not behaving as we'd expect, needing to use a BerconTile/Multitexture, or needing to plonk in a whole load of trees, and we resort to traditional CPU rendering.
                Multitexture is already supported, and if you still have any of those crashing scenes around, I would very much like to take a look.

                Originally posted by tricky:
                My opinion is that you have to be absolutely committed to making full use of RT GPU to see real benefits.
                Definitely this helps a lot, yes.

                Originally posted by tricky:
                (Secretly, I'm crossing my fingers that we soon won't need to make such a decision, and V-Ray will be coded to use an integrated combination of whatever hardware a machine has: the CPU for IR maps, non-standard plugins and scripts, with the GPU kicking in to help out with the things it is good at. Have I read somewhere that the denoiser might run nicely on a GPU alongside the CPU?)
                That's correct: the denoiser will use the GPUs regardless of whether you are rendering with Adv or RT, and it works many times faster on a GPU (http://forums.chaosgroup.com/showthr...ight=benchmark).
                We are looking at more places for such approaches as well.
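
                As a toy illustration of what a denoiser does (emphatically not V-Ray's actual algorithm, which is more sophisticated and typically also uses feature buffers such as normals and depth): average each pixel with its neighbours, down-weighting neighbours whose values differ a lot so that edges survive. The per-pixel work is independent, which is exactly why it maps so well onto a GPU. A NumPy sketch:

                Code:
                import numpy as np

                rng = np.random.default_rng(0)

                # Synthetic "render": a grey card with a bright square, plus noise
                # standing in for Monte Carlo sampling noise.
                clean = np.full((64, 64), 0.2)
                clean[16:48, 16:48] = 0.8
                noisy = clean + rng.normal(scale=0.15, size=clean.shape)

                def toy_denoise(img, radius=2, sigma=0.2):
                    """Bilateral-style smoothing: blend each pixel with its
                    neighbours, weighting by value similarity to preserve edges."""
                    out = np.zeros_like(img)
                    wsum = np.zeros_like(img)
                    for dy in range(-radius, radius + 1):
                        for dx in range(-radius, radius + 1):
                            shifted = np.roll(img, (dy, dx), axis=(0, 1))
                            w = np.exp(-((shifted - img) ** 2) / (2.0 * sigma**2))
                            out += w * shifted
                            wsum += w
                    return out / wsum

                denoised = toy_denoise(noisy)
                print("mean abs error before:", float(np.abs(noisy - clean).mean()))
                print("mean abs error after: ", float(np.abs(denoised - clean).mean()))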
                V-Ray fan.
                Looking busy around GPUs ...
                RTX ON

                • #23
                  Is BerconTile also supported? Multitexture on its own is only so much use. Plugging it into BerconTile is crucial for us.
                  Kind Regards,
                  Richard Birket
                  ----------------------------------->
                  http://www.blinkimage.com

                  ----------------------------------->

                  • #24
                    Originally posted by tricky:
                    Is BerconTile also supported? Multitexture on its own is only so much use. Plugging it into BerconTile is crucial for us.
                    No, Bercon tiles are not supported yet.
                    V-Ray fan.
                    Looking busy around GPUs ...
                    RTX ON

                    • #25
                      Originally posted by savage309:
                      No, Bercon tiles are not supported yet.
                      Bugger. I was getting quite excited for a minute there!
                      Kind Regards,
                      Richard Birket
                      ----------------------------------->
                      http://www.blinkimage.com

                      ----------------------------------->
