GPU rendering on commandline

  • GPU rendering on commandline

    I'm exploring the possibility of using the GPU for commandline rendering.

    I see that the -rtEngine option is available for V-Ray standalone, but what about when rendering through Maya's batchrender? Can it be done?
    I'm talking about rendering via "Render -r vray .... scene.ma"

    Also, if using the -rtEngine option (we would use CUDA), can this be combined with the -distributed option?
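
    (For reference, a minimal sketch of the two invocations in question. The Maya Render flags are standard; on the standalone side, the numeric -rtEngine values are my assumption from the builds I've seen, where 5 selects CUDA, so check vray -help on your install:)

        # Maya batch render, the invocation I mean:
        Render -r vray -proj /path/to/project scene.ma

        # V-Ray standalone with the RT engine on CUDA:
        vray -sceneFile=scene.vrscene -rtEngine=5
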
    Best Regards,
    Fredrik

  • #2
    Originally posted by Fredrik Averpil
    I see that the -rtEngine option is available for V-Ray standalone, but what about when rendering through Maya's batchrender? Can it be done? I'm talking about rendering via "Render -r vray .... scene.ma"
    No, for the moment it is not possible. We will be working on this.

    Also, if using the -rtEngine option (we would use CUDA), can this be combined with the -distributed option?
    It should work, yes.
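
    (A sketch of that combination from the machine launching the render, assuming the standalone -renderhost flag for listing DR slaves; the hostnames are placeholders and the semicolon-separated list is my assumption from the builds I've seen:)

        # RT engine on CUDA plus distributed rendering across two slaves:
        vray -sceneFile=scene.vrscene -rtEngine=5 -distributed=1 -renderhost="node01;node02"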

    Best regards,
    Vlado


    • #3
      Hi Vlado,

      Originally posted by vlado
      No, for the moment it is not possible. We will be working on this.
      Alright, that's great news!

      Originally posted by vlado
      It should work, yes.
      Even with progressive?
      Best Regards,
      Fredrik



      • #4
        Originally posted by Fredrik Averpil
        Even with progressive?
        Yes. Depending on the scene, though, you might need to tune the rays per pixel/bundle size to get better performance with DR, especially if the scene renders fast and the slaves generate a lot of data.
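
        (For anyone hunting for those knobs in an exported .vrscene: on the builds I've seen they sit on the SettingsRTEngine plugin, where gpu_samples_per_pixel maps to the rays per pixel and gpu_bundle_size to the bundle size; those parameter names are my assumption and may differ in your version:)

            SettingsRTEngine {
              gpu_samples_per_pixel=4;
              gpu_bundle_size=256;
            }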

        Best regards,
        Vlado


        • #5
          That is really impressive. How come you haven't pushed this feature harder in your marketing?
          Best Regards,
          Fredrik



          • #6
            Originally posted by Fredrik Averpil
            That is really impressive. How come you haven't pushed this feature harder in your marketing?
            I don't know; isn't it a very specific use case?

            Best regards,
            Vlado


            • #7
              Using it on a daily basis here and thanking the gods of looming deadlines. It works, works fast and is becoming our go-to method.
              I'm with Fredrik on that line of thought. You should definitely highlight the ability; I get the impression many are unaware it's possible.



              • #8
                Originally posted by vlado
                I don't know; isn't it a very specific use case?

                Best regards,
                Vlado
                Well, I'm going to need to run some tests before I know how specific this use case is and exactly how useful RT+DR would be for us... but I'm thinking we could use our NVIDIA K5200-equipped workstations to create a "GPU farm", which could potentially churn out previz and/or high-res packshots quite fast (relative to our CPU-based render farm) and leave our regular farm available for animation/final rendering jobs.

                I don't know if these numbers are right, but someone said those K5200s could theoretically be 14 times the speed of one of our regular render farm blades on a single product image... (that seems quite exaggerated to me, though, and very loosely sourced)
                Best Regards,
                Fredrik



                • #9
                  If you are rendering animations, why don't you do regular frame by frame rendering?

                  Best regards,
                  Vlado


                  • #10
                    Originally posted by vlado
                    If you are rendering animations, why don't you do regular frame by frame rendering?
                    We do - we wouldn't use GPU/DR rendering for that.

                    What I mean is that GPU rendering is considered to render certain scenes much faster than traditional CPU rendering. So if you take 10 machines and tuck a good GPU inside each, they could compare to 15 or 20 (or more?) regular CPU-based render nodes. But for stills you'd (perhaps) have a bit of a problem combining the power of those machines when rendering one single image, since the GPUs are not in the same box. Using RT+DR, though, you could probably combine the power of these GPUs and get quite nice render times, I presume?
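
                    (For completeness, a sketch of the slave side of such a setup, assuming V-Ray standalone's -server mode; 20204 is my assumption for the default DR port, so check your spawner configuration:)

                        # On each GPU workstation, start V-Ray standalone as a DR server:
                        vray -server -portNumber=20204

                        # Then launch the still from one machine with -rtEngine=5 -distributed=1
                        # and the -renderhost list, as sketched earlier in the thread.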
                    Best Regards,
                    Fredrik



                    • #11
                      OK, yes that's true, for still images DR is the only way.

                      Best regards,
                      Vlado