Gpu render


  • #16
    Hi Vlado, about that "version of the LC": so V-Ray RT will be biased? How can the LC help V-Ray RT clean up the noise?
    www.francescolegrenzi.com

    VRay - THE COMPLETE GUIDE - The book
    Corona - THE COMPLETE GUIDE - The book


    --- FACEBOOK ---



    • #17
      Vlado,
      Wow, proxy view? That's awesome. I hate to be an ingrate and complain about the displacement not making it. When is the next SP? Or is it a new version?
      Thanks
      Carlos Grande
      www.hypertecture.com



      • #18
        The next update should come out shortly after the official release of 3ds Max 2011.

        Best regards,
        Vlado
        I only act like I know everything, Rogers.



        • #19
          Does the next version of Vray RT have support for Xref Scenes/Objects?
          Maxscript made easy....
          davewortley.wordpress.com
          Follow me here:
          facebook.com/MaxMadeEasy

          If you don't MaxScript, then have a look at my blog and learn how easy and powerful it can be.



          • #20
            They should be working even in the current version; if they don't, can you send a scene to vlado@chaosgroup.com?

            Best regards,
            Vlado
            I only act like I know everything, Rogers.



            • #21
              I've done a quick basic test and it doesn't work with an Xref Scene; I've sent you the example scene.

              Dave
              Maxscript made easy....
              davewortley.wordpress.com
              Follow me here:
              facebook.com/MaxMadeEasy

              If you don't MaxScript, then have a look at my blog and learn how easy and powerful it can be.



              • #22
                Yes, I got it; thanks a lot!

                Best regards,
                Vlado
                I only act like I know everything, Rogers.



                • #23
                  Originally posted by vlado:
                  It may be more interesting, but not necessarily successful. Right now, the GPU is really only good for unbiased rendering, and even that comes with limitations on the complexity of lighting and materials (not to mention limited GPU memory). Granted, this will change, but for the time being you will mostly see brute-force unbiased renderers.

                  Best regards,
                  Vlado
                  Has ChaosGroup considered using the GPU for brute force GI and then sending the result back to the regular vray renderer to process as part of the "normal" rendering?



                  • #24
                    What would be the advantage? If we calculate the GI on the GPU, we might as well calculate the entire image there.

                    Best regards,
                    Vlado
                    I only act like I know everything, Rogers.



                    • #25
                      I was just curious if a hybrid technique was doable in a practical way.

                      The advantage would be this: if a time-consuming GI step could be done successfully on the GPU and then used/interpolated by the normal rendering engine, you would have the option of offering a partial GPU solution to end users faster than building a physically correct 100% GPU solution from the ground up. At least it seems that way, but I am not a developer. My perspective is that of a single user with no render farm who does stills; maybe this makes no difference if you have a farm or already render on a network of 100 CPUs. I was thinking any speed-up to the traditional procedure is good to have as an option, because the traditional procedure is reliable.

                      If it were a simple step in the rendering pipeline that didn't involve every single reflection and refraction, it would also help with the memory limits people are going to hit when they try to render everything in the scene at 3000x4000 on an 896 MB card. The GPU GI phase wouldn't need to be 100% perfect, since it is just a step to be interpolated later (preferably from a map file), unlike the final image, which requires more development work to get perfect when GPU-rendered.

                      Obviously, if everything that V-Ray does now will be on the GPU by year's end, then... never mind.
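The hybrid pipeline proposed in this post (compute a rough GI solution on the GPU, save it as a sparse map, let the CPU renderer interpolate it at final resolution) can be illustrated with a toy sketch. This is purely hypothetical and not V-Ray code: gpu_gi_pass and cpu_interpolate are invented names, and the "GPU" pass is faked with an analytic gradient sampled on a coarse grid.

```python
def gpu_gi_pass(width, height, step):
    """Stand-in for a fast device pass: produce sparse irradiance samples
    on a coarse grid. Here the 'irradiance' is a fake analytic gradient."""
    samples = {}
    for y in range(0, height, step):
        for x in range(0, width, step):
            samples[(x, y)] = (x + y) / (width + height)  # fake irradiance
    return samples

def cpu_interpolate(samples, width, height, step):
    """Stand-in for the host renderer: bilinearly interpolate the sparse
    GI map up to final resolution (clamping at the grid edges)."""
    max_x = ((width - 1) // step) * step   # last sample column
    max_y = ((height - 1) // step) * step  # last sample row
    image = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            x0 = (x // step) * step
            y0 = (y // step) * step
            x1 = min(x0 + step, max_x)
            y1 = min(y0 + step, max_y)
            tx = (x - x0) / step if x1 != x0 else 0.0
            ty = (y - y0) / step if y1 != y0 else 0.0
            a = samples[(x0, y0)] * (1 - tx) + samples[(x1, y0)] * tx
            b = samples[(x0, y1)] * (1 - tx) + samples[(x1, y1)] * tx
            image[y][x] = a * (1 - ty) + b * ty
    return image
```

The point of the sketch is the division of labor: the device pass only needs to be cheap and roughly right, because the host pass smooths it by interpolation, much like an irradiance-map-style cache is interpolated at render time.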

