V-Ray RT on a GPU - SIGGRAPH 2009 Demo

  • #46
    George is right, there is still a substantial amount of work to be done on this before it is usable. Hopefully we'll be able to keep you up to date with our progress.

    Best regards,
    Vlado
    I only act like I know everything, Rogers.



    • #47
      Of course, but RT in itself is a huge step, you big bloody negative nancy.
      Kind Regards,
      Richard Birket
      ----------------------------------->
      http://www.blinkimage.com

      ----------------------------------->



      • #48
        Not sure how relevant this is:

        http://www.theregister.co.uk/2009/08...encl_for_cpus/

        first line of article (dated 5 August) is:
        AMD has released a free update to its ATI Stream SDK that offers OpenCL support for CPUs, taking the power of that parallel-processing technology one step closer to true usability.



        • #49
          V-Ray RT looks very promising indeed, many kudos to Chaos Group.

          I've got a few questions, some of which have been asked before in this thread but have not been answered yet:

          1: Will the CPU V-Ray become obsolete when you've got V-Ray RT running on a machine with a decent GPU card?

          2A: Will V-Ray RT be 100 percent compatible with everything the CPU V-Ray can do? In other words: will it also render native Max features, such as Standard materials with all kinds of Max maps (Self-illumination, Falloff, etc.), also without GI (behaving like the standard Scanline renderer)?

          2B: Will V-Ray RT be able to render the same top quality as the CPU V-Ray with maximum quality settings (e.g. universal settings with a 0.001 noise threshold)?

          3: Will V-Ray RT be completely integrated into the Max interface (no external modules)?

          4: Do you need specific drivers / libraries and such in order to have V-Ray RT recognize and fully utilize your GPU card?

          5: What kind of GPU card is most recommended to fully harness the power of V-Ray RT?

          Many thanks in advance for the answers!
          Sevensheaven.nl — design | illustration | visualization | cartoons | animation



          • #50
            Originally posted by Metin_7:
            V-Ray RT looks very promising indeed, many kudos to Chaos Group.

            I've got a few questions, some of which have been asked before in this thread but have not been answered yet:

            1: Will the CPU V-Ray become obsolete when you've got V-Ray RT running on a machine with a decent GPU card?

            2A: Will V-Ray RT be 100 percent compatible with everything the CPU V-Ray can do? In other words: will it also render native Max features, such as Standard materials with all kinds of Max maps (Self-illumination, Falloff, etc.), also without GI (behaving like the standard Scanline renderer)?

            2B: Will V-Ray RT be able to render the same top quality as the CPU V-Ray with maximum quality settings (e.g. universal settings with a 0.001 noise threshold)?

            3: Will V-Ray RT be completely integrated into the Max interface (no external modules)?

            4: Do you need specific drivers / libraries and such in order to have V-Ray RT recognize and fully utilize your GPU card?

            5: What kind of GPU card is most recommended to fully harness the power of V-Ray RT?

            Many thanks in advance for the answers!
            You're probably asking all of these way too early. I'll throw out some ideas for answers that may be completely inaccurate:

            1: The GPU "version" will probably be added functionality to the existing VRay RT rather than a separate product. So yes, if you have the GPU for it, the CPU will be "obsolete" in that it will no longer be the work-horse. The CPU, obviously, is still necessary.

            2A: I really, really doubt it. If we waited for Chaos Group to add compatibility, and, in essence, translators for every Max plugin type out there, we wouldn't see VRay RT GPU for at least 5 years. As with the existing version of RT and VRay itself, compatibility may be added over time, but the more likely scenario is that plugin developers will write VRay versions of their plugins as uptake of VRay RT increases and demand for such a thing exists.

            2B: I imagine it will do just this as long as you're comparing it with raytraced features, and not pre-calculated or interpolated features. As long as you leave it cooking long enough.

            3: VRay RT will probably continue to rely on hooks into Max, with the rendering engine as a background process. There aren't many downsides to this that I'm aware of.

            4: I doubt it; I'm pretty sure the CUDA drivers are part of the standard NVIDIA driver package.

            5: Ask this again when it's close to release. Vlado mentioned that they may not stick with CUDA and may instead go with a non-NVIDIA-specific standard like OpenCL. This may or may not change the video card requirements, but I imagine the "best" card you can get will be advised at the time of product release.



            • #51
              Many thanks for your reply George, much appreciated.

              Everything sounds great, except for the possible lack of compatibility with all Max features that the CPU V-Ray can handle.

              I use V-Ray a lot to render scenes with Standard Max materials using Max maps for creating non-photorealistic shaders with lots of tricks. V-Ray without GI renders such scenes faster than the outdated Scanline, especially when the scene is complex in terms of polygons.
              Sevensheaven.nl — design | illustration | visualization | cartoons | animation



              • #52
                Regarding #5: one of the biggest advantages of OpenCL is that it is both vendor-agnostic AND supports heterogeneous environments (i.e. it will use CPUs, GPUs and any other processing unit that has an OpenCL driver, like PhysX cards if you have any; so basically it will use anything that can calculate and isn't limited to GPUs).

                Regards,
                Thorsten



                • #53
                  Cool! I'm really excited about this development. Have been waiting for GPU power to be harnessed for rendering for years now.
                  Sevensheaven.nl — design | illustration | visualization | cartoons | animation



                  • #54
                    Originally posted by Metin_7:
                    Cool! I'm really excited about this development. Have been waiting for GPU power to be harnessed for rendering for years now.
                    I'll no longer feel the need to kick someone in the head when they ask "what's the fastest video card for rendering in my legitimate version of 3d Studio Max?"



                    • #55
                      Hahaha! This reminds me a little of a childhood friend who always asked how fast one could run in particular sports shoes. I just couldn't manage to explain that it was not the shoes that determined how fast one could run.
                      Sevensheaven.nl — design | illustration | visualization | cartoons | animation



                      • #56
                        True. A lot depends on what breed of dog is chasing you.

                        Just got approval for VRayRT and I can't wait to start playing with it.



                        • #57
                          For what it's worth, I would like to see VRayRT GPU move to OpenCL right now, so as to get away from anything proprietary. Like many offices we have a mix of NVIDIA and ATI cards, and it would suck horribly if we could only use NVIDIA.
                          http://www.glass-canvas.co.uk



                          • #58
                            Originally posted by devin:
                            Is there a reason they chose to use a GeForce card over a Quadro card? Will this technology work with other brands of video cards?
                            Is there any reason why any of us would choose a Quadro over a GeForce? Not the last time I checked.
                            WerT
                            www.dvstudios.com.au



                            • #59
                              I've only used GeForce cards once in my machines and didn't like them; I've always used Quadro. The way I understand it, the GeForce cards are for gamers and the Quadro cards are for professionals.



                              • #60
                                Originally posted by werticus:
                                Is there any reason why any of us would choose a Quadro over a GeForce? Not the last time I checked.
                                I've read several tests which concluded that the Quadro is not merely a GeForce souped-up with some minor tweaks and software. I've been using a Quadro for a number of years now and I'm quite content with it.

                                Now I'm only waiting for V-Ray to support the GPU, so rendering times will never be the same again.
                                Sevensheaven.nl — design | illustration | visualization | cartoons | animation

