could v-ray be a GPGPU renderer please?


  • could v-ray be a GPGPU renderer please?

    Well, I'd been looking around, and there is a way to make very spectacular renders in less time.

    There is a new renderer (with many bugs, but with spectacular potential) named Gelato (NVIDIA's renderer). The thing is that the program uses the Graphics Processing Unit to do the calculations. These processors are much, much more powerful than a normal CPU; even the most powerful CPU can have the same or lower performance than a good consumer graphics card (speaking of the GeForce 8800 GTX/GTS series, not even Quadro versions). Even an old GeForce can perform better, making complex renders in a few minutes.

    Could V-Ray (as a future project) use this amazing GPGPU technology?


    thanks

  • #2
    Re: could v-ray be a GPGPU renderer please?

    I'm not sure if Chaos Group has been exploring this possibility - but they are always on the lookout for anything that can make their product faster.
    Best regards,
    Joe Bacigalupa
    Developer

    Chaos Group



    • #3
      Triple V-Ray's speed with GPGPU technology!

      Triple (or more) V-Ray's speed with GPGPU technology!

      I'm gonna pray for this!

      Also, I wanna ask that gamer video cards (like GeForces or ATI's equivalent) be able to use this technology. (Well, the Quadro and FireGL cards are very expensive; many users can get a gamer video card, but only a few can buy a professional card.) Thanks.

      Hope this comes true someday ;D



      • #4
        Re: could v-ray be a GPGPU renderer please?

        I'm all for more speed.
        I saw a post in the Rhino NG about this: Maxwell has a benchmark forum where everyone posts their speed on a standard scene, and someone used a PlayStation and did it much faster than even Xeons. (Don't know how.)
        The downside is memory: large scenes require a lot of it, and game consoles don't have much.
        I'm not a techie though, so maybe something can be done.



        • #5
          Re: could v-ray be a GPGPU renderer please?

          Interesting. Slave renders don't need as much RAM as master renders, so a PlayStation could be good as a render slave for DR rendering.
          www.simulacrum.de - visualization for designer and architects



          • #6
            Re: could v-ray be a GPGPU renderer please?

            Originally posted by Micha
            Interesting. Slave renders don't need as much RAM as master renders, so a PlayStation could be good as a render slave for DR rendering.
            Umm... why would you think slaves don't need as much RAM? Other than the RAM saved by not running Rhino, there wouldn't be any difference between the RAM needed for a thread rendered on the host as opposed to the slave.

            _________________________________________________

            I was going to chime in on this a little earlier, but held my tongue a little bit. I'll ask a simple question... why is Gelato the only GPU renderer? Everyone knows that GPUs are faster than most processors right now, so why isn't everyone writing a GPU-based renderer? I'm not 100% sure, but I'd be willing to bet that it's because the technology Gelato uses to access GPU power isn't readily available to the public. Being an NVIDIA project, they probably have a whole lot more access to the GPU and can get things tailored for what they need. If I were NVIDIA, I wouldn't release that capability, but then again, that's just me.
            Damien Alomar
            Generally Cool Dude



            • #7
              Re: could v-ray be a GPGPU renderer please?

              Well, yes, NVIDIA has full access to the GPU architecture, but I guess GPGPU rendering can still be achieved by programming software to access the GPU processor. Also, NVIDIA has developer software called CUDA that lets you access the GPU and have it solve complex calculations.

              It's a shame that CUDA doesn't solve rendering calculations by itself, but maybe it's a tool to access the NVIDIA architecture and study how it works.


              Here is an explanation of why CUDA doesn't solve rendering calculations (for the moment, only Gelato does):

              link: http://forums.nvidia.com/index.php?showtopic=29261



              CUDA toolkit: http://developer.nvidia.com/page/home.html
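
              To give a rough idea of what programming the GPU through CUDA looks like, here is a minimal sketch of a kernel that shades pixels in parallel. The kernel and buffer names are made up for illustration; this is not how Gelato (or a hypothetical GPU V-Ray) actually works internally.

              #include <cuda_runtime.h>

              // Illustrative only: one GPU thread per pixel, clamping a
              // precomputed N.L term to get a simple Lambert shade.
              __global__ void shadePixels(const float* nDotL, float* color, int numPixels)
              {
                  int i = blockIdx.x * blockDim.x + threadIdx.x;
                  if (i < numPixels)
                      color[i] = fmaxf(nDotL[i], 0.0f);
              }

              int main()
              {
                  const int numPixels = 640 * 480;
                  const size_t bytes = numPixels * sizeof(float);

                  // Allocate the per-pixel buffers in GPU memory.
                  float *dNdotL = nullptr, *dColor = nullptr;
                  cudaMalloc(&dNdotL, bytes);
                  cudaMalloc(&dColor, bytes);

                  // (A real renderer would upload scene data with cudaMemcpy here.)

                  // Launch enough 256-thread blocks to cover every pixel.
                  int threads = 256;
                  int blocks = (numPixels + threads - 1) / threads;
                  shadePixels<<<blocks, threads>>>(dNdotL, dColor, numPixels);
                  cudaDeviceSynchronize();

                  cudaFree(dNdotL);
                  cudaFree(dColor);
                  return 0;
              }

              Each pixel gets its own GPU thread, which is exactly the kind of massively parallel work these cards are built for; the hard part for a production renderer is fitting the whole scene and shading system into that model (and into GPU memory), which ties back to the memory concerns mentioned above.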




