Biased Vray GPU

  • Biased Vray GPU

    Maybe this belongs elsewhere, or maybe I just don't understand the intent of Vray GPU... but is there a way to make Vray GPU more like... well, Vray CPU? I really prefer a biased renderer and was hoping that Vray GPU would be just like 'regular' Vray but, you know, on a GPU. I thought Vray GPU would be an answer to Redshift, which would be a huge win for me and my studio. I'd probably even shell out money for a seat at home (where, btw, I use Redshift).

    From what I've seen, Vray GPU produces some gorgeous images, but they often take just as much time to render as the CPU version, because with regular Vray I have much more control! Denoising is a great tool, but I don't like to lean on it too much.

    Here are my machine specs:
    Dual Xeon Silver 4114 CPUs (hyperthreading on)
    96 GB RAM
    Quadro P2000
    RTX 2080

    Let's say, for example, I've got a frame that Vray CPU renders in 15 minutes.
    The same image/shader/lighting setup with GPU gets it done in about 14.5 minutes. It looks amazing: better light bounces, proper caustics, etc... but the 15-minute frame looked 'good enough', and I'd like to get Vray GPU to produce that same image in what I hope would be significantly less time.
    My long-term hope was that we could buy a series of beefy GPUs to stick into a few towers instead of needing to buy a traditional render farm.

    Here's something I was working on recently: there was an issue with my usual render farm service (which supports Renderman) and I needed to render at home, so I switched everything from Renderman to Redshift.
    (I know, I know! Two totally different things AND a different beast than either Vray engine we're discussing but hear me out!)
    In Renderman I can get a great image in, say, 3 minutes, but with Redshift on my GPU I can get it done in 10 SECONDS. Not exactly the same, but 'good enough', especially given the tight turnaround time and limited resources at that point in the project.

    Alrighty... long post just to say that I wish Vray GPU could function more like Vray CPU!

    All that said, Vray in Houdini is GREAT; I just think this additional feature would make it a game changer!

  • #2
    Check out this article about biased vs. unbiased rendering: https://www.chaosgroup.com/blog/the-...ased-rendering
    V-Ray For Houdini | V-Ray Hydra Delegate | VRayScene
    andrei.izrantcev@chaos.com
    Support Request



    • #3
      You can indeed add biases to V-Ray GPU to make it (much!) quicker.
      It just doesn't come with those by default, but you can surely change the relevant settings (e.g. clamps on bounces, sample energies, output, sampling strategies... the list is fairly long) to match pretty much anything on the market.
      We still do a few things slightly differently than others (e.g. Redshift) in some aspects (e.g. SSS), so it may not be exactly identical in quality or speed where those features are used, but by and large you shouldn't feel constrained by the app.
      The question is whether you would be as lenient with the results as in the comparison above, or whether you'd pick at the differences between the two engines (as has historically been the case). ^^
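      For what it's worth, here's a minimal sketch of what scripting that kind of bias could look like in Houdini's Python shell, using the hou module. The ROP path and every parameter name below are illustrative assumptions, not confirmed V-Ray for Houdini parm names; look up the real ones on your own render node (e.g. via rop.parms()) before relying on this.
      Code:
      import hou

      # Hypothetical path to a V-Ray render node -- adjust to your scene.
      rop = hou.node("/out/vray_renderer1")

      # Assumed parameter names, shown only to illustrate the kinds of knobs
      # mentioned above (bounce clamps, sample energy, sampling ceilings).
      biased_settings = {
          "SettingsGI_depth": 3,                    # cap GI bounce depth
          "SettingsImageSampler_max_subdivs": 12,   # lower the AA sampling ceiling
          "SettingsColorMapping_clamp_level": 2.0,  # clamp overly bright samples
      }

      for name, value in biased_settings.items():
          parm = rop.parm(name)
          if parm is not None:
              parm.set(value)
          else:
              print("No parm named %s (the name was an assumption)" % name)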
      Last edited by ^Lele^; 21-08-2019, 08:35 AM.
      Lele
      Trouble Stirrer in RnD @ Chaos
      ----------------------
      emanuele.lecchi@chaos.com

      Disclaimer:
      The views and opinions expressed here are my own and do not represent those of Chaos Group, unless otherwise stated.



      • #4
        V-Ray GPU, just like V-Ray CPU, can and should indeed be used as a biased engine. Would it be possible for you to share such a scene with us, where V-Ray GPU takes so long, so that we can evaluate the cause? If it's in the settings, we will tell you what you need to tweak to get the performance you desire, and if it's a bug we will fix it.
        Alexander Soklev | Team Lead | V-Ray GPU
