RT GPU questions

  • RT GPU questions

    Hi all

    Since I first saw the power of RT back in 2007, I've never really had much success using it. Now, however, seems to be a time when it's maturing nicely and a good moment to get into it (possibly even switching my entire workflow over to GPU rendering instead of the traditional Advanced renderer on the CPU). I do various things for various industries, but a very large part of my work is archviz, and I'm also getting into automotive.

    So here goes:

    1
    1A) Can I mix and match different NVIDIA cards in a single PC and use all of them together for a single render?
    1B) How much of a factor is GPU RAM here, across different cards? Does it pool the total RAM of all the cards for a frame, or is it limited by one card's RAM?
    1C) For example, if the traditional Advanced CPU renderer used about 24 GB of RAM for a scene, how will I render that scene on the GPU (with a single card, or say 2 or 3 cards in one PC)?

    2
    2A) What is better: a bunch of low-end machines each with a single high-end GPU (like a GTX 980), or something like 3x GTX 980 in my workstation?
    2B) In terms of building a small farm, what is currently more cost-effective: traditional non-RT DR machines, or a bunch of RT GPUs, either spread across several single-GPU machines or 3 GPUs in one machine?

    3
    Budget is always an issue. With that in mind, what is a good cost/performance GPU today?
    1) GTX 980 or 980 Ti?
    2) Titan X or Titan Z? (They're a bit expensive locally; they cost as much as an entire workstation.)
    3) Quadro K5000? Yeah right, way too costly.
    4) Fill in your suggestion here, please.

    Any other comments welcome!
    Last edited by Morne; 15-01-2016, 03:33 AM.
    Kind Regards,
    Morne

  • #2
    OK, here goes:

    I was in a similar position to you recently: bought a Titan X and thought "now is the moment!"

    But I find the limitations still quite limiting (I usually hit one within a few hours of starting something); we are all accustomed to our nice feature list by now!

    The GPU RAM limit is a pig (depending on your scene complexity, of course). 12 GB is quite generous on the Titan X, but be aware that things like proxies and displacement are loaded *in full* into GPU RAM before rendering, so certain scenes need lots *more* GPU RAM than they would CPU RAM when using Advanced on the CPU.
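    As a rough back-of-envelope for why displacement balloons like that (my own hypothetical sketch, not V-Ray's actual allocator: I'm assuming each subdivision level splits a triangle into four and each resulting vertex costs about 36 bytes for position, normal, and UVs, with no vertex sharing):

    ```python
    # Hypothetical back-of-envelope for displaced-geometry memory. NOT V-Ray's
    # actual allocator: assumes each subdivision level splits a triangle into 4
    # and each vertex costs ~36 bytes (float3 position + normal + UV), unshared.

    def displaced_mem_gb(base_tris, subdiv_levels, bytes_per_vertex=36):
        tris = base_tris * 4 ** subdiv_levels        # 4x triangles per level
        verts = tris * 3                             # worst case: no sharing
        return verts * bytes_per_vertex / 1024 ** 3  # bytes -> GB

    # A 1M-triangle mesh with 4 levels of displacement subdivision:
    print(round(displaced_mem_gb(1_000_000, 4), 1))  # tens of GB, not hundreds of MB
    ```

    Even with generous assumptions the numbers get out of hand quickly, which is why a scene that fits comfortably in 24 GB of CPU RAM can refuse to fit on a 12 GB card.
    
    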

    Now to your questions:

    1a) Yes, as long as they are "modern" ones.
    1b) Get as much GPU RAM as you can. The card with the least RAM sets your RAM limit for the scene, unless you remove that card from the render.

    1c) You won't render it without serious optimising.
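    To make 1b concrete, here's a tiny illustrative sketch (mine, not V-Ray code): each render GPU holds a full copy of the scene, so the scene has to fit on every card you leave enabled, and the smallest one sets the budget.

    ```python
    # Illustrative only (not V-Ray code): every enabled render GPU holds a
    # complete copy of the scene, so the effective budget is the smallest card.

    def scene_budget_gb(enabled_cards_gb):
        """enabled_cards_gb: VRAM in GB of each card left in the render list."""
        return min(enabled_cards_gb)

    # Mixed rig: Titan X (12 GB) + 980 Ti (6 GB) + an older 4 GB card
    print(scene_budget_gb([12, 6, 4]))  # limited by the 4 GB card
    print(scene_budget_gb([12, 6]))     # drop the small card and the budget rises
    ```

    That is why removing the weakest card from the device list can let a bigger scene render, even though it costs you that card's compute.
    
    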

    2a) Lots of GPUs in few machines is best.

    2b) Few machines with lots of GPUs is the most cost-effective.

    3a) The 980 Ti is, I believe, the best bet at the moment, but somebody else may know better. The Titan X is great, but costly.
    3b) Titan X, unless you are sure your scene will fit in 6 GB (the Z has two GPUs with 6 GB each).
    3c) Nope!

    Suggestion: get a couple of cards and give it a go. Don't expect it to replace Advanced for your day-to-day work unless your scene and complexity requirements are modest.
    Suggestion: don't try to render old scenes; they never work. Design the scene from scratch using RT. You can always "upgrade" to full V-Ray if you need to.


    • #3
      I completely agree with super gnu.

      I would only suggest not running your display from a rendering GPU; it will lag like hell while rendering. I made the mistake of sticking 4 Titans in my PC, and while that works for small test stuff, when you load a fairly complex scene onto the GPUs your PC becomes very unresponsive, practically unusable. You have to disable the display GPU in the render device list if you want to do anything else, like open a web browser.

      Like super gnu suggested, design the scene from scratch, and take a look at any props before loading them into the scene. Having to take apart a scene just to find out that a weird shader from a TurboSquid prop was crashing it is a time killer; GPU is not as forgiving as the Advanced CPU renderer.
      "I have not failed. I've just found 10,000 ways that won't work."
      Thomas A. Edison
