1x gpu for vantage + 1x gpu for 3ds max viewport

  • 1x gpu for vantage + 1x gpu for 3ds max viewport

    Hey there!

    Amazing software! I am using it for my renders now and am happy to sacrifice a few unsupported V-Ray features plus some quality.

    My biggest issue is that I have to run Vantage at something like 500 x XXX pixels (super low) in order to keep the 3ds Max viewport running at a functional, smooth rate.

    Is it possible to have one GPU utilized by Vantage while another GPU is utilized by the 3ds Max viewport at the same time?

    I love working in Live Link... it's a super fast development process and a game changer for me!

    Here is a render that I have done inside Vantage.

    Thanks!


  • #2
    Glad you're liking Vantage!
    To separate your GPU usage, you can try the following - there is an NVIDIA environment variable called cuda_visible_devices. Setting this to 0 will use your first GPU, setting it to 1 will use your second GPU, while setting it to 0,1 would use both GPUs. To use it, you need to launch your application in a shell that has this variable set. To experiment, use the CMD window, set cuda_visible_devices=0 and then run 3dsmax.exe. Now do the same for Vantage from another CMD window, with set cuda_visible_devices=1. Both applications should now just see 1 GPU each. If it's working well for you, you can make the variable setting part of your shortcut properties.
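
    For example, here is a minimal sketch of the two console sessions (the install paths are assumptions based on default installs, so adjust them to your machine and versions):

    rem CMD window 1: 3ds Max sees only the first GPU
    set cuda_visible_devices=0
    "C:\Program Files\Autodesk\3ds Max 2021\3dsmax.exe"

    rem CMD window 2: Vantage sees only the second GPU
    set cuda_visible_devices=1
    "C:\Program Files\Chaos Group\Vantage\vantage.exe"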

    - Phil

    VP Product Management, Chaos Group

    • #3
      Just keep in mind that you have to start Vantage like this before starting Live Link. Otherwise the Live Link script in 3ds Max will try to start a new instance of vantage.exe without this customization.
      Nikola Goranov
      Chaos Developer

      • #4
        Originally posted by Phillip Miller View Post
        Glad you're liking Vantage!
        To separate your GPU usage, you can try the following - there is an NVIDIA environment variable called cuda_visible_devices. Setting this to 0 will use your first GPU, setting it to 1 will use your second GPU, while setting it to 0,1 would use both GPUs. To use it, you need to launch your application in a shell that has this variable set. To experiment, use the CMD window, set cuda_visible_devices=0 and then run 3dsmax.exe. Now do the same for Vantage from another CMD window, with set cuda_visible_devices=1. Both applications should now just see 1 GPU each. If it's working well for you, you can make the variable setting part of your shortcut properties.

        - Phil
        Thanks for sharing!

        Could you please give me a bit more detail on how to run "set cuda_visible_devices=0" in CMD? I can't get it to register as a valid variable/command.

        I have googled how to access NVIDIA from CMD, but I am just not that skilled with C or programming in general.

        Would you be so kind as to give me a quick write-up of the commands needed to run "set cuda_visible_devices=0"?

        Cheers!

        • #5
          Originally posted by Phillip Miller View Post
          Glad you're liking Vantage!
          To separate your GPU usage, you can try the following - there is an NVIDIA environment variable called cuda_visible_devices. Setting this to 0 will use your first GPU, setting it to 1 will use your second GPU, while setting it to 0,1 would use both GPUs. To use it, you need to launch your application in a shell that has this variable set. To experiment, use the CMD window, set cuda_visible_devices=0 and then run 3dsmax.exe. Now do the same for Vantage from another CMD window, with set cuda_visible_devices=1. Both applications should now just see 1 GPU each. If it's working well for you, you can make the variable setting part of your shortcut properties.

          - Phil
          Do I need to set these up as environment variables first?

          • #6
            Processes inherit environment variables from their parent process. So Phil is suggesting you create a CMD process, set the variable there (so it applies to this process) and then start 3dsmax.exe and vantage.exe respectively from separate consoles. Each will inherit the variables of the respective console. Setting environment variables in a console is done literally with "set variable_name=value" without the quotes. You can check a variable with "set variable_name".
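
            As a minimal sketch of one such console session (the 3ds Max path here is just an example; use your actual install path):

            rem Set the variable for this console only
            set cuda_visible_devices=0
            rem Print the value this console will pass on to child processes
            set cuda_visible_devices
            rem Launch 3ds Max from the same console so it inherits the variable
            "C:\Program Files\Autodesk\3ds Max 2021\3dsmax.exe"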

            I haven't tried Phil's method, though. It uses a CUDA variable and Vantage is not a CUDA application. Maybe it works. But if it doesn't, you can instead use "vantage_console.exe -deviceList" and then "vantage_console.exe -device=index", where index is 0, 1, or whatever the index of the GPU you want to use is.
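
            A rough sketch of that fallback, assuming vantage_console.exe sits next to vantage.exe in the install folder:

            rem List the GPUs Vantage can see, together with their indices
            "C:\Program Files\Chaos Group\Vantage\vantage_console.exe" -deviceList
            rem Then run it with only the GPU at the index you want (1 here is just an example)
            "C:\Program Files\Chaos Group\Vantage\vantage_console.exe" -device=1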
            Nikola Goranov
            Chaos Developer

            • #7
              Originally posted by npg View Post
              Processes inherit environment variables from their parent process. So Phil is suggesting you create a CMD process, set the variable there (so it applies to this process) and then start 3dsmax.exe and vantage.exe respectively from separate consoles. Each will inherit the variables of the respective console. Setting environment variables in a console is done literally with "set variable_name=value" without the quotes. You can check a variable with "set variable_name".

              I haven't tried Phil's method though. It uses a CUDA variable and Vantage is not a CUDA application. Maybe it works. But if it doesn't you can instead use "vantage_console.exe -deviceList" and then "vantage_console.exe -device=index" where index is 0 or 1 or whatever the index of the GPU you want to use.
              Hey Nikola!

              Thanks for the response.

              I made a batch file for Vantage with this in it:

              set cuda_visible_devices=0
              "C:\Program Files\Chaos Group\Vantage\vantage.exe"
              PAUSE

              It works!
              I switched it to device 1 (the GTX 1030) and it crashed, so I know the variable is taking effect, since I bet that 2 GB of VRAM just isn't enough.

              However, I used this for 3ds Max:

              set cuda_visible_devices=1
              "C:\Program Files\Autodesk\3ds Max 2021\3dsmax.exe"
              PAUSE

              3ds Max is not using the 1030. However, the 1030 does show up as the selected device under V-Ray's "render GPU devices" inside 3ds Max... I wonder what is going on there? Maybe since it is a 2 GB VRAM GPU, the 3090 is being forced to help?

              Do you think I could add anything? Or am I doing something wrong by running separate batch files?

              Cheers,
              Cam

              • #8
                Your batch files look fine. Again, since this is a CUDA variable, I'm not sure what it does and doesn't affect. Vantage is a DirectX application and apparently the variable still affects it. V-Ray GPU uses CUDA, so it is obviously affected too. But I don't know about 3ds Max itself... Someone more familiar with the matter should chime in.
                Nikola Goranov
                Chaos Developer
