12GB VRAM for $3,000

  • #16
    Originally posted by beestee View Post
    Same goes for dual GPU cards unfortunately (as savage309 points out), but this may be changing in both cases soon from what I gather.
    With current technologies there would be a huge speed penalty to do so.
    Faster interconnects (like NVLink) are coming in 2016.
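
    To see what that penalty is about at the API level, here is a minimal CUDA sketch (an illustration using the plain CUDA runtime API, not anything from V-Ray): one GPU cannot transparently use another GPU's memory; peer access must be checked and enabled explicitly, and even then the traffic crosses PCIe, which is far slower than local VRAM. NVLink is meant to narrow exactly that gap.

    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaGetDeviceCount(&count);
        if (count < 2) {
            printf("Need at least two GPUs to test peer access.\n");
            return 0;
        }
        int canAccess = 0;
        // Can device 0 read and write device 1's memory directly?
        cudaDeviceCanAccessPeer(&canAccess, 0, 1);
        printf("GPU 0 -> GPU 1 peer access: %s\n", canAccess ? "yes" : "no");
        if (canAccess) {
            cudaSetDevice(0);
            // Even when enabled, peer traffic goes over PCIe, not local VRAM.
            cudaDeviceEnablePeerAccess(1, 0);
        }
        return 0;
    }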
    V-Ray fan.
    Looking busy around GPUs ...
    RTX ON



    • #17
      Originally posted by beestee View Post
      Same goes for dual GPU cards unfortunately (as savage309 points out), but this may be changing in both cases soon from what I gather.

      So, please confirm for me... this card will only be able to load 6GB of a project onto the card, since it divides the 12GB between its 2 processors? If that is the case, the only benefit is the form factor? Wow! A harder sell.



      • #18
        Originally posted by abowen View Post
        So, please confirm for me... this card will only be able to load 6GB of a project onto the card, since it divides the 12GB between its 2 processors? If that is the case, the only benefit is the form factor? Wow! A harder sell.
        Yeah, that's correct.

        You basically gain compactness, so you can fit more of them in your workstation. Four PCIe slots can give you either 4 GPUs or 8 GPUs.
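
        A quick way to see the split for yourself: a dual-GPU board enumerates as two separate CUDA devices, each reporting only its own half of the advertised memory. A minimal sketch using the plain CUDA runtime API (an illustration, nothing V-Ray specific):

        #include <cstdio>
        #include <cuda_runtime.h>

        int main() {
            int count = 0;
            cudaGetDeviceCount(&count);
            for (int i = 0; i < count; ++i) {
                cudaDeviceProp prop;
                cudaGetDeviceProperties(&prop, i);
                // A "12GB" dual-GPU card prints two devices with roughly 6GB each.
                printf("Device %d: %s, %.1f GB\n", i, prop.name,
                       prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
            }
            return 0;
        }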
        CGI - Freelancer - Available for work

        www.dariuszmakowski.com - come and look



        • #19
          Originally posted by abowen View Post
          So, please confirm for me... this card will only be able to load 6GB of a project onto the card, since it divides the 12GB between its 2 processors? If that is the case, the only benefit is the form factor? Wow! A harder sell.
          That's correct.

          Edit: late response.
          V-Ray fan.
          Looking busy around GPUs ...
          RTX ON



          • #20
            Originally posted by RockinAkin View Post
            Yes! Video please!
            Here is the recording of my presentation: http://nvidia.fullviewmedia.com/gtc2014/S4779.html

            Though the video that I show with the NVIDIA cluster is really just playing around; we should do a more impressive one...

            Best regards,
            Vlado
            I only act like I know everything, Rogers.



            • #21
              Originally posted by vlado View Post
              Here is the recording of my presentation: http://nvidia.fullviewmedia.com/gtc2014/S4779.html

              Though the video that I show with the NVIDIA cluster is really just playing around; we should do a more impressive one...

              Best regards,
              Vlado
              Thanks Vlado! Very interesting stuff.

              I quite like the cluster implementation, especially the option to dynamically add DBR nodes. Do you think that thing can be ported in any way to Maya? Or be controlled from the command line?
              CGI - Freelancer - Available for work

              www.dariuszmakowski.com - come and look



              • #22
                Daaaaamn, that's fast!
                James Burrell www.objektiv-j.com
                Visit my Patreon patreon.com/JamesBurrell



                • #23
                  Originally posted by Dariusz Makowski (Dadal) View Post
                  Do you think that thing can be ported in any way to Maya?
                  Yes, the 3.0 SDK does allow render nodes to be added (or removed) on the fly; we just need to figure out a good UI for this.

                  Best regards,
                  Vlado
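
                  As an illustration of the feature described above, here is a purely hypothetical sketch: the DbrSession type and its methods are invented for this example, since the actual 3.0 SDK calls are not shown in this thread. It only captures the shape of the workflow, hosts joining and leaving a running DBR job without a restart.

                  #include <cstdio>
                  #include <set>
                  #include <string>

                  // Hypothetical stand-in for a live DBR session; the real
                  // SDK object is not named in this thread.
                  struct DbrSession {
                      std::set<std::string> hosts;
                      void addHost(const std::string& h) {
                          hosts.insert(h);
                          printf("+ %s joins mid-render\n", h.c_str());
                      }
                      void removeHost(const std::string& h) {
                          hosts.erase(h);
                          printf("- %s leaves, its buckets are reassigned\n", h.c_str());
                      }
                  };

                  int main() {
                      DbrSession render;
                      render.addHost("192.168.0.10:20207");    // 20207 is the usual V-Ray DR port
                      render.addHost("192.168.0.11:20207");
                      render.removeHost("192.168.0.10:20207"); // drop a node without restarting the render
                      return 0;
                  }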
                  I only act like I know everything, Rogers.



                  • #24
                    Originally posted by vlado View Post
                    Yes, the 3.0 SDK does allow render nodes to be added (or removed) on the fly; we just need to figure out a good UI for this.

                    Best regards,
                    Vlado
                    Wow, that's awesome. It would be great if it could be a standalone UI though... then again, I think it might be hard given different jobs, deadlines and so on... still, nice to know it's coming.
                    CGI - Freelancer - Available for work

                    www.dariuszmakowski.com - come and look



                    • #25
                      I'd like to run the V-Ray cloud on this oil-cooled NVIDIA cluster...

                      http://blogs.nvidia.com/wp-content/u.../kfc-nodes.jpg



                      • #26
                        So now just wait a year and get it for $1,500.
                        Luke Szeflinski
                        :: www.lukx.com cgi



                        • #27
                          It's always a damn car though. Cars are easy peasy.
                          Kind Regards,
                          Richard Birket
                          ----------------------------------->
                          http://www.blinkimage.com

                          ----------------------------------->



                          • #28
                            Originally posted by tricky View Post
                            It's always a damn car though. Cars are easy peasy.
                            +1
                            Yes, I've yet to get RT to work with my type of scenes.
                            Kind Regards,
                            Morne



                            • #29
                              Originally posted by tricky View Post
                              It's always a damn car though. Cars are easy peasy.
                              +1
                              Every time I try RT GPU on an arch interior, something is always not supported or broken.
                              "I have not failed. I've just found 10,000 ways that won't work."
                              Thomas A. Edison



                              • #30
                                Originally posted by eyepiz View Post
                                Every time I try RT GPU on an arch interior something is always not supported or broken.
                                If you want to use the GPU successfully, it is best to build your scenes with it from the start. If you just take an existing scene made for regular V-Ray and throw it at the GPU, you will almost always come across some little option somewhere that's not supported or works slightly differently.

                                Best regards,
                                Vlado
                                I only act like I know everything, Rogers.

