NVIDIA Kepler

  • NVIDIA Kepler

    Seems that the upcoming NVIDIA Kepler cards will be the way to go regarding V-Ray RT rendering.

    NVIDIA Kepler GK104 gets 1536 CUDA cores
    http://www.nordichardware.com/news/7...uda-cores.html

    If I remember correctly, that is three times the number of CUDA cores in the current GTX 580.
    Last edited by brainspoon; 14-03-2012, 04:08 AM.
    http://www.andreas-reimer.de
    http://www.renderpal.com
    my HDRI and texture collection
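
    The 3x figure checks out against the public specs. A trivial sanity check (the GTX 580 core count is from its published GF110 specs; the GK104 count is from the linked article):

    ```python
    # Core counts: GTX 580 (GF110) from its published specs; GK104 from the article.
    GTX_580_CORES = 512
    GK104_CORES = 1536

    print(f"GK104 has {GK104_CORES / GTX_580_CORES:.0f}x the CUDA cores of a GTX 580")  # 3x
    ```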

  • #2
    Can't wait

    Best regards,
    Vlado
    I only act like I know everything, Rogers.



    • #3
      Originally posted by brainspoon View Post
      If I remember correctly it is three times the amount of current GTX 580 cards.
      Yup, throw two of these puppies in and you have the equivalent of six GTX-580s! ...And the GPU clock freq starts at 950MHz!

      The future looks better all the time...

      -Alan



      • #4
        I hope they bring out a version of it with half the cores that eats half the power.
        :: twitter :: Portfolio :: My 3D Products :: ...and ::



        • #5
          I think this is also supposed to be the weakest of the Kepler cards... the bigger models are coming out later in the year.

          Best regards,
          Vlado
          I only act like I know everything, Rogers.



          • #6
            I guess now it's a good time to start whining about render-pass support for the GPU! =D
            CGI - Freelancer - Available for work

            www.dariuszmakowski.com - come and look



            • #7
              Originally posted by vlado View Post
              I think this is also supposed to be the weakest of the Kepler cards... the bigger models are coming out later in the year.
              Yeah, they're saying it will be the entry-level card... sheesh!

              -Alan



              • #8
                They are already planning the dual-GPU card
                http://www.andreas-reimer.de
                http://www.renderpal.com
                my HDRI and texture collection



                • #9
                  If they give us a 1 GB model and ask £6k for the 6 GB model, I'll go buy a shotgun and go after the NVIDIA HQ! >.<
                  CGI - Freelancer - Available for work

                  www.dariuszmakowski.com - come and look



                  • #10
                    Yes, the problem, again, is the RAM. This is a gaming card, so 2 or 3 GB is enough to play Crysis or UT.
                    With 2 GB of RAM we can only render nice, but not heavy, scenes. In practice, only modern interiors and design objects.
                    It's OK if your job is "only" modern catalogue work (and I have a friend who, in two weeks, with a different RT engine, can create 20 or 30 noiseless 5000px images with 6 or 8 GTX 580s).
                    IMO V-Ray RT GPU is a little behind the others. And I've tested here that, at the same noise level, V-Ray RT GPU is about half as fast.
                    I don't know if that's the difference between CUDA and OpenCL, but the numbers can't be wrong =)
                    Anyway, coming back to the new GPU: yes, without more RAM it is not really useful for big interior or exterior projects.
                    We have our library with all the shaders usually at 4 or 8k, all our models with high polycounts, etc...
                    If I try to imagine working on the GPU, I think we would need to reduce all the textures, the model structure, etc.,
                    and fight with the GPU to see if I have enough RAM... mmm... and 6 GB, IMO, is not enough! With 12 GB we can start to think in professional mode.
                    And when you think that a new desktop can have inexpensive 32 GB of RAM (4 GB x 8), while only 2 GB is available on the GPU, I smile!
                    Where do we go with 2 or 4 GB, ehehe )))
                    Last edited by cecofuli; 20-03-2012, 08:07 AM.
                    www.francescolegrenzi.com

                    VRay - THE COMPLETE GUIDE - The book
                    Corona - THE COMPLETE GUIDE - The book


                    --- FACEBOOK ---
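
                    The RAM math above is easy to sanity-check. A rough, illustrative sketch of how quickly uncompressed 4k/8k textures eat VRAM (the texture counts and the 8-bit RGBA format are made-up assumptions, not measurements):

                    ```python
                    def texture_vram_bytes(width, height, channels=4, bytes_per_channel=1, mipmaps=True):
                        """Rough uncompressed VRAM footprint of one texture.
                        A full mip chain adds about one third on top of the base level."""
                        base = width * height * channels * bytes_per_channel
                        return base * 4 // 3 if mipmaps else base

                    # Illustrative scene: 40 textures at 4k plus 10 at 8k, 8-bit RGBA.
                    total = 40 * texture_vram_bytes(4096, 4096) + 10 * texture_vram_bytes(8192, 8192)
                    print(f"{total / 2**30:.1f} GiB in textures alone")  # ~6.7 GiB, far past a 2 GB card
                    ```

                    Geometry, acceleration structures, and frame buffers come on top of that, which is why even 6 GB feels tight for this kind of work.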



                    • #11
                      Yeah, it will be years before they support that, if we look at how they scale things in favor of cores rather than RAM.
                      I guess tiled EXRs would be a solution.
                      Dmitry Vinnik
                      Silhouette Images Inc.
                      ShowReel:
                      https://www.youtube.com/watch?v=qxSJlvSwAhA
                      https://www.linkedin.com/in/dmitry-v...-identity-name
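
                      Tiled EXRs would help because only the tiles a ray actually touches need to be resident. A toy sketch of the saving (tile size, pixel format, and access region are illustrative assumptions; no real EXR I/O):

                      ```python
                      import math

                      TILE = 64   # tile edge in pixels (a common EXR tile size)
                      BPP = 4 * 2 # RGBA half-float: 4 channels x 2 bytes

                      def tiles_needed(view_w, view_h):
                          """Number of tiles touched when only a view_w x view_h region is sampled."""
                          return math.ceil(view_w / TILE) * math.ceil(view_h / TILE)

                      # Whole 8k map resident vs. only a 512x512 region that rays actually hit:
                      full = tiles_needed(8192, 8192) * TILE * TILE * BPP
                      partial = tiles_needed(512, 512) * TILE * TILE * BPP
                      print(f"{full / 2**20:.0f} MiB vs {partial / 2**20:.0f} MiB resident")  # 512 MiB vs 2 MiB
                      ```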



                      • #12
                        Originally posted by cecofuli View Post
                        IMO V-Ray RT GPU is a little behind the others. And I've tested here that, at the same noise level, V-Ray RT GPU is about half as fast.
                        Can you give me specific examples? My own tests show otherwise.

                        Best regards,
                        Vlado
                        I only act like I know everything, Rogers.



                        • #13
                          Vlado - side note: how does GPU RAM actually work? Is it really that hard to expand it to 20 GB, or is it just NVIDIA making loads of $$ by doing it the slow way?
                          CGI - Freelancer - Available for work

                          www.dariuszmakowski.com - come and look



                          • #14
                            Originally posted by DADAL View Post
                            Vlado - side note: how does GPU RAM actually work? Is it really that hard to expand it to 20 GB, or is it just NVIDIA making loads of $$ by doing it the slow way?
                            I don't know, really. There might be technical difficulties in putting more RAM on the board.

                            Best regards,
                            Vlado
                            I only act like I know everything, Rogers.



                            • #15
                              I think the main issues are:

                              a) GPU RAM necessarily has to be very fast, as it has to feed all those cores, and is hence more expensive than standard DDR DIMMs.

                              b) Games don't need it yet, and professional uses do; this gives NVIDIA/AMD a nice easy way to differentiate their pro cards and milk people of their cash.

                              As to why even the pro cards currently max out at 6 GB or so... well, maybe there are still technical issues running 24 GB of RAM at 5 GHz, or maybe it's really only us who would use so much (I find that hard to believe, though). Not sure, TBH.
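
                              The "24 GB at 5 GHz" point is essentially a bandwidth constraint: GDDR5 throughput is the effective data rate times the bus width. A back-of-the-envelope sketch (the 6 GHz / 256-bit figures are GTX 680-class assumptions):

                              ```python
                              def bandwidth_gb_s(effective_clock_ghz, bus_width_bits):
                                  """Peak memory bandwidth: effective data rate (GT/s) x bus width in bytes."""
                                  return effective_clock_ghz * bus_width_bits / 8

                              # GTX 680-class assumption: 256-bit bus at a 6 GHz effective GDDR5 rate.
                              print(bandwidth_gb_s(6.0, 256), "GB/s")  # 192.0 GB/s
                              ```

                              Reaching 24 GB would mean denser chips or more loads per memory channel at those signal rates, which is plausibly where the technical difficulty lies.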

