NVIDIA TITAN X (2016) (Powered by Pascal)


  • NVIDIA TITAN X (2016) (Powered by Pascal)

    NVIDIA announces the new TITAN X (Pascal) as the best graphics card in the world, but at a crazy price...



    Release date: 2 August 2016
    http://www.geforce.com/hardware/10series/titan-x-pascal
    http://www.geforce.com/whats-new/art...ble-august-2nd


    Want to buy?

  • #2
    Meh... Overpriced; in Europe this will cost 1400-1500€... no thanks... and only 12GB, where's 16GB? The sweet spot IMHO will be the 1080 Ti, hopefully at a more reasonable price (800-900€ range).



    • #3
      Massive disappointment; apart from it not being 16GB, the rumors I read all listed over 6,000 cores.



      • #4
        I found this comparison on YouTube:

        TITAN X (Pascal) vs GTX TITAN X vs 1080
        [Image: Nvidia Titan X Pascal vs GeForce GTX Titan X vs GTX 1080 spec comparison]
        * Power for the TITAN X (Pascal) is 6+8 pin, not 6x8 pin.


        Other comparisons here:
        https://www.youtube.com/watch?v=uLjjUVXLqMM


        We need to wait for real benchmarks, but it looks more interesting to buy 2x 1080.
        I do not understand why NVIDIA wants to sell the TITAN X at such a high price.
        Last edited by Raph4; 22-07-2016, 11:01 PM.



        • #5
          In terms of cores for sure, and I really wish they had gone for 16GB.



          • #6
            Originally posted by Nicinus View Post
            In terms of cores for sure, and I really wish they had gone for 16GB.
            It's getting really old that every year they "announce" the best card ever. Then next year that card is as good as gone, and a new "best card" is announced.
            Dmitry Vinnik
            Silhouette Images Inc.
            ShowReel:
            https://www.youtube.com/watch?v=qxSJlvSwAhA
            https://www.linkedin.com/in/dmitry-v...-identity-name



            • #7
              Originally posted by Morbid Angel View Post
              It's getting really old that every year they "announce" the best card ever. Then next year that card is as good as gone, and a new "best card" is announced.
              But it's this iPhone-style "improve by small increments" approach that is especially frustrating. If it were a more noticeable advance, say 5-6,000 cores and 16GB, I would be fine; I would come back in a couple of years and buy again. Apparently that is not good enough for them: their revenue model requires upgrades from the user base at least every two years. Or every year, when I think about it.
              Last edited by Nicinus; 24-07-2016, 05:01 PM.



              • #8
                Originally posted by Morbid Angel View Post
                It's getting really old that every year they "announce" the best card ever. Then next year that card is as good as gone, and a new "best card" is announced.
                Well, I have had 7 Titan X cards from the Maxwell architecture since March 2015, and it is still one of the best cards as of today. It is pretty much the same speed as the 1080 (the 1080 is only ~3% faster), and the new Titan X will be only ~20% faster than the 1080, with 12GB of RAM.
                This means the new top-of-the-line GPU will be about 20-25% faster than my Titan X with the same amount of RAM, but mine is overclocked +25%, so it's essentially exactly the same.
                It comes out in August 2016, a year and a half after I acquired my Titan X.
                If the release after that one also arrives about 1.5 years later (around March 2018), then my Titan X will have been nearly as fast as the best available GPU, with the same RAM, for 3 years.
                If that next release is not a major push forward either, well, then my Titan X will still be good to use.
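
                A quick back-of-the-envelope check of those percentages (every factor below is an estimate from this thread, not a measured benchmark):

                Code:
                // Relative-speed check; all factors are forum estimates, not benchmarks.
                #include <cstdio>

                int main() {
                    const double maxwell_stock = 1.00;                  // Titan X (Maxwell), stock = baseline
                    const double gtx1080       = maxwell_stock * 1.03;  // "the 1080 is only 3% faster"
                    const double pascal_titan  = gtx1080 * 1.20;        // "~20% faster than the 1080"
                    const double maxwell_oc    = maxwell_stock * 1.25;  // "mine is OC +25%"

                    std::printf("Titan X (Pascal) vs stock Maxwell: %.2fx\n", pascal_titan); // ~1.24x
                    std::printf("Overclocked Maxwell Titan X:       %.2fx\n", maxwell_oc);   // 1.25x
                    return 0; // the two land within a percent or two of each other
                }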

                What computer can you buy today and say that in 3 years it will still be near the top of the line?

                I know there are a lot of press releases and fuss about all this, but when you look at the numbers, it's not such a big deal after all, and not such a bad investment.
                There will always be something better in 10 years, and you can always wait to jump on the wagon, but then we would still be on Max 2008 and V-Ray 1.47.
                Are you still on Max 2008 and V-Ray 1.47? No? Well then, you made the jump; good for you. Why do you do it with everything else but not with GPUs?
                Last edited by Sbrusse; 28-07-2016, 03:32 AM.
                Stan



                • #9
                  NVIDIA isn't making these cards for us; the performance over the last few generations has not improved very much for RT. Wasn't the 700 series as good as the 900 series?
                  WerT
                  www.dvstudios.com.au



                  • #10
                    Originally posted by werticus View Post
                    NVIDIA isn't making these cards for us; the performance over the last few generations has not improved very much for RT. Wasn't the 700 series as good as the 900 series?
                    That's not entirely true; the 1080 is almost twice as fast as my 980 card.

                    Best regards,
                    Vlado
                    I only act like I know everything, Rogers.



                    • #11
                      What is the story on SLI nowadays? Will I get a benefit if I have two 1080s, or will it cause issues? I assume I want it for VR and general viewport speed, but previously I remember some concern with RT?



                      • #12
                        Originally posted by Nicinus View Post
                        What is the story on SLI nowadays? Will I get a benefit if I have two 1080s, or will it cause issues? I assume I want it for VR and general viewport speed, but previously I remember some concern with RT?
                        V-Ray will be able to use any number of GPUs in the system quite efficiently; there's no need to set up SLI for this. I don't think it matters whether it's on or off anymore.

                        Best regards,
                        Vlado
                        I only act like I know everything, Rogers.
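
                        For reference, a minimal sketch of why SLI is beside the point here, assuming a CUDA-based renderer (illustrative only, not V-Ray's actual code): the CUDA runtime exposes every card as its own device, SLI or not.

                        Code:
                        // Enumerate every CUDA device in the machine. Each card shows up
                        // individually whether SLI is enabled or not, so a renderer can
                        // simply address them one by one.
                        #include <cstdio>
                        #include <cuda_runtime.h>

                        int main() {
                            int count = 0;
                            cudaGetDeviceCount(&count);
                            for (int dev = 0; dev < count; ++dev) {
                                cudaDeviceProp prop;
                                cudaGetDeviceProperties(&prop, dev);
                                std::printf("GPU %d: %s, %zu MB, %d SMs\n",
                                            dev, prop.name,
                                            prop.totalGlobalMem / (1024 * 1024),
                                            prop.multiProcessorCount);
                                // A renderer would cudaSetDevice(dev) here and give this
                                // card its own share of the work; no SLI involved.
                            }
                            return 0;
                        }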



                        • #13
                          Thanks, but it still makes a difference for viewport performance and things like VR, right? My understanding was that SLI makes the system see the two boards as basically one with double (or higher) performance? Or do I get any viewport performance out of the second board either way? All a bit confusing to me.



                          • #14
                            Originally posted by Nicinus View Post
                            Thanks, but it still makes a difference for viewport performance and things like VR, right? My understanding was that SLI makes the system see the two boards as basically one with double (or higher) performance? Or do I get any viewport performance out of the second board either way? All a bit confusing to me.
                            Hm, I don't know about that part to be honest. Perhaps someone else who has tried it will chime in.

                            Best regards,
                            Vlado
                            I only act like I know everything, Rogers.



                            • #15
                              I have tried it with 2- and 3-way SLI using 980 Ti boards. For me there was no difference in viewport performance. I did this more as a blind test to check whether having SLI enabled made Max less stable; the goal was to let clients switch between game engines and Max without changing any settings. From what I understood, to gain anything from SLI the application (in this case Nitrous) needs to be specifically written for it (to render alternate/split frames), and the NVIDIA driver needs to support it as well. I know there are other workloads that can be split between multiple cards, but as far as I know there is no support; I have a vague recollection that Autodesk even discourages having SLI enabled at all.
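
                              To illustrate the point about the application needing to be specifically written for it: without SLI, a program can still drive each card explicitly and hand it a slice of the frame. A toy CUDA sketch of that idea (purely illustrative; not how Nitrous or any shipping renderer works):

                              Code:
                              // Toy split-frame rendering: each GPU fills one horizontal strip.
                              // The application divides the work itself; the driver will not.
                              #include <cuda_runtime.h>

                              __global__ void renderStrip(float* strip, int width, int stripHeight,
                                                          int yOffset, int fullHeight) {
                                  int x = blockIdx.x * blockDim.x + threadIdx.x;
                                  int y = blockIdx.y * blockDim.y + threadIdx.y;
                                  if (x < width && y < stripHeight)
                                      strip[y * width + x] = float(yOffset + y) / float(fullHeight); // stand-in "shading"
                              }

                              int main() {
                                  const int width = 1920, height = 1080;
                                  int devices = 0;
                                  cudaGetDeviceCount(&devices);
                                  if (devices == 0) return 1;

                                  const int stripHeight = height / devices; // assume an even split, for brevity
                                  for (int dev = 0; dev < devices; ++dev) {
                                      cudaSetDevice(dev);                   // drive this card explicitly
                                      float* d_strip = nullptr;
                                      cudaMalloc(&d_strip, size_t(width) * stripHeight * sizeof(float));
                                      dim3 block(16, 16);
                                      dim3 grid((width + 15) / 16, (stripHeight + 15) / 16);
                                      renderStrip<<<grid, block>>>(d_strip, width, stripHeight,
                                                                   dev * stripHeight, height);
                                      cudaDeviceSynchronize();              // real code would copy the strip back here
                                      cudaFree(d_strip);
                                  }
                                  return 0;
                              }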

