Nvidia 1080 Ti


  • #46
    Originally posted by glorybound View Post
    That NVLink sounds like something to wait for. I wonder if it'll be backwards compatible or if it'll need a new kind of card.
    Besides the software support (which we already have in V-Ray GPU), it needs additional hardware, which at the moment is available only in the Quadro GP100.
    V-Ray fan.
    Looking busy around GPUs ...
    RTX ON

    Comment


    • #47
      Asus released three different 1080 Ti cards:
      GTX1080TI-FE : https://www.asus.com/Graphics-Cards/GTX1080TI-FE/
      TURBO-GTX1080TI-11G : https://www.asus.com/Graphics-Cards/...GTX1080TI-11G/
      ROG-STRIX-GTX1080TI-O11G-GAMING : https://www.asus.com/Graphics-Cards/...I-O11G-GAMING/

      Comparing them, there is apparently no real hardware difference:
      1x fan for the Founders Edition;
      1x fan for the TURBO edition too, but it's a dual-ball-bearing fan (less noise?);
      and finally, 3x fans for the ROG GAMING edition, which is taller and wider than the others (2.5-slot width).

      But no factory-overclocked model?
      Last edited by Raph4; 14-03-2017, 12:38 AM.

      Comment


      • #48
        OC'ing should accompany the 1080 Ti... I've read about tests run at 2 GHz in order to get the 30-35% advantage over the 1080. I don't know if I would spend the money if I couldn't OC them.
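        A quick sanity check on that 30-35% figure, using the published CUDA core counts and reference boost clocks (a rough FP32-throughput estimate with the standard 2-ops-per-core-per-cycle formula, not a benchmark):

```python
# Rough FP32 throughput: 2 ops per CUDA core per clock cycle.
def fp32_gflops(cores, clock_mhz):
    return 2 * cores * clock_mhz / 1000.0

gtx_1080 = fp32_gflops(2560, 1733)    # GTX 1080: 2560 cores, 1733 MHz reference boost
ti_stock = fp32_gflops(3584, 1582)    # 1080 Ti: 3584 cores, 1582 MHz reference boost
ti_at_2ghz = fp32_gflops(3584, 2000)  # 1080 Ti overclocked to 2 GHz

print(f"stock 1080 Ti vs 1080: +{(ti_stock / gtx_1080 - 1) * 100:.0f}%")    # +28%
print(f"2 GHz 1080 Ti vs 1080: +{(ti_at_2ghz / gtx_1080 - 1) * 100:.0f}%")  # +62%
```

        On paper the stock card is already ~28% ahead, so if real renders need 2 GHz to reach 30-35%, they are scaling well below the theoretical throughput gain.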

        Comment


        • #49
          Originally posted by biochemical_animations View Post
          OC'ing should accompany the 1080 Ti... I've read about tests run at 2 GHz in order to get the 30-35% advantage over the 1080. I don't know if I would spend the money if I couldn't OC them.
          Overclocking just introduces more problems for graphics cards, so it's better to stay at stock; especially when you have issues with the card and need to send it back, they can deny fixing it.
          Architectural and Product Visualization at MITVIZ
          http://www.mitviz.com/
          http://mitviz.blogspot.com/
          http://www.flickr.com/photos/shawnmitford/

          i7 5960X @ 4 GHz, 64 GB RAM, GeForce GTX 970, 2x GeForce RTX 2080 Ti

          Comment


          • #50
            Originally posted by mitviz View Post
            Overclocking just introduces more problems for graphics cards, so it's better to stay at stock; especially when you have issues with the card and need to send it back, they can deny fixing it.
            Depends on the brand. Usually EVGA will be glad to take an OC-damaged card; I think it's even in their RMA terms that you can send back OC-damaged hardware.

            OCing a GPU for rendering is a lot different from OCing a GPU for gaming. If you OC for rendering you don't have too much to worry about, at least last I checked. You can also hit a higher OC than for games if you only render on it. I might be wrong, but last time I tested I could get a lot more out of my GPU when I used it only for rendering and not for games.
            CGI - Freelancer - Available for work

            www.dariuszmakowski.com - come and look

            Comment


            • #51
              Originally posted by Dariusz Makowski (Dadal) View Post
              Depends on the brand. Usually EVGA will be glad to take an OC-damaged card; I think it's even in their RMA terms that you can send back OC-damaged hardware.

              OCing a GPU for rendering is a lot different from OCing a GPU for gaming. If you OC for rendering you don't have too much to worry about, at least last I checked. You can also hit a higher OC than for games if you only render on it. I might be wrong, but last time I tested I could get a lot more out of my GPU when I used it only for rendering and not for games.
              Actually, there I think you're wrong; rendering, I think, pushes GPUs to the limit, while games not so much. I've tried the most punishing game benchmarks at the highest settings, and GPU temps hardly move and clock speeds are usually fine. When rendering, however, temps go up higher, and so do the clock speeds. If you're planning to overclock, you'd better have good cooling. At stock you can render with no issues as long as your GPUs aren't close together, but once you plan to overclock you'd better invest in cooling, which costs.

              Comment


              • #52
                Originally posted by Raph4 View Post
                Asus released three different 1080 Ti cards:
                GTX1080TI-FE : https://www.asus.com/Graphics-Cards/GTX1080TI-FE/
                TURBO-GTX1080TI-11G : https://www.asus.com/Graphics-Cards/...GTX1080TI-11G/
                ROG-STRIX-GTX1080TI-O11G-GAMING : https://www.asus.com/Graphics-Cards/...I-O11G-GAMING/

                Comparing them, there is apparently no real hardware difference:
                1x fan for the Founders Edition;
                1x fan for the TURBO edition too, but it's a dual-ball-bearing fan (less noise?);
                and finally, 3x fans for the ROG GAMING edition, which is taller and wider than the others (2.5-slot width).

                But no factory-overclocked model?

                A little more info on all the Asus 1080 Ti cards.

                ASUS GTX 1080 Ti Founders Edition & ASUS TURBO GTX 1080 Ti & ASUS ROG STRIX GTX 1080 Ti GAMING
                https://www.techpowerup.com/gpudb/28...ce-gtx-1080-ti
                https://www.techpowerup.com/gpudb/b4...bo-gtx-1080-ti
                https://www.techpowerup.com/gpudb/b4...1080-ti-gaming

                GPU Base Clock: 1480 MHz
                GPU Boost Clock: 1582 MHz
                Memory Clock: 1376 MHz

                Render Config
                Pixel Rate: 130.2 GPixel/s
                Texture Rate: 332 GTexel/s
                Floating-point performance: 10,609 GFLOPS

                --

                ASUS ROG STRIX GTX 1080 Ti GAMING OC
                https://www.techpowerup.com/gpudb/b4...0-ti-gaming-oc
                GPU Base Clock: 1594 MHz (+7.7%)
                GPU Boost Clock: 1708 MHz (+8.0%)
                Memory Clock: 1388 MHz (+0.9%)

                Render Config
                Pixel Rate: 138.1 GPixel/s
                Texture Rate: 351 GTexel/s
                Floating-point performance: 11,247 GFLOPS

                --

                One 1080 Ti OC'd to 2.5 GHz beat the Titan X (Pascal) world record in 3DMark.

                GeForce GTX 1080 Ti overclocked to 2.5GHz sets world record
                http://www.pcgamer.com/geforce-gtx-1...-world-record/
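                As a side note, the FP32 figures above drop out of the 1080 Ti's 3584 CUDA cores at 2 FMA ops per core per cycle; a quick sketch (core count and formula are the standard Pascal specs, assumed here rather than taken from those pages):

```python
CORES = 3584  # CUDA cores on the GTX 1080 Ti (GP102)

# FP32 throughput in GFLOPS: 2 ops per core per cycle at the given clock (MHz).
def fp32_gflops(clock_mhz):
    return 2 * CORES * clock_mhz / 1000.0

print(f"{fp32_gflops(1480):.0f} GFLOPS at the 1480 MHz base clock")  # ~10609, matching the stock listing
print(f"OC base-clock delta: {(1594 / 1480 - 1) * 100:.1f}%")        # 7.7%, matching the listed figure
```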
                Last edited by Raph4; 16-03-2017, 05:41 AM.

                Comment


                • #53
                  Originally posted by mitviz View Post
                  Actually, there I think you're wrong; rendering, I think, pushes GPUs to the limit, while games not so much. I've tried the most punishing game benchmarks at the highest settings, and GPU temps hardly move and clock speeds are usually fine. When rendering, however, temps go up higher, and so do the clock speeds. If you're planning to overclock, you'd better have good cooling. At stock you can render with no issues as long as your GPUs aren't close together, but once you plan to overclock you'd better invest in cooling, which costs.
                  MIT, have you tried one of those Amfeltec splitters/clusters? I'm wondering if they would work on a regular motherboard and a 28-lane CPU. My thinking: leave a card in the motherboard for the display running at x8, and use a splitter/cluster in another slot adding 4 cards running at x4. I've read about guys using these on high-end motherboards, but I'm wondering if the high-end board is necessary just for a 5-6 card workstation.
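                  For what it's worth, the lane budget in that layout fits a 28-lane CPU; a quick tally (the slot widths are the ones proposed above, not measured values):

```python
CPU_LANES = 28        # e.g. a 28-lane Haswell-E/Broadwell-E CPU
display_card = 8      # one display card in a motherboard slot at x8
render_cards = 4 * 4  # four render cards behind the splitter at x4 each

used = display_card + render_cards
print(f"{used} of {CPU_LANES} lanes used, {CPU_LANES - used} spare")  # 24 of 28 lanes used, 4 spare
```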

                  Comment


                  • #54
                    Originally posted by biochemical_animations View Post
                    MIT, have you tried one of those Amfeltec splitters/clusters? I'm wondering if they would work on a regular motherboard and a 28-lane CPU. My thinking: leave a card in the motherboard for the display running at x8, and use a splitter/cluster in another slot adding 4 cards running at x4. I've read about guys using these on high-end motherboards, but I'm wondering if the high-end board is necessary just for a 5-6 card workstation.
                    No, never heard of these until now; gonna look into it.

                    Comment


                    • #55
                      Hey MIT, did you have to do anything special to get those USB risers to work? I'm testing one: I have one card in the x16 motherboard slot and another card in the riser... Windows 7 reports errors with the riser card. I've tried two different risers and two different cards in the riser, and also tried a different slot... same thing. Windows doesn't like the riser card.

                      Comment


                      • #56
                        Originally posted by biochemical_animations View Post
                        Hey MIT, did you have to do anything special to get those USB risers to work? I'm testing one: I have one card in the x16 motherboard slot and another card in the riser... Windows 7 reports errors with the riser card. I've tried two different risers and two different cards in the riser, and also tried a different slot... same thing. Windows doesn't like the riser card.
                        No, I didn't do anything special; it just worked. You have the 60 cm USB 3 risers? Let me see which one. Also make sure you update your graphics drivers.

                        Comment


                        • #57
                          I have been waiting for Virtual GPU technology to really happen. Is this even a reality?

                          http://www.nvidia.com/object/grid-technology.html
                          Bobby Parker
                          www.bobby-parker.com
                          e-mail: info@bobby-parker.com
                          phone: 2188206812

                          My current hardware setup:
                          • Ryzen 9 5900X CPU
                          • 128 GB Vengeance RGB Pro RAM
                          • NVIDIA GeForce RTX 4090 x2
                          • Windows 11 Pro

                          Comment


                          • #58
                            Originally posted by mitviz View Post
                            No, I didn't do anything special; it just worked. You have the 60 cm USB 3 risers? Let me see which one. Also make sure you update your graphics drivers.
                            I forgot to plug power into the card... I powered the adapter and the card's fans came on; Windows detected it and loaded the driver but reported a problem. Once I realized I had forgotten to power the card, it worked.

                            I'm dumb!

                            Comment


                            • #59
                              Originally posted by biochemical_animations View Post
                              I forgot to plug power into the card... I powered the adapter and the card's fans came on; Windows detected it and loaded the driver but reported a problem. Once I realized I had forgotten to power the card, it worked.

                              I'm dumb!
                              Haha, well, good that you got it working.

                              Comment


                              • #60
                                Originally posted by glorybound View Post
                                I have been waiting for Virtual GPU technology to really happen. Is this even a reality?

                                http://www.nvidia.com/object/grid-technology.html
                                It's for virtualised desktops; Nvidia and AMD both have their own technology in this field. It started with Nvidia creating their own cards with Kepler (not too different from the Tesla cards, but with drivers optimised to partition the GPU between virtual machines); I haven't heard anything on Pascal though. All in all it's not efficient and a very expensive way of providing GPU resources to a virtual machine if your aim is to run V-Ray. You would be much better off using cloud resources from Amazon or Microsoft, or more conveniently a render farm.

                                Comment
