3dsmax 2016 viewport speed has been crippled!

  • 3dsmax 2016 viewport speed has been crippled!

    I just ran the test from http://forums.chaosgroup.com/showthr...U-and-results)
    and in max 2016 it's 2x slower!!! (A rough MAXScript sketch of that kind of test is at the end of this post.)

    Btw, does anyone know how to turn off this bluish or yellowish border around objects when selecting?
    Luke Szeflinski
    :: www.lukx.com cgi
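
    Roughly, the kind of test I mean can be thrown together in MAXScript: fill the scene with geometry, then time a series of forced redraws while tumbling the view. This is only a minimal sketch (the teapot count, segment count and redraw count are arbitrary assumptions), not the exact script from the linked thread:

    Code:
    (
        -- rough viewport speed test; object/redraw counts are arbitrary
        delete objects                      -- start from an empty scene
        -- build a reasonably heavy scene: 200 teapots laid out on a grid
        for i = 1 to 200 do
        (
            local t = teapot radius:10 segs:8
            t.pos = [(mod i 20) * 50.0, (i / 20) * 50.0, 0.0]
        )

        local nFrames = 100
        local start = timeStamp()           -- milliseconds
        for i = 1 to nFrames do
        (
            viewport.rotate ((eulerAngles 0 0 3) as quat)   -- tumble the active view a little
            completeRedraw()                                -- force a full viewport redraw
        )
        local seconds = (timeStamp() - start) / 1000.0
        format "% redraws in % s (~% fps)\n" nFrames seconds (nFrames / seconds)
    )

    Run the same script in 2015 and 2016 on the same machine and compare the fps numbers.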

  • #2
    Originally posted by lukx View Post

    Btw, does anyone know how to turn off this bluish or yellowish border around objects when selecting?
    Customize > Preferences > Viewports > Selection Overlay

    • #3
      Is it the new selection overlay that is slowing the viewport?

      mekene

      • #4
        OK, I updated my graphics drivers and turned off the overlay. It's no longer twice as slow like before, just 2 seconds slower, but still slower. So basically no progress in the viewport department.
        Luke Szeflinski
        :: www.lukx.com cgi

        • #5
          Can someone explain how come Quadro cards, those disgustingly expensive things, perform so badly? Worse than gaming cards?
          That is so sad.
          Martin
          http://www.pixelbox.cz

          • #6
            Originally posted by PIXELBOX_SRO View Post
            Can someone explain how come Quadro cards, those disgustingly expensive things, perform so badly? Worse than gaming cards?
            That is so sad.
            I wish I had known that before buying my Quadro K4000.

            Anyone interested?
            A.

            ---------------------
            www.digitaltwins.be

            • #7
              It looks like you've found Autodesk's new feature for 3dsmax! Now they'll fix it in the next service pack so you feel like they're doing something...
              I remember having issues with the new Layer Manager in 2015. It opened and closed extremely slowly with complex scenes. I think they fixed it eventually,
              but so far I haven't seen any reason to upgrade past 3dsmax 2012...
              Brendan Coyle | www.brendancoyle.com

              • #8
                I find it very interesting how differently we view the various video cards and their relative performance. The Quadro cards here perform consistently better than almost any game card in our projects. I was hoping to get away with using the new Titan, but our tests with the previous Titan left much to be desired and it performed worse than our Quadro cards. It always makes me wonder what settings people are using that make the game cards run so well, when they absolutely sucked when I tested them out.

                I'm assuming it's a combination of project size, memory size and 3DS Max viewport settings? Maybe settings within the nVidia control panel too?
                Troy Buckley | Technical Art Director
                Midwest Studios

                • #9
                  I think the difference is mostly in working in wireframe vs. shaded viewports. I very rarely have wireframes on, and if I do it's usually a cage on the object I'm modelling. Most people I know who prefer Quadros leave Edged Faces on all the time (or, God forbid, actually work in wireframe).

                  • #10
                    In 15 years I've had 3 different Quadro cards, and on all of them the memory died, so I'm staying with game ones.
                    Luke Szeflinski
                    :: www.lukx.com cgi

                    • #11
                      I bought a Quadro once but never, ever, ever again. The performance-to-cost ratio against gaming cards is appalling. I'm amazed there's even still a market for them.
                      MDI Digital
                      moonjam

                      • #12
                        Originally posted by Neilg View Post
                        I think the difference is mostly in working in wireframe vs. shaded viewports. I very rarely have wireframes on, and if I do it's usually a cage on the object I'm modelling. Most people I know who prefer Quadros leave Edged Faces on all the time (or, God forbid, actually work in wireframe).
                        Okay, that was funny! Hahahaha

                        I keep trying to go back to game cards, but they just can't deliver for me. I don't really enjoy spending the money on a Quadro card either. I'm hoping the Titan cards are a middle ground, but I haven't found the courage to purchase one yet. Maybe the new Titan Z?
                        Troy Buckley | Technical Art Director
                        Midwest Studios

                        • #13
                          Originally posted by lukx View Post
                          In 15 years I've had 3 different Quadro cards, and on all of them the memory died, so I'm staying with game ones.
                          Just curious, how do you know when the memory dies on a video card? Does it just stop working or do you start getting visual artifacts, glitches, etc?
                          www.dpict3d.com - "That's a very nice rendering, Dave. I think you've improved a great deal." - HAL9000... At least I have one fan.

                          • #14
                            Hmmm...good question.

                            I've had all types of cards get graphical glitches; I just assumed the whole card was going bad. I've actually had more game cards die on me than Quadros, but that's probably due to the overclocked game cards I tend to buy, since they are already running out of spec.

                            Now that I run everything through a battery backup, things are MUCH better across the board.
                            Last edited by Donald2B; 23-04-2015, 07:18 AM. Reason: spelling
                            Troy Buckley | Technical Art Director
                            Midwest Studios

                            • #15
                              Originally posted by dlparisi View Post
                              Just curious, how do you know when the memory dies on a video card? Does it just stop working or do you start getting visual artifacts, glitches, etc?
                               Yes, mostly strange artifacts, and towards the end of its life a lot of blue screens.
                              Luke Szeflinski
                              :: www.lukx.com cgi
