This is more of a question about 3ds Max than V-Ray, but...
It's been a few months since I had to downgrade from a 980ti to a 1050ti, and I hadn't noticed a difference in viewport performance until today, when I was working on a project that hit >40 million polys and started getting a few hiccups. That made me wonder how much the best GPUs out there actually help with viewport performance, where the point of diminishing returns is, etc. So for someone like me who primarily uses the CPU renderer, it might make sense to skip the latest and greatest GPU next time I build a workstation.
People who have the Titans and 2080ti out there, what has your experience been like when it comes to viewport performance?