Just installed an EVGA RTX 2080 Super, paired with my MSI GTX 970, and ran some tests to see the performance. I immediately noticed that the 970 severely bottlenecks the 2080 when in RTX engine mode. For example:
RTX
2080 + 970 = 1 min 12 sec
2080 = 49 sec
And for comparative purposes:
CUDA
2080 = 1 min 16 sec
970 + 2080 = 1 min 10 sec
CPU + 970 + 2080 = 1 min 5 sec
Now I'm just going to assume that in RTX mode the 970 doesn't play nice for incompatibility reasons and it's a software issue, which I can understand. But what I don't understand is how minimal the gains are in CUDA mode when both cards are enabled. Is this to be expected on certain scenes? Is the 970 just old and outdated? Or is something else going on? FYI, I have all of the latest drivers and am using 3ds Max 2020 SP3 and V-Ray 4.30.