I was able to run this test on our Quadro cards in the office. Eight Quadro 4000s were used for the scene, taking 69.5 seconds to reach 512 samples. However, some of them had older drivers, so the render result had incorrect colors.
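For comparing runs like this one, the figures reduce to a simple sampling rate. A minimal sketch using the numbers from the post above (the helper name is mine, not from any benchmark tool):

```python
# Rough per-run sampling rate for comparing GPU benchmark results.
# Figures from the post above: 8x Quadro 4000, 512 samples in 69.5 s.

def samples_per_second(samples: int, seconds: float) -> float:
    """Average sampling rate over a whole render run."""
    return samples / seconds

rate = samples_per_second(512, 69.5)
print(f"{rate:.2f} samples/s")           # ~7.37 samples/s across 8 cards
print(f"{rate / 8:.2f} samples/s/card")  # ~0.92 samples/s per Quadro 4000
```

A per-card rate like this is only a rough yardstick, since mixed rigs and driver versions (as noted below) can skew the totals.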
GPU benchmarks
-
Last edited by Morbid Angel; 07-11-2012, 05:39 PM.
Dmitry Vinnik
Silhouette Images Inc.
ShowReel:
https://www.youtube.com/watch?v=qxSJlvSwAhA
https://www.linkedin.com/in/dmitry-v...-identity-name
-
As Vlado pointed out in the posts on the first page, the driver version is important for speed and performance. I think our cards are running a really old driver version (about a year old), so 14 cards perform like 4-5 GTX 580s; I'd expect a lot more out of them. Will run another test tonight.
Dmitry Vinnik
-
Here are a couple of interesting tests:
I let the 14-card mixed GPU setup (8 Quadros and 6 GTX 560s) render the 2K frame for 2 hours 76 minutes. It reached a sampling level of 34284. Image below (really good, in my opinion); I'd say it could reach a good noise level in half that time.
I then let the same scene render on 14 machines using CPU RT (4 twelve-core 2.6 GHz Xeons, 4 four-core 3.2 GHz Xeons, and 6 i7s at 3.6 GHz) to reach the same sampling level. After 13 hours they got to 36904 samples, yet the image is nowhere near the same noise level as the GPU render; note that the reflections, particularly on the green wall, are much noisier than on the GPU (on the GPU the noise is absent).
Lastly, I let this scene render on my single GTX 480 (this one is in Max, however) with the newest nightly from last night; note the difference in shading on the glass and some other objects. I let it run for 2 hours 75 minutes (to match the 14-card render) to compare the noise levels. While a lot noisier, I think it's not bad for one card.
Last edited by Morbid Angel; 21-11-2012, 12:03 PM.
Dmitry Vinnik
-
Hm, the GPU images are missing the reflective caustics on the green wall from the sunlit patch on the floor (which is what the noise in the CPU image actually is). I guess we need to look into it.
Best regards,
Vlado
I only act like I know everything, Rogers.
-
Originally posted by vlado View Post
Hm, the GPU images are missing the reflective caustics on the green wall from the sunlit patch on the floor (which is what the noise in the CPU image actually is). I guess we need to look into it
Best regards,
Vlado
Dmitry Vinnik
-
The GPU code is the same, but the standalone by default does only 3 GI bounces, whereas in 3ds Max they are set to more (10 or 100 - not sure right now). Quadro/GeForce doesn't make a difference.
Best regards,
Vlado
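The visual gap between 3 and 10 GI bounces can be sanity-checked with a geometric-series estimate: if each surface interaction keeps roughly an albedo's worth of energy, the share of bounced light a depth limit throws away is about albedo^depth. A sketch (the 0.7 albedo is an illustrative assumption, not a measurement of this scene):

```python
# Fraction of total indirect (bounced) light cut off by a GI bounce limit,
# modeling each bounce as keeping a constant fraction (albedo) of the energy.
# For a geometric series, the tail beyond n bounces is albedo**n of the total.
# The 0.7 albedo is an illustrative assumption.

def truncated_energy(albedo: float, max_bounces: int) -> float:
    """Share of total bounced energy lost to the bounce-depth limit."""
    return albedo ** max_bounces

for depth in (3, 10):
    lost = truncated_energy(0.7, depth)
    print(f"{depth} bounces: ~{lost:.1%} of bounced light lost")
# 3 bounces cut off roughly a third of the indirect light in this model,
# while 10 bounces lose only a few percent.
```

That order-of-magnitude difference is consistent with the standalone (3 bounces) looking visibly dimmer or flatter in indirect areas than the 3ds Max render.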
-
Yep, there are a couple of odd things there; one is that the newer version of the nVidia CUDA compiler produces slower code for some reason; the other is that as we add more complicated GPU code, it seems to affect the overall performance even if the code is never executed. For example, just adding the code for the fractal Noise texture slows everything down, even if you don't actually use Noise textures. Also, in the original scene, there were a few ColorCorrection textures. The 2.30 build doesn't render them, but the nightlies do - and this slows them down a bit.
In other words, it's fun
Best regards,
Vlado