The 3090 is roughly the same price as the CPU, yeah, give or take a couple of hundred bucks... so this is the kind of general comparison I find useful when considering a purchase. Thanks, bud.
CPU or GPU, I'm lost
-
Originally posted by Alex_M
Ok, so first results are in! I just got an RTX 3090. So... yeah. As I guessed, it is nowhere near 10-25 times faster than a 32/64-core CPU. At least not in the first test I did, which is a production interior scene.
You can check the screenshot below showing the VFB history with the render times. The GPU finished in 6m 1.2s and the CPU finished in 11m 4.3s, which is not even 2 times faster (1.84 times, to be exact).
Render settings were at default and the Noise Threshold was set to 0.01 for both CPU and GPU. For the GPU rendering I used the faster RTX engine (not CUDA). Both the GPU and CPU were at stock clocks (no overclock) at the time of the test. If I overclocked the CPU (which I normally do), the advantage of the RTX 3090 would be even smaller. I'll probably do some more scenes when I have more free time, like a production exterior, for example.
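For reference, the quoted speedup checks out; here's a quick sanity check in Python, using the VFB times from the screenshot above:

```python
# Quick sanity check of the speedup quoted above,
# using the VFB render times from the screenshot.
cpu_seconds = 11 * 60 + 4.3   # 11m 4.3s on the 32/64-core CPU
gpu_seconds = 6 * 60 + 1.2    # 6m 1.2s on the RTX 3090 (RTX engine)

speedup = cpu_seconds / gpu_seconds
print(f"GPU speedup: {speedup:.2f}x")  # -> GPU speedup: 1.84x
```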
-
Yeah, for me the speed gains aren't worth the hassle of features not working, memory issues with displacements, and buggy drivers. That might be less of an issue with a 3090 and its large VRAM, but the feature set and bugs still keep me steering clear. I'd also say it's a fairer comparison to put the new Zen 3 Threadrippers against the Nvidia 30-series GPUs, so hopefully we'll see how they square up soon.
Last edited by seandunderdale; 10-01-2021, 04:18 AM.
-
My advice, for what it's worth, is definitely not to waste your money on a GPU unless you also want it for gaming.
There are simply too many pitfalls you will encounter in trying to use it for anything other than quite simple or extremely specific projects, where you absolutely know it will do what you want with no surprises two-thirds of the way through the project.
The really quite long list of features that aren't supported and are still being worked on, years after release as a full product, makes it a large risk.
In my opinion, GPU rendering is strictly experimental tech, and for the work I do, which is very varied, I am better off, as many have agreed, with a top-notch CPU, eliminating the frustration.
-
Originally posted by squintnic
Almost identical to my findings on the same CPU/GPU comparison. The only issue was the GPU clapping out after 1-30 frames consistently, even with correct drivers and up-to-date software. I would bug-report it, but in the heat of production, taking two hours to troubleshoot something that will take 2-3 days to resolve isn't viable, especially in a time-for-money production environment.
That said, you can also overclock the GPU and easily get another 10%. Strictly in terms of raw power, you can buy a 3080 for half the price and get almost double the performance.
And don't forget that for a CPU build you have to buy the CPU plus RAM you don't otherwise need. A CPU platform is by very, very far more expensive per unit of render time than a GPU one!
Now, to answer fixeighted: I haven't had any of the issues you talk about during the past two years. Nothing you can't achieve by another method, though I'm not using some very specific features. Actually, it's quite the opposite: my workflow is now adapted to GPU, and if I switch a scene to CPU, that's when I get problems.
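To illustrate the price/performance argument, here's a rough sketch; the prices and the 3080 speed figure below are placeholder assumptions, not measured numbers or quotes:

```python
# Rough performance-per-dollar comparison for rendering.
# All prices and the 3080 speed figure are illustrative
# placeholders, not quotes or benchmarks.
options = {
    # name: (approx_price_usd, render_speed_vs_cpu)
    "32/64-core CPU": (1500, 1.00),  # baseline
    "RTX 3090":       (1500, 1.84),  # speedup from the interior test above
    "RTX 3080":       (750,  1.60),  # assumed: ~half the 3090's price
}

for name, (price, speed) in options.items():
    perf_per_kusd = speed / price * 1000
    print(f"{name:>14}: {perf_per_kusd:.2f} relative speed per $1000")
```

On those assumed numbers, the GPUs come out well ahead per dollar, and that's before counting the extra RAM a CPU render node needs.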
-
That's cool. If it works perfectly well within a given pipeline then that is fantastic, and the speed will be most welcome.
It's just that every time I personally have tried to fit it into one project or another, it has tripped me up at some point, either immediately or, more often, quite a way in, which is rather frustrating when time is an issue.
I've also spent a lot of time testing things I've seen flagged here as possible issues, so I'm now quite clued up on what to avoid. Sadly, for me and the varied stuff I tend to do, it is quite a lot, hence my opinion.
I would love nothing better than for it to be as well-rounded a solution as CPU is, though that will be a long way off, so I will remain patient and watch with great anticipation.