I'm currently looking for a way to better estimate render costs, or rather the actual render time, of a project in the offer phase.

When calculating, we estimate how much time we need for data prep, how much for shading, how much for lighting, post, etc. We also estimate how many hours the project will render and multiply that by a fixed price per hour of render time to arrive at the rendering costs.

But since VRay is constantly changing and getting faster, pretty much the only way I can reliably estimate render costs is to take values from experience and somehow relate them to current hardware, VRay versions, etc.

Is there any smarter way of doing this besides keeping some kind of huge Excel list with all the projects, render times, hardware, software versions, etc., then looking up a similar past project and trying to eyeball it? How do you guys do this?

We render at our own facility as well as at Rebus Farm for bigger images. We only render stills for print, no animations of any kind.
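To make the "relate past values to current hardware" idea concrete, here is a rough sketch of the kind of normalization I have in mind: scale a similar past project's render hours by a hardware benchmark ratio and a guessed software speedup, then apply the fixed price per hour. All function names, benchmark scores, and factors below are made-up examples, not a vetted method.

```python
# Sketch: normalize a past project's render time to current hardware/software,
# then cost it at a fixed hourly rate. All figures are hypothetical examples.

def estimate_render_cost(past_hours, past_benchmark, current_benchmark,
                         speedup_factor, price_per_hour):
    """Scale a comparable past project's render hours to today's setup.

    past_benchmark / current_benchmark: e.g. benchmark scores of the old
    and new render nodes (higher score = faster node).
    speedup_factor: rough software speedup, e.g. 1.15 for a ~15% faster
    renderer version.
    """
    estimated_hours = past_hours * (past_benchmark / current_benchmark) / speedup_factor
    return estimated_hours, estimated_hours * price_per_hour

# Example: a comparable project took 40 h on nodes scoring 12000; current
# nodes score 18000 and the newer renderer is assumed ~15% faster.
hours, cost = estimate_render_cost(40, 12000, 18000, 1.15, 4.0)
print(f"~{hours:.1f} h, ~{cost:.2f} per-hour-rate units")
```

This is basically the Excel lookup formalized: the benchmark scores stand in for the "hardware" column, and the speedup factor for the "software version" column, so the eyeballing step becomes an explicit ratio.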