Probably not possible, but...
The way it is looking, using RT GPU will allow us to render scenes VERY quickly compared to what we have been used to. We will be tweaking and playing with settings, adding specular highlights, depth of field, bokeh effects etc. until the cows come home. However, when we finally come to a production render, this could take many hours/days/weeks to render!
Is there any way that RT could predict roughly how long the current image would take to render using traditional non-RT rendering, at the present resolution and on the local hardware? Obviously, this is dependent upon the quality settings you will render at, but something for guidance might be useful.
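For what it's worth, a rough estimate like this could be computed from a preview if you had benchmarked throughput for both render paths on the local machine. A minimal sketch of the idea (all function names, parameters, and numbers here are hypothetical, not anything RT actually exposes):

```python
# Hypothetical sketch: estimate non-RT production render time from an RT
# preview, assuming measured samples-per-second throughput for both the RT
# and non-RT paths on the local hardware. Everything here is illustrative.

def estimate_final_render_seconds(
    preview_seconds: float,      # wall-clock time the RT preview took
    preview_samples: int,        # samples per pixel in the preview
    final_samples: int,          # samples per pixel planned for production
    rt_samples_per_sec: float,   # benchmarked RT throughput on this machine
    cpu_samples_per_sec: float,  # benchmarked non-RT throughput on this machine
) -> float:
    # Scale the preview time up to the production sample count...
    rt_final_seconds = preview_seconds * final_samples / preview_samples
    # ...then convert RT time to non-RT time via the throughput ratio.
    return rt_final_seconds * rt_samples_per_sec / cpu_samples_per_sec

# Example: a 10 s RT preview at 64 spp, production target of 1024 spp,
# with RT measured as 50x faster than the non-RT path.
eta = estimate_final_render_seconds(10.0, 64, 1024, 5000.0, 100.0)
print(f"~{eta / 3600:.1f} hours")
```

The weak point, as the post says, is that the ratio shifts with resolution and quality settings, so this would only ever be a ballpark guide rather than a promise.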