Hi guys,
I understand the typical copy/paste response from ChaosGroup in these situations:
"Please note that CPU and GPU are completely different engines. We recommend, when you start a project to chose what type of engine will you use in the beginning and use that through the whole project."
...but this isn't a slight variance; this is a majorly different look.

On CPU, with a bump multiplier of about 30 on a simple JPG texture, I have to lower the multiplier to about 0.05 to get a similar look on GPU. The bump is so strong on GPU that it even darkens the colour of elements well beyond what it should be (if I switch off bump, the colour returns to how it looked on CPU). Is this expected behaviour? Is there an INI or system-wide setting I can adjust somewhere, so I don't have to adjust materials made over the past few years?

Alternatively, is there a known multiplication factor I can use in a script to change all my materials in one shot?
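For reference, the kind of batch rescale I have in mind would look something like this. This is only a pure-Python sketch: the `Material` class, the `bump_mult` attribute, and the way materials are collected are all placeholders, since the real attribute names and scene access depend entirely on the host application's scripting API. The factor is just the ratio of the two multipliers I observed (0.05 / 30):

```python
# Hedged sketch: batch-rescale bump multipliers when moving from CPU to GPU.
# `Material` and `bump_mult` are hypothetical stand-ins for whatever the
# host application's API actually exposes.

# Observed ratio: a CPU multiplier of ~30 looked like a GPU multiplier of ~0.05.
CPU_TO_GPU_FACTOR = 0.05 / 30  # roughly 0.00167

class Material:
    def __init__(self, name, bump_mult):
        self.name = name
        self.bump_mult = bump_mult

def rescale_bump(materials, factor=CPU_TO_GPU_FACTOR):
    """Scale every material's bump multiplier in place by the given factor."""
    for mat in materials:
        mat.bump_mult *= factor

# Example: materials authored for the CPU engine
mats = [Material("wood", 30.0), Material("brick", 12.0)]
rescale_bump(mats)
print(mats[0].bump_mult)  # 30 * (0.05 / 30) = 0.05
```

Of course this only makes sense if the CPU-to-GPU bump difference really is a single linear factor, which is part of what I'm asking.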