I'm going to try my very best at explaining my issue, and hopefully someone will be able to understand what I'm talking about.
I've been able to narrow down the cause by testing various scenarios, and it seems to me that enabling GPU Preview creates a large CPU load and an audible noise (a physical grinding noise from the CPU area, not a motherboard beep or speaker noise). The noise corresponds with every CPU spike generated by the GPU Preview.
Example 01:
When simulating with GPU Preview on, there is a large uptick in CPU usage after each frame has elapsed and the preview has been created in the viewport. With each spike there is a grinding noise. This continues with each frame as a new preview is rendered in the viewport. CPU usage goes from about 1% to 50% when the noise occurs.
Example 02:
Once a simulation is complete, I encounter the same spikes and noise when moving a light source around the scene (because the preview needs to be re-rendered each time).
The CPU spike and noise when moving the light disappear when I uncheck "lighting" in the GPU Preview. However, the issue remains when a new frame is created or when a new preview is generated in the viewport by scrubbing the animation.
I know the workaround is to disable the GPU Preview, since everything else works as it should when it is off. But I would like to understand why the preview is causing this issue, because I really like this feature in the 3.0 version and just turning it off doesn't seem like a proper fix. This is the only time I have ever experienced this in the 9 months the computer has been in operation. I can render at 100% CPU usage and never hear a grinding noise. I can run games at full GPU usage and hear no noise. I can render and run games at the same time (with massive lag) and still hear no noise. Moving objects in the viewport, or doing anything else, will not cause the CPU spike and noise unless it directly affects the way the simulation looks in the viewport with GPU Preview enabled.
Here are my relevant specs:
CPU: i7-5960X @ 3.00GHz
GPU: Quadro M4000
Motherboard: X99-A
RAM: 60GB DDR4
Windows 10
Max 2017 SP2
VRay 3.40.02
Phoenix FD 3.00.01
As for the scene, I have recreated it about 5 times just to make sure. I can upload it if needed, but it's just a sphere with any Phoenix FD setup, a VRay Sun, and GPU Preview enabled. Simulate to frame 100 and notice a CPU spike and noise as each frame elapses (Example 01), then move the light around or scrub the timeline to reproduce Example 02.
I wish there were a better way to describe the noise; I would never have noticed it if it were not so loud (I can try to record it). If you google similar CPU noises, people point to all sorts of failures, from motherboards to power supplies; I don't think that is the case here. For now I am just disabling the GPU Preview. Is there any reason why updating a GPU preview in the viewport would create such a high CPU load?
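In case a recording alone isn't clear enough, here is a rough Python sketch I could run alongside the scene to timestamp each spike so it can be lined up against an audio recording. This assumes Python 3 with the psutil package installed, and the 40% threshold is just a value I picked based on the roughly 1% to 50% jumps, not anything exact:

import time
import psutil

THRESHOLD = 40.0  # percent; usage jumps from about 1% to about 50%

while True:
    # Average total CPU usage over a quarter second
    usage = psutil.cpu_percent(interval=0.25)
    if usage > THRESHOLD:
        # Timestamp each spike so it can be matched against the recording
        print(f"{time.strftime('%H:%M:%S')}  CPU spike: {usage:.1f}%")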
Thank you in advance.