reflection/specular bug in GPU?


  • reflection/specular bug in GPU?

    I've been going mental trying to get some nice specular highlights on my tree leaves in V-Ray GPU, cranking the reflection amount higher and higher and not getting the expected bling.

    Swapped over to CPU and pop! There are the (now very exaggerated) speculars I was missing.

    The materials are fairly standard: a VRay2SidedMtl wrapping a VRayMtl, with a map at a low percentage in the reflection slot and a mid grey in the reflection colour (rough sketch of the setup below).


    V-Ray 3.6
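
    For reference, roughly how that material is built, as a MAXScript sketch (untested; the map file name and the 20% amount are placeholders, and the property names are my assumption of what VRayMtl / VRay2SidedMtl expose to MAXScript):

    Code:
    -- hypothetical reconstruction of the leaf material described above
    reflMap = VRayHDRI HDRIMapName:"leaf_spec.jpg"     -- placeholder file name
    leaf = VRayMtl()
    leaf.reflection = color 128 128 128                 -- mid grey reflection colour
    leaf.texmap_reflection = reflMap
    leaf.texmap_reflection_multiplier = 20.0            -- "low percentage" on the map
    leafMtl = VRay2SidedMtl frontMtl:leaf backMtl:leaf  -- wrapped in a two-sided material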


    Attached Files

  • #2
    OK so, very weird, it appears to be the bump map.

    Stripping out the material and applying it to a sphere, you can see that the GPU renders the bump waaaay stronger than the CPU.

    So it's breaking up the specular much more. See attached; GPU is the right-hand render.


    Not sure if it's important, but I'm using VRayHDRI as the map loader with JPG maps and elliptical filtering. (Quick repro sketched below.)
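
    Roughly what the isolation test looks like in MAXScript (untested sketch; the JPG name is a placeholder and the bump amount is just the value tuned for CPU):

    Code:
    -- hypothetical repro: same bump map on a plain sphere, rendered once on CPU and once on GPU
    bumpMap = VRayHDRI HDRIMapName:"leaf_bump.jpg"   -- placeholder JPG loaded through VRayHDRI
    testMtl = VRayMtl reflection:(color 128 128 128)
    testMtl.texmap_bump = bumpMap
    testMtl.texmap_bump_multiplier = 30.0             -- looks right on CPU, far too strong on GPU
    s = Sphere radius:50 segs:64
    s.material = testMtl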
    Attached Files

    • #3
      Nobody? I'm noticing the difference everywhere in my scene now. It's problematic because I'm at the limit of my GPU RAM: if the job gets much more complex I'll have to shift to CPU, which means re-tweaking dozens of textures.

      • #4
        I think this is pretty normal. If you want better reflection/specular, use a better texture or raise the filtering a little on the bump map; after that the noise in the texture won't be so visible. But the reflection/specular itself is still the same. (Small example below.)
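
        Something like this for a standard Bitmaptexture bump (untested sketch; if you load the map through VRayHDRI instead, raise its blur/filtering in that map's own rollout):

        Code:
        -- hypothetical example: soften a noisy 8-bit JPG bump by raising the filtering blur
        bumpMap = Bitmaptexture filename:"leaf_bump.jpg"   -- placeholder path
        bumpMap.coords.blur = 2.0    -- default 1.0; higher = smoother bump, less broken-up specular
        mtl = VRayMtl()
        mtl.texmap_bump = bumpMap
        mtl.texmap_bump_multiplier = 30.0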
        AMD TR 7980X, 256GB DDR5, GeForce RTX 4090 24GB, Win 10 Pro
        ---------------------------
        2D | 3D | web | video
        jiri.matys@gmail.com
        ---------------------------
        https://gumroad.com/jirimatys
        https://www.artstation.com/jiri_matys
        https://www.youtube.com/channel/UCAv...Rq9X_wxwPX-0tg
        https://www.instagram.com/jiri.matys_cgi/
        https://www.behance.net/Jiri_Matys
        https://cz.linkedin.com/in/jiří-matys-195a41a0

        • #5
          I can't see how it's normal that the bump is massively stronger when rendering with GPU than with CPU..?

          • #6
            Originally posted by super gnu:
            I can't see how it's normal that the bump is massively stronger when rendering with GPU than with CPU..?
            Although it might seem strange, there are multiple ways bump can be computed. They are "correct" in different ways; bump is always an approximation.

            CPU and GPU give different bump results in some cases because they use different algorithms.

            Best,
            Blago.
            V-Ray fan.
            Looking busy around GPUs ...
            RTX ON

            • #7
              I appreciate that the methods are different, but how can needing a bump multiplier of 30 on CPU and 1-2 on GPU be "differently correct"? Since the idea is to match GPU to CPU whenever possible, it would seem sensible to try to improve this. I've not got any special circumstances in my scene: greyscale 8-bit bump maps in normal V-Ray materials, with a VRaySun and sky.

              As it is, my scene is just about to exceed my 12 GB RAM limit on GPU (I'm currently rendering at 99.8% GPU RAM usage and I've got more stuff to add), so I'll be forced to switch to CPU rendering. That means going through my scene material by material (dozens of them), readjusting all the bump values (if a global change doesn't work).

              If it's a simple case of multiplying all the, err, multipliers by a fixed amount, then maybe it can be scripted, something like the sketch below.
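
              A rough MAXScript attempt at that global change (untested; it assumes plain VRayMtl bump slots, and the ratio is just a placeholder to adjust to taste):

              Code:
              -- hypothetical sketch: scale the bump multiplier on every VRayMtl in the scene
              factor = 1.0 / 15.0   -- placeholder ratio for going from CPU-tuned values to GPU
              for m in getClassInstances VRayMtl do
              (
                  if m.texmap_bump != undefined do
                      m.texmap_bump_multiplier = m.texmap_bump_multiplier * factor
              )

              getClassInstances should also pick up VRayMtls nested inside VRay2SidedMtl, so the two-sided leaf materials would be covered too.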
              Last edited by super gnu; 11-05-2018, 02:24 PM.

              • #8
                I understand that this is awkward. In this case it is not trivial to match the two renders.
                We don't suggest trying to match the CPU and GPU renders; it is better to start with the engine of choice from the start.

                With that being said, in the current situation there might be a way to make them somewhat closer.

                Best,
                Blago.
                V-Ray fan.
                Looking busy around GPUs ...
                RTX ON

                • #9
                  Well, recently I always start with GPU, and most of the time it's OK; my CPU is getting a bit past it now. However, if a client adds an ornamental garden in the foreground halfway through the job, switching becomes a necessity, unless I can find someone with a 16 GB GPU to render on, and maybe even that wouldn't be enough. When will Nvidia make an affordable 128 GB GPU? :P

                  • #10
                    I'd like a reasonably priced (i.e. Titan X prices max) GPU with RAM slots: a nice 16-channel memory controller and loads of small DIMMs. That way we could upgrade.

                    As it is, the pricing of the V100 and other 16 GB / NVLink cards is so high it's just not cost-competitive with CPU.

                    • #11
                      If you use V-Ray GPU on CPU (Hybrid), the results will always match.

                      Best,
                      Blago.
                      V-Ray fan.
                      Looking busy around GPUs ...
                      RTX ON

                      • #12
                        Now that is an idea: I can just set it to only use the CPU. Nice back-up plan, thanks!

                        Out of interest, how does the speed of a given CPU under RT GPU (CUDA) compare to the same CPU under Advanced, with similar settings/results?
