Project Lavina
  • Project Lavina

    So we can finally share what we've been doing with DXR:
    https://www.chaosgroup.com/blog/ray-...project-lavina

    Best regards,
    Vlado
    I only act like I know everything, Rogers.

  • #2
    Really good for a first attempt!
    The main visually disturbing issue, from my point of view, is the ghosting effect, particularly at the start of the demo.

    Speaking of visual quality, there are big improvements still to be made, but I'm confident you already know that and are researching it... But since this is pure raytracing, without any kind of additional work for realtime... I can only say "finally!!!"

    For the next few years we'll probably see blended raster-raytracing solutions, much like offline renderers 25 years ago, but looking at what you've achieved already, I think in 5-8 years Unreal and Unity will be fully raytracing-based engines.

    Thank you for sharing this, and all the best with Lavina!!!

    I'm already impatient to see the next demo!

  • #3
    I see it a bit as a race: who will get there first? Will raster graphics, which are realtime but rely heavily on fakes and approximations, reach maximum quality first? Or will raytracing, which is photoreal and physically correct, reach acceptable speed first? It's fantastic to see that you are on the racetrack too. Really looking forward to seeing how this evolves. I've seen many realtime raytracing and GI demos over the years, but since this one comes from Chaos Group, it's one of the first I take seriously.

  • #4
    Well,
    I have trouble understanding where this will slot in between V-Ray Adv, V-Ray GPU and IPR. What would be a standard use case for this?
    https://www.behance.net/Oliver_Kossatz

  • #5
    Great to see that you're working on this!
    German guy, sorry for my English.

  • #6
    Originally posted by kosso_olli View Post
    What would be a standard use case for this?
    The same things you do now with Unreal Engine (or Unity or any other realtime engine)... but with the benefit of the precision and quality of a raytracer, and with little to no extra work to prepare assets for realtime.

  • #7
    n e a t.

    !

    Yeah, I don't need that huge of a data set rendered in real time - just let me drop into an interior scene for VR exploration without all the prep and transfer time to reassemble in Unity/UE. But I wonder what 1 bounce of GI will look like for an interior...
    Brendan Coyle | www.brendancoyle.com

  • #8
    I don't know, but wouldn't this just replace IPR?
    A.

    ---------------------
    www.digitaltwins.be

  • #9
    Impressive. Most impressive. Will be keeping a very keen eye on this.

  • #10
    Originally posted by bardo View Post
    The same things you do now with Unreal Engine (or Unity or any other realtime engine)... but with the benefit of the precision and quality of a raytracer, and with little to no extra work to prepare assets for realtime.
    Where did you get that this is a Unity/UE thing? It says in the blurb that it's rendered from a .vrscene.

    I am enjoying seeing things like this progress, but I'm also very aware that it's a $6k card and it'll be a long time before I get my hands on one. Looking forward to seeing more demos, though!

  • #11
    Originally posted by Neilg View Post

    Where did you get that this is a Unity/UE thing? It says in the blurb that it's rendered from a .vrscene.

    I am enjoying seeing things like this progress, but I'm also very aware that it's a $6k card and it'll be a long time before I get my hands on one. Looking forward to seeing more demos, though!
    https://wccftech.com/nvidia-geforce-...turing-launch/

    Best,
    Blago.
    V-Ray fan.
    Looking busy around GPUs ...
    RTX ON

  • #12
    I guess this is the "Unreal in the 3ds Max viewport" thing Vlado mentioned.

  • #13
    For now this is just an exploration of what the RTX technology can bring us. Like you, I was also curious how far it can be pushed and what kind of results can be expected, and the best way to find out is to try and actually code something. I'm not quite sure where this will take us yet. Yes, we can trace 5 rays per pixel instead of 1, in realtime, and that helps, but final production quality requires thousands of rays. We wanted to see if denoising can fill this gap to some extent. Things can definitely be improved, and there is a lot of research going on in this area.
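
    In pseudocode, the loop is roughly the following. This is a minimal C++ sketch of the "few rays plus denoiser" idea only; trace_ray() and denoise() are hypothetical stand-ins, not the actual Lavina code.

    #include <vector>

    struct Color { float r = 0.f, g = 0.f, b = 0.f; };

    // Hypothetical: trace one path through pixel (x, y); 'sample' seeds the path.
    Color trace_ray(int x, int y, int sample);

    // Hypothetical: a denoiser that reconstructs a clean image from the noisy
    // low-sample estimate -- the piece that has to fill the gap between
    // ~5 rays/pixel and the thousands that production quality needs.
    void denoise(std::vector<Color>& frame, int width, int height);

    void render_frame(std::vector<Color>& frame, int width, int height) {
        const int spp = 5;  // realtime budget: ~5 rays/pixel instead of thousands
        for (int y = 0; y < height; ++y) {
            for (int x = 0; x < width; ++x) {
                Color sum;
                for (int s = 0; s < spp; ++s) {
                    const Color c = trace_ray(x, y, s);
                    sum.r += c.r; sum.g += c.g; sum.b += c.b;
                }
                Color& px = frame[y * width + x];
                px.r = sum.r / spp; px.g = sum.g / spp; px.b = sum.b / spp;
            }
        }
        denoise(frame, width, height);  // 5 noisy spp in, one usable frame out
    }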

    We are also obviously exploring RTX in terms of V-Ray GPU as well, but it will take a bit of time to adjust the code for this type of hardware acceleration.

    Best regards,
    Vlado
    I only act like I know everything, Rogers.

  • #14
    This looks very promising for the future.

    Does anyone know how the RTX's 10 gigarays/s compares with current CPUs (any reference)?
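
    For scale, here's a back-of-the-envelope calculation, assuming the quoted 10 gigarays/s figure were fully achievable on real scene geometry (actual scenes will do worse):

    #include <cstdio>

    int main() {
        const double rays_per_second = 10e9;    // NVIDIA's quoted RTX figure
        const double pixels = 1920.0 * 1080.0;  // one 1080p frame
        const double fps = 60.0;                // realtime target
        // ~80 rays per pixel per frame in the ideal case -- still orders of
        // magnitude short of the thousands Vlado mentions for production.
        std::printf("%.1f rays/pixel/frame\n",
                    rays_per_second / (pixels * fps));
        return 0;
    }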

    Gil Guminski
    cynaptek.com

  • #15
    Originally posted by vlado View Post
    For now this is just an exploration of what the RTX technology can bring us. Like you, I was also curious how far it can be pushed and what kind of results can be expected, and the best way to find out is to try and actually code something. I'm not quite sure where this will take us yet. Yes, we can trace 5 rays per pixel instead of 1, in realtime, and that helps, but final production quality requires thousands of rays. We wanted to see if denoising can fill this gap to some extent. Things can definitely be improved, and there is a lot of research going on in this area.

    We are also obviously exploring RTX in terms of V-Ray GPU as well, but it will take a bit of time to adjust the code for this type of hardware acceleration.

    Best regards,
    Vlado
    So... this is completely different from "traditional" GPU rendering, right?

    A few more questions, if you don't mind...

    1) Could this utilize 3ds Max legacy shaders on the CPU, or does it have the same limitations as any GPU renderer?

    2) Is it still limited by VRAM?

    3) Arnold showed a checkbox for GPU acceleration before. I guess this is similar. Does that mean this tech is for V-Ray CPU? "also obviously exploring RTX in terms of V-Ray GPU" seems to imply that.

    My brain hurts...