How does Lumion work?


  • How does Lumion work?

    There is a new version out from Lumion, and some of the sample images look pretty impressive. I'm not sure how much work lies behind some of these sample scenes, but render times are apparently in the minutes at most. It is a realtime, game-style interface that looks fast based on the YouTube videos, and then it renders to create better GI somehow. Speed-wise it kind of looks like working in Unreal, but with a built-in renderer.

    Is it something similar to Blender Eevee, where there is some form of fast viewport display that is then enhanced? A totally different approach from Vray GPU? Are there any not-so-obvious limitations?

  • #2
    Yeah, it looks pretty impressive. https://www.youtube.com/watch?v=Gu6z6kIukgg&feature=youtu.be
    Bobby Parker
    www.bobby-parker.com
    e-mail: info@bobby-parker.com
    phone: 2188206812

    My current hardware setup:
    • Ryzen 9 5900x CPU
    • 128gb Vengeance RGB Pro RAM
    • NVIDIA GeForce RTX 4090
    • Windows 11 Pro



    • #3
      I'm really curious whether the Vray team is seriously worried about something like this, or if they feel 'nah, it's a completely different market'. It is a real-time environment that makes it faster to place lights and materials, and then on top of that it renders in minutes? And looking at the sample images, some of these really look good enough for, I would say, 97% of architectural needs.

      Surely there has to be a downside here I'm missing?



      • #4
        I was questioning myself about 3D visualisation.

        We have so many renderers now, yet with all the lighting, AA and glossy calculations you still need a render farm, internal or external.
        It can't be done on one PC with the current high demands of a client or studio, so you end up with compromised, noisy images.
        Even Pixar, who have a massive internal render farm, say that some frames on their animated movies still take an hour to compute.
        Hard to believe, since that render farm is huge.
        Or do they also do it in realtime?

        I made a thread here about realtime engines, and I now really can't tell the difference between realtime and offline rendering:

        https://forums.chaosgroup.com/forum/...line-rendering

        You were right about Lumion: if this is realtime, such an image would take a long time to render in any non-realtime renderer.
        Of course there is also the operator or 3D artist, and I don't know how much is custom made, but I know Lumion also has a library you can use.
        https://lumion.com/showcase.html (attached image: SnowinParisShowcaseExteriors.jpg)


        Last edited by Robert Everhardt; 12-11-2018, 03:49 PM.



        • #5
          I think if you are investing in GPU renderings, Lumion has to be looked at, along with Octane. I did some overload work for a company, about three years ago, and they were using Octane. They couldn't understand my 3+ hour render times, per view, when they were doing it in almost real-time. After I inquired, it turned out they had dropped $15,000 on graphics cards for each workstation. At the time, if I remember correctly, it didn't support displacement, but I think it does now.
          Bobby Parker
          www.bobby-parker.com



          • #6
            Right, but the Star Wars demo was made with extremely powerful graphics boards, with a renderer likely written to take advantage of the new features. From what I understand, Chaos Lavina is similar as well: experimental code written to use RTX technology.

            However, this doesn't apply to Lumion; it is using regular boards. In a way I kind of understand what's going on: they have a realtime engine based on a game engine similar to Unreal (probably something that ended up getting killed in the fight between Unreal and Unity) and then a clever rendering algorithm that somehow enhances the viewport by adding layers of AO, GI, etc. What I don't understand is why Vray GPU can't do something similar. Vlado has said that offline and GPU will converge, but is this a third track?

            I think it is amazing to have this real-time viewport environment to quickly set up your scene, etc., but the biggie is these incredibly fast renders. Could we get a 'Lumion' mode in the renderer?
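
            To make that a bit more concrete, here is roughly what I imagine that "enhancement" step looks like, as a tiny numpy sketch. The layer names and the simple multiply/add combine are pure guesses on my part for illustration, not anything from Lumion:

            import numpy as np

            H, W = 4, 4  # tiny frame just for the example

            # Stand-in buffers a realtime engine could produce cheaply every frame.
            raster_beauty = np.full((H, W, 3), 0.6)                   # direct lighting from the rasterizer
            ao_layer      = np.random.uniform(0.7, 1.0, (H, W, 1))    # screen-space AO term
            indirect_gi   = np.random.uniform(0.0, 0.15, (H, W, 3))   # approximate bounce light

            def enhance(beauty, ao, gi):
                # Assumed combine: darken by AO, then add the indirect contribution.
                return np.clip(beauty * ao + gi, 0.0, 1.0)

            final = enhance(raster_beauty, ao_layer, indirect_gi)
            print(final.shape, float(final.min()), float(final.max()))

            If something like that is all the "render" stage adds on top of the raster pass, it would explain why it finishes in minutes rather than hours.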



            • #7
              The biggest producers of animated CGI movies, like Pixar, Weta and Disney (Avatar, Vaiana, etc.):
              I think they also use shortcuts; it's hard to believe one frame takes hours on a huge render farm.
              Maybe they also use algorithms or do realtime rendering to speed things up. Of course they won't share those "secrets"
              with the competition, like the Hyperion renderer.

              Here is also an article I just found about Avatar, which implies it isn't just RenderMan being used:
              https://www.nvidia.com/object/wetadigital_avatar.html
              Last edited by Robert Everhardt; 12-11-2018, 04:32 PM.



              • #8
                Originally posted by Robert Everhardt
                Even Pixar, who have a massive internal render farm, say that some frames on their animated movies still take an hour to compute.
                hahah XD
                I read an interview with a Pixar guy. He was saying an average frame for Inside Out took 13 hours.
                But they don't really care, because they're rendering thousands of frames at once.
                An hour would feel almost like realtime for big studios like Pixar and ILM.
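
                Back-of-the-envelope, with numbers I'm just assuming: even at 13 hours per frame, a farm rendering a couple of thousand frames in parallel finishes one every few seconds on average.

                # Toy throughput math, all numbers assumed for illustration.
                frames_in_flight = 2000          # frames rendering in parallel on the farm
                hours_per_frame = 13
                seconds_per_finished_frame = hours_per_frame * 3600 / frames_in_flight
                print(seconds_per_finished_frame)  # ~23 seconds per completed frame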
                German guy, sorry for my English.



                • #9
                  You can look at it from two sides. There is the cinematic side and the archviz side. I think for the archviz side, things are almost real-time with the correct hardware. I know a builder who uses Enscape and it is real-time.
                  Last edited by glorybound; 12-11-2018, 04:52 PM.
                  Bobby Parker
                  www.bobby-parker.com



                  • #10
                    The difference is that Enscape looks really good to most people, but you clearly feel the result would be mediocre next to the same scene done in Vray; some of the stuff they are showing from Lumion, though, seems to hold up much better. Could the snow scene above be that much better in Vray? If the answer from most of us is no, then it is starting to become complicated. :/



                    • #11
                      Online rendering will replace offline rendering eventually, period. Nvidia already introduced hybrid rendering with their RTX lineup, combining rasterization with ray tracing and delivering the best of both worlds. Beyond that, the future will provide us with real-time path tracing. With highly advanced denoising and upscaling algorithms already in development, it's rather clear where this is going.

                      I see a future Vray version with native support for the DXR / RTX rendering path.
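
                      To sketch what that hybrid path means in practice (toy numbers and a crude box-filter "denoiser" of my own, not actual DXR/RTX code): rasterize primary visibility, estimate the ray-traced part with a few noisy samples per pixel, then denoise before compositing.

                      import numpy as np

                      H, W, SPP = 64, 64, 4          # only a few samples per pixel, as realtime budgets allow
                      rng = np.random.default_rng(0)

                      raster_direct = np.full((H, W), 0.5)   # cheap rasterized direct lighting

                      # Pretend ray-traced term: the true value is 0.2 everywhere,
                      # but each sample is noisy, like a low-sample path tracer.
                      noisy_rt = (0.2 + rng.normal(0.0, 0.1, (SPP, H, W))).mean(axis=0)

                      def box_denoise(img, radius=2):
                          # Very crude spatial denoiser: average over a (2r+1)^2 neighbourhood.
                          padded = np.pad(img, radius, mode="edge")
                          out = np.zeros_like(img)
                          k = 2 * radius + 1
                          for dy in range(k):
                              for dx in range(k):
                                  out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                          return out / (k * k)

                      final = np.clip(raster_direct + box_denoise(noisy_rt), 0.0, 1.0)
                      print("noise before:", noisy_rt.std().round(3), "after:", box_denoise(noisy_rt).std().round(3))

                      The real denoisers are of course far smarter than a box filter, but the division of labour is the same.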



                      • #12
                        https://www.youtube.com/watch?v=Gu6z6kIukgg this is the correct video. My god, it looks quite good for almost realtime. Those rain shots are insane.
                        A.

                        ---------------------
                        www.digitaltwins.be



                        • #13
                          Originally posted by Vizioen
                          https://www.youtube.com/watch?v=Gu6z6kIukgg this is the correct video. My god, it looks quite good for almost realtime. Those rain shots are insane.
                          I've looked at more of the videos on YouTube and I'm not a fan of the interface, and there doesn't seem to be a live link to 3ds Max, but just being able to slap on rain like that is scary. I think we are seeing the future of architectural viz here, and I'm sure Chaos is paying attention; I would love to be a fly on the wall in their conference room.

                          I get the part about Lumion being based on a game engine, so all the realtime effects and the viewport environment are one thing, but what I don't get is how they can render an impressive end result like that in minutes, if not seconds. I'm sure some of the more hardcore technical users here feel 'well duh, it's obvious, they've switched off this and that', but how could we do the same in Vray GPU? Something like a 'draft' mode? I would much rather control a scene like the snow scene above directly in Max if I had that kind of rendering speed. Or is this a function of lightmap baking, etc., so that our closest route to something like this is to use Unreal?
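
                          For what I mean by lightmap baking, here is a toy sketch (all names and numbers made up; real engines bake far more sophisticated data): spend lots of samples per texel once, offline, and then the realtime view only does lookups.

                          import numpy as np

                          rng = np.random.default_rng(1)
                          LM_RES = 32          # lightmap resolution for one surface
                          BAKE_SAMPLES = 256   # plenty of samples are fine offline, too slow per frame

                          def fake_incoming_radiance(u, v, rng):
                              # Stand-in for tracing one hemisphere sample from texel (u, v).
                              return 0.3 + 0.5 * u + rng.normal(0.0, 0.05)

                          # Offline bake: average many samples per texel into the lightmap.
                          lightmap = np.zeros((LM_RES, LM_RES))
                          for i in range(LM_RES):
                              for j in range(LM_RES):
                                  u, v = i / LM_RES, j / LM_RES
                                  lightmap[i, j] = np.mean(
                                      [fake_incoming_radiance(u, v, rng) for _ in range(BAKE_SAMPLES)]
                                  )

                          # Runtime: shading a point is just a texture lookup, no tracing at all.
                          def shade(u, v, albedo=0.8):
                              i, j = int(u * (LM_RES - 1)), int(v * (LM_RES - 1))
                              return albedo * lightmap[i, j]

                          print(round(shade(0.1, 0.5), 3), round(shade(0.9, 0.5), 3))

                          If Lumion leans on something like this, it would explain the fast stills but also why a 3ds Max live link matters less to them.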



                          • #14
                            It's really impressive, there's no mistaking it.
                            And they're *very* good at doing demos.
                            A glimpse of where they may be cutting corners is hidden in some of the comments on the videos, but it detracts nothing from the overall result: impressive, period.

                            We are paying attention; you can rest assured we know about most of what happens around us, but we aren't exactly weaponless ourselves.
                            We just walk a different path (hey, here's a pun for you!), which may in the end really converge to (near) realtime rendering.
                            Have any of you seen Lavina in the flesh, rather than online, and on a single 2080 Ti?
                            With interiors too? ^^

                            Lele
                            Trouble Stirrer in RnD @ Chaos
                            ----------------------
                            emanuele.lecchi@chaos.com

                            Disclaimer:
                            The views and opinions expressed here are my own and do not represent those of Chaos Group, unless otherwise stated.



                            • #15
                              I have not, but I'm looking forward to it, although it sounds as if it will take a while given its technology-demo status. Sounds like DXR is about to make life hard for CUDA. It would be wonderful to have a Lavina IPR in Vray.

