Gelato from Nvidia... oooooooohh!!!

  • Gelato from Nvidia... oooooooohh!!!

    http://film.nvidia.com/page/gelato.html

    GPU-accelerated rendering and shading language.

    Cinema-Quality Images

    Gelato has been developed by adhering to a fundamental principle: never compromise on the quality of the final rendered image. As a result, Gelato delivers images that meet the most rigorous demands of the film industry.

    Key to this doctrine of no compromises is the nature of how Gelato uses the NVIDIA Quadro FX GPU. Instead of just using the native 3D engine in the GPU, as done in games, Gelato also uses the hardware as a second floating point processor. This allows Gelato to use hardware acceleration without being bounded by the limits of the 3D engine. Gelato is one of the first in a wave of software applications that use the GPU as an off-line processor, a "floating-point supercomputer on a chip," and not simply to manage what is displayed on the screen.
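The GPU-as-coprocessor pattern described above amounts to packing per-sample inputs into flat float arrays ("textures"), running one small kernel over every element at once, and reading the result back. A minimal sketch of that data-parallel shape, using NumPy arrays purely as a stand-in for GPU texture memory (no real GPU code, and nothing Gelato-specific):

```python
import numpy as np

# GPGPU-style pattern (illustrative): per-sample inputs live in flat
# float arrays, one math kernel runs over every element at once, and
# the result is read back. On a 2004-era GPU the kernel would be a
# fragment shader; here NumPy stands in for the hardware.

def shade_kernel(n_dot_l, albedo):
    # A trivial Lambertian "shader" applied to every sample in parallel.
    return albedo * np.clip(n_dot_l, 0.0, 1.0)

# One "texel" per surface sample.
n_dot_l = np.array([1.0, 0.5, -0.2, 0.25])
albedo = np.array([0.8, 0.8, 0.8, 0.8])

result = shade_kernel(n_dot_l, albedo)
print(result)  # negative N·L clamps to 0 -> [0.8, 0.4, 0.0, 0.2]
```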

    Gelato includes:

    * Fast anti-aliasing
    * True displacement
    * Motion blur
    * Layered shaders
    * Full-range of geometric primitives
    * NURBS, bicubic, and bilinear patches
    * Polygon meshes
    * Subdivision surfaces
    * Hair
    * Particles
    * Plug-in API for procedural geometry
    * Automatic adaptive tessellation
    * Both scan-line and ray-traced rendering
    * Global illumination
    * Ambient occlusion
    * Image output can be 8-bit, 16-bit, or floating point (per channel)
    * Output image channels for any value computed in shaders

  • #2
    what he said
    ____________________________________

    "Sometimes life leaves a hundred dollar bill on your dresser, and you don't realize until later that it's because it fu**ed you."



    • #3
      There is no information on workflow or mechanics anywhere on the site.
      No screenshots of the GUI or shading system, or lighting controls for that matter. Does anyone know of someone who has experience with this system? I'd like to know more about its ropes in production and not some Nvidia PR garble.
      Signing out,
      Christian



      • #4
        Here is a hint of what it will do.
        Eric Boer
        Dev



        • #5
          A complete guess on my part, but I think it is not a "real" program, but a shading language that is programmed to work with the GPU, sort of like the RIB format. It would be up to a third party to write a scene parser in the 3D package of choice to take advantage of it. Again... just a guess.
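That guess matches how RenderMan-style pipelines are wired up: the renderer consumes a scene description, and each 3D package needs an exporter that walks the scene graph and emits it. A toy sketch of such a translator; the output format below is invented for illustration and is not real RIB or Gelato syntax:

```python
# Toy scene exporter in the spirit of a RIB/Gelato translator: walk the
# host package's scene data and emit a renderer-neutral description.
# The output format here is made up for illustration only -- it is NOT
# the real Gelato (or RIB) syntax.

def export_scene(objects):
    """Translate a list of (name, shader, vertices) tuples into
    scene-description lines a standalone renderer could parse."""
    lines = ["BeginScene"]
    for name, shader, verts in objects:
        lines.append(f'  Object "{name}"')
        lines.append(f'    Shader "{shader}"')
        lines.append("    Mesh " + " ".join(f"{v:g}" for v in verts))
    lines.append("EndScene")
    return "\n".join(lines)

scene = [("ground", "matte", [0, 0, 0, 1, 0, 0, 1, 0, 1]),
         ("ball", "plastic", [0.5, 1, 0.5])]
print(export_scene(scene))
```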



          • #6
            Looks too complicated for me :P

            Oh, and I don't have a Quadro card

            -Matt



            • #7
              Gelato includes:

              Cherry Flavor*
              Strawberry Flavor*
              Lemon-Lime*
              True Displacement

              *Artificially Flavored

              ...btw - wtf is true displacement? Is there something more true than general displacement? Sounds like crazy wording to me...
              LunarStudio Architectural Renderings
              HDRSource HDR & sIBL Libraries
              Lunarlog - LunarStudio and HDRSource Blog
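On the "true displacement" question: it usually means the renderer actually moves surface points along the normal (so silhouettes and shadows change), as opposed to bump or normal mapping, which leaves the geometry alone and only perturbs the shading normal. A minimal 2D sketch of the difference (hypothetical helper functions, not any renderer's API):

```python
import math

# "True" displacement vs. bump mapping, in 2D: displacement actually
# moves the surface point along its normal (silhouettes and shadows
# change); bump mapping leaves the point alone and only fakes the
# lighting by tilting the shading normal.

def displace(point, normal, height):
    # True displacement: the geometry really moves.
    return tuple(p + n * height for p, n in zip(point, normal))

def bump(point, normal, height_gradient):
    # Bump mapping: the point is untouched; only the normal is perturbed.
    nx, ny = normal[0] - height_gradient, normal[1]
    length = math.hypot(nx, ny)
    return point, (nx / length, ny / length)

p, n = (1.0, 0.0), (0.0, 1.0)
moved = displace(p, n, 0.25)        # the point actually moves
same_p, tilted_n = bump(p, n, 0.5)  # point unchanged, normal tilted
print(moved, same_p, tilted_n)
```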



              • #8
                It only works on Linux at the moment... no good for us VRayers...

                (no good for my Quadro4 either...)
                when the going gets weird, the weird turn pro - hunter s. thompson



                • #9
                  Personally, I think this is much bigger news than you think. We are talking about moving rendering from the CPU to the GPU. The GPU can raytrace a LOT faster than the CPU can. It can raytrace a LOT faster than VRay can, I would say on the order of 100x. This is NOT a RenderDrive solution, which is very slow. This will be the future of rendering in a very short time. It is rumored that a lot of rendering engines are moving over to GPU raytracing, including RenderMan... maybe even some familiar friends like mental ray and Brazil...



                  • #10
                    I totally agree with cpnichols.
                    With the release of Shader Model 3.0 graphics hardware, its advanced branching controls, and full floating-point precision, raytracing on the GPU will get very interesting for many rendering companies. The next Unreal engine introduces realtime raytracing effects, and although complete GI rendering will still need some time, it will get significantly faster when the renderer supports GPU acceleration. Good to see that unused processing power is finally available for offline rendering.
                    Really curious how this affects the future development of Vray.
                    Sascha Geddert
                    www.geddart.de
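The branching point is the crux for raytracing on a GPU: each ray runs a loop whose exit is data-dependent (hit found, or the ray leaves the scene), which is exactly the per-fragment control flow Shader Model 3.0 added. A small CPU-side sketch of such a data-dependent loop, here a sphere-tracing march (illustrative only):

```python
import math

# Why dynamic branching matters for GPU raytracing: each ray runs a
# loop whose trip count depends on its own data (march until a hit, or
# give up). Shader Model 3.0 added this per-fragment control flow;
# earlier GPUs had to run every iteration for every pixel.

def raymarch_sphere(origin, direction, center, radius, max_steps=64):
    """March a ray toward a sphere; the loop exits as soon as this
    particular ray hits (a data-dependent branch)."""
    t = 0.0
    for _ in range(max_steps):
        p = [o + d * t for o, d in zip(origin, direction)]
        dist = math.dist(p, center) - radius  # signed distance to sphere
        if dist < 1e-4:                       # hit: branch out early
            return t
        t += dist                             # safe step forward
    return None                               # ray missed the sphere

hit_t = raymarch_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
miss = raymarch_sphere((0, 0, 0), (0, 1, 0), (0, 0, 5), 1.0)
print(hit_t, miss)  # roughly 4.0 and None
```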



                    • #11
                      Well, the guys at Worley have been doing this a long time with G2 and now FPrime.
                      Nobody has seemed to make much fuss about it apart from a few die-hard fans. Anyway, a shading language, like for RenderMan and Cg, is not for people who can't program. Like using Slim for PRMan: you can make lots of standard shading trees manually, but you don't really get anywhere before you start coding them and tapping into their real power. I have a hunch the pricing of Gelato somehow signifies its intended user base, a lot like PRMan. Overpriced if you ask me, considering you have to build your own tools if you are serious.
                      Signing out,
                      Christian



                      • #12
                        Hey trixian - you're wrong. G2 and FPrime are totally CPU-based. There is no GPU acceleration. Gelato is really the first GPU-based renderer out there, and you can be sure there will be more in the future.
                        And how do you know it is overpriced? A large company that can use GPUs as render slaves now might see this a little differently, and we don't even know how fast it is. Even if it's overpriced - even if it needs a bunch of coders to work right - it's still pretty exciting, because it's a glimpse of the things to come.
                        Sascha Geddert
                        www.geddart.de



                        • #13
                          Hmm... I was convinced FPrime used the OpenGL card to handle a lot of its information; guess I was mistaken. (That makes FPrime a pretty fast renderer if it's just a standard CPU-based renderer.)
                          As for the price (Gelato) being bloated, that is just my assumption at the moment, since there is no known program to use it in, and from my prior experience with the cost of resources in a production pipeline using PRMan (I reckon these two products share a lot of similarities regarding workflow). Very expensive in labour resources, licensing, and render time (needs an extensive farm to be effective). That is why small studios tend to stay away from PRMan and larger studios embrace it. RenderMan is like a very big and powerful machine that won't work properly if you don't throw enough fuel and money at it. Plus you need a few engineers to get it working optimally.

                          My 2 cents anyway...
                          Signing out,
                          Christian



                          • #14
                            Hehe. Just saw mental images' announcement on mental ray 3.3 with GPU support. Guess it's kinda more tangible now. Being shipped with Maya 6 and XSI 4.0.
                            Vlado is maybe getting a headache, with the imminent release of v1.5?
                            Signing out,
                            Christian



                            • #15
                              Personally, I think this is much bigger news than you think. We are talking about moving rendering from the CPU to the GPU. The GPU can raytrace a LOT faster than the CPU can. It can raytrace a LOT faster than VRay can, I would say on the order of 100x. This is NOT a RenderDrive solution, which is very slow. This will be the future of rendering in a very short time. It is rumored that a lot of rendering engines are moving over to GPU raytracing, including RenderMan... maybe even some familiar friends like mental ray and Brazil...
                              Yep, I even think that in a few years we will be modeling RT scenes for a realtime engine, and rendering will be something old-fashioned.
                              Have a look at Far Cry! Even the editor looks like Max.

                              I remember Intergraph's RenderGL, which tried to use the graphics card to render Phong faces. I also wouldn't wonder if LightWave 8 will have a much stronger hardware renderer.
                              www.cgtechniques.com | http://www.hdrlabs.com - home of hdri knowledge

