Lytro announces Immerge VR system

  • Lytro announces Immerge VR system

    http://cgpress.org/archives/lytro-announces-immerge-vr-system.html

    Any news on being able to render out all that data with Vray, and how to play it back?

    Stan
    3LP Team

  • #2
    The guys from Nozon have a system for that too; they built their own camera rig as well, though it's only stills, not motion. It should be very possible to do with Vray, since the technique shares a few ideas with the shade map. Rendering motion for it could be a huge pain in the ass though, since you're rendering multiple cameras from spherically offset positions, so it comes down to how many camera positions would be enough to cover you. The shade map should cut down the calculations a lot, but I'd say it'd still be nasty!
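    A minimal sketch of the "cameras from spherically offset positions" part, just to make the setup concrete (Python; the camera count and the 10 cm rig radius are made-up values, not Lytro's or Nozon's actual layout):

    [CODE]
    import math

    def sphere_rig_positions(n, radius, center=(0.0, 0.0, 0.0)):
        """Distribute n camera positions roughly evenly over a sphere of the
        given radius, using a Fibonacci spiral."""
        golden = math.pi * (3.0 - math.sqrt(5.0))   # golden angle in radians
        cams = []
        for i in range(n):
            y = 1.0 - 2.0 * (i + 0.5) / n           # height sweeps from ~1 down to ~-1
            r = math.sqrt(max(0.0, 1.0 - y * y))    # slice radius at that height
            theta = golden * i
            cams.append((center[0] + radius * r * math.cos(theta),
                         center[1] + radius * y,
                         center[2] + radius * r * math.sin(theta)))
        return cams

    # e.g. 64 render cameras on a 10 cm sphere around the viewer position
    rig = sphere_rig_positions(64, 0.10)
    [/CODE]

    The open question is exactly the one above: how big n needs to be before the coverage is good enough.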

    • #3
      Originally posted by joconnell View Post
      The guys from Nozon have a system for that too, they built their own camera rig too though it's only stills not motion. Should be very possible to do with vray, the technique shares a few ideas with the shade map. Rendering motion for it could be a huge pain in the ass though since you're rendering multiple cameras from spherically offset positions so it's a case of how many different camera positions would be enough to cover you. The shade map should cut down calculations a lot but I'd say it'd still be nasty!
      Yes, as for Nozon, it seems they use some sort of point-cloud scene, but the parallax seems to be pretty limited.
      We spoke about all this on the forum a few months ago; I was just wondering whether CG had made any progress in this field yet.

      Stan
      3LP Team

      • #4
        The Lytro rig is pretty much the same: the amount of parallax is tied to the radius of the sphere your camera rig covers, unfortunately.
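        To put a rough number on that: the parallax you can ever reproduce is capped by the angle the rig sphere subtends at the object. A back-of-the-envelope sketch (the 15 cm radius is an arbitrary example, not a published Lytro spec):

        [CODE]
        import math

        def max_parallax_deg(rig_radius_m, object_distance_m):
            """Approximate angular parallax (in degrees) between two viewpoints on
            opposite sides of the rig sphere, for a point straight ahead."""
            return math.degrees(2.0 * math.atan(rig_radius_m / object_distance_m))

        print(max_parallax_deg(0.15, 1.0))   # ~17 degrees for an object at 1 m
        print(max_parallax_deg(0.15, 10.0))  # ~1.7 degrees at 10 m; it falls off fast
        [/CODE]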

        • #5
          Yep, sure, but that's all in "real" life. I was wondering what Vray has planned, or has to offer, that would be similar.
          I'm just guessing we have fewer limitations in full 3D, don't we?

          Stan
          3LP Team

          • #6
            Originally posted by 3LP View Post
            Yep sure, that's all in "real" life. I was wondering what Vray has planned or has to offer that would be similar.
            Our only plan is to support output to the most popular of these new formats (the ones that survive, anyways).

            Best regards,
            Vlado
            I only act like I know everything, Rogers.

            • #7
              There's also Otoy's light fields; that's similar too. They said they've got it working with a 15 ft by 15 ft light field using the Vive's Lighthouse tracking and a wireless Gear VR.

              We have a client whose budget for doing this is a blank chequebook, and we can't find anyone who's willing to let them use the tech... they just keep telling us how great their approach is and how well it works in their office.

              • #8
                Originally posted by Neilg View Post
                We have a client who's budget for doing this is a blank chequebook and we cant find anyone who's willing to let them use the tech... they just keep telling us how great their approach is and how well it works in their office.
                That's because I'm convinced OTOY is a scam. A friend of a friend worked there with an advanced degree in rendering... they put him to work keeping their website updated, and he never wrote a line of code. When he tried to quit because he wasn't actually doing any rendering, they sued him over his non-compete. They released "Real Time Raytracing" with AMD... which turned out to be real-time compositing of raytraced renders. They announced a partnership with Autodesk, until Autodesk's lawyers told them to can it; all they had actually done was go to a meeting and watch a presentation, not sign any agreement. When the heat was ratcheting up, because they had zero products and no IP but had burned millions of gullible investor dollars, they bought Octane. When Octane hit a roadblock, they bought Brigade. They have loads of cash, but you can't just keep buying yourself out of an IP hole.

                It should be noted that this lightfield capture/rendering is different from the shade map. It's not just rendering a point cloud; it's rendering image vector data off of a plane or sphere. The advantage is that you get proper specularity etc. from shifting vantage points, not just a dense diffuse point cloud like the shade map. Theoretically, with a sufficiently advanced lightfield volume (or plane), you could refocus between the reflection, refraction and diffuse even though the 'depth' would be identical.
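                A minimal sketch of the two-plane light-field idea, with toy data and a nearest-neighbour lookup; this is only to illustrate why the speculars shift with the viewpoint, and has nothing to do with Otoy's or Lytro's actual formats:

                [CODE]
                import numpy as np

                # Toy two-plane light field: radiance L(u, v, s, t), with (u, v) on a
                # camera plane at z = 0 and (s, t) on a parallel focal plane at z = 1,
                # both parameterised over [0, 1]^2. Real data would come from the rig;
                # here it's random noise.
                U = V = S = T = 16
                field = np.random.rand(U, V, S, T, 3).astype(np.float32)

                def sample(eye, direction):
                    """Nearest-neighbour lookup of the radiance along the ray that starts
                    at 'eye' (behind the camera plane) and travels along 'direction'."""
                    e, d = np.asarray(eye, float), np.asarray(direction, float)
                    u, v = e[:2] + d[:2] * (0.0 - e[2]) / d[2]   # hit on the z = 0 plane
                    s, t = e[:2] + d[:2] * (1.0 - e[2]) / d[2]   # hit on the z = 1 plane
                    iu = int(np.clip(u * U, 0, U - 1))
                    iv = int(np.clip(v * V, 0, V - 1))
                    i_s = int(np.clip(s * S, 0, S - 1))
                    i_t = int(np.clip(t * T, 0, T - 1))
                    return field[iu, iv, i_s, i_t]

                # Two eyes looking at the same focal-plane point (0.5, 0.5, 1.0) land on
                # different (u, v) samples; that's where the view-dependent speculars live.
                print(sample((0.2, 0.5, -1.0), (0.3, 0.0, 2.0)))
                print(sample((0.8, 0.5, -1.0), (-0.3, 0.0, 2.0)))
                [/CODE]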
                Last edited by im.thatoneguy; 10-11-2015, 04:38 PM.
                Gavin Greenwalt
                im.thatoneguy[at]gmail.com || Gavin[at]SFStudios.com
                Straightface Studios

                • #9
                  Disappointing to hear but I appreciate the insider knowledge.

                  For six months we've been getting blanked by every company that claims to be able to do light fields/proper VR. It's madness. Super-enthusiastic responses, NDAs signed so we can talk about the client and the project, we get the client looped in and the first couple of phone calls go well, then we try to get a date penned in where we can meet up and all sit down to figure out how to move forward, and then it's nothing, complete silence.
                  We're just trying to make the introduction and, if need be, hand over the scene files from the stereoscopic 360s we've done so far.

                  • #10
                    Originally posted by im.thatoneguy View Post
                    It should be noted that this lightfield capture/rendering is different from shademap.
                    Yep, I get that; it's more the principle that cameras can share rays from other camera views if they're going along the same vector. I'd imagine the shade map could be used to optimise render times somehow, instead of you having to render "n" cameras in a sphere from scratch.
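                    Just to illustrate the principle (and not how V-Ray's shade map actually works internally): if the view-independent part of the shading is cached by hit point, each extra rig camera only pays for the rays that land on surface points nobody has shaded yet. shade_fn below is a hypothetical stand-in for the expensive shading call.

                    [CODE]
                    # Toy shading cache keyed by a quantised hit point.
                    CELL = 0.01  # cache cell size in scene units (arbitrary)
                    _cache = {}

                    def shade_point(hit_point, shade_fn):
                        """Return cached shading for hit_point, computing it once via
                        shade_fn (a placeholder for the expensive diffuse/GI work)."""
                        key = tuple(round(c / CELL) for c in hit_point)
                        if key not in _cache:
                            _cache[key] = shade_fn(hit_point)  # first camera pays for it
                        return _cache[key]                     # the other cameras reuse it
                    [/CODE]

                    The view-dependent terms (speculars, reflections) would still need per-camera work, of course.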

                    • #11
                      Originally posted by vlado View Post
                      Our only plan is to support output to the most popular of these new formats (the ones that survive, anyways). Best regards, Vlado
                      I understand.
                      I'm just wondering how you'll assess the "ones that survive".
                      Does that mean you'll sit back and watch the game until there are only a few left? It would be a shame not to jump into the battle.
                      Vray is top notch in a lot of areas and could easily be top notch in this one as well. Even if it's a proprietary format like vrimg that gets converted at some point to whichever major formats survive later on, at least we wouldn't be waiting three years to see how it turns out.
                      There is plenty of work out there that could benefit from this; it's a bit sad we can't exploit it because we're just waiting for time to pass by.

                      Stan
                      3LP Team

                      • #12
                        I imagine that once there's a working video player for this format that is actually real and available, getting Vray to export to that format won't take that long.

                        Right now I can think of five light field/volume/whatever formats off the top of my head, none of which actually exist in any tangible form outside of heavily edited videos on the web.

                        • #13
                          I've just seen the Otoy lightfield presentations. It seems like witchcraft. Apparently there is a fully navigable Batcave of one million cubic metres rendered as a lightfield, which, unfortunately, only seems to be shown as a series of screengrabs.

                          If this tech actually works as claimed, it seems like the dream solution for VR rendering?

                          I'm surprised Chaos aren't all over it... or am I missing something?

                          • #14
                            I don't really trust anything that Otoy says; there have been too many discrepancies between what they say and what they do.

                            Other than that, we are only interested in VR as far as it relates to rendering. We don't have the resources to dedicate to researching VR technologies as such. However, we are open to cooperation with other companies that do, like Nozon and Lytro.

                            Best regards,
                            Vlado
                            I only act like I know everything, Rogers.

                            • #15
                              I appreciate wanting to keep things "rendering only"... However, VR aside, if I understand this tech (which I don't), it would allow fully rendered, navigable environments to be created, which could be displayed on modest hardware at high frame rates. This has uses way beyond VR and seems like an extremely powerful way of using a rendering engine. It also seems that inside the render engine is the only place it belongs?
