Inverse Lens Camera Quadscopic Concept and Format


  • Inverse Lens Camera Quadscopic Concept and Format

    I've been thinking about the problems with 3D viewing. One of the biggest limitations is that, when viewing a screen, we are currently limited to "side-by-side" effects to produce a sense of depth. This doesn't really match our true vision, which also factors in the Z-axis: looking up and down in conjunction with side-to-side.

    I speculate that this is one of the main reasons people experience nausea when using glasses to view an image.

    For example, if you've ever experienced sea sickness, it's often the up-and-down motion of a boat on the waves that causes an equilibrium imbalance. Experienced sailors know, however, that a quick way to restore balance is to keep your focus on the horizon.

    By allowing Z-axis quad cameras, you let the viewer look up and down as well as side to side.

    Not only would this relieve nausea, it would also provide a more accurate and immersive 3D experience.
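As a purely illustrative sketch (not from the original post), the extra depth cue a quad rig would capture can be estimated with a simple pinhole model: disparity is focal length times baseline over depth, applied independently to the horizontal and vertical baselines. All numbers below are assumptions:

```python
def disparity(f_px: float, baseline_m: float, depth_m: float) -> float:
    """Pinhole-model disparity in pixels: f * B / Z."""
    return f_px * baseline_m / depth_m

# Hypothetical 2x2 rig: one horizontal and one vertical baseline.
F_PX = 1400.0        # focal length in pixels (assumed)
H_BASELINE = 0.065   # horizontal camera spacing in meters (~interocular)
V_BASELINE = 0.065   # vertical camera spacing in meters (assumed)

for depth in (1.0, 2.0, 5.0):
    dx = disparity(F_PX, H_BASELINE, depth)
    dy = disparity(F_PX, V_BASELINE, depth)
    print(f"depth {depth:.0f} m: horizontal {dx:.1f} px, vertical {dy:.1f} px")
```

With equal baselines, the vertical parallax matches the horizontal one; that vertical component is exactly what a stereo pair discards and a quad arrangement would record.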

    [Image: inverse-lens-camera.jpg]

    A new type of camera "lens" would need to be created - similar to a fisheye, but inverted, so that it continuously surrounds the subject across the hemisphere on the viewer's side.
    LunarStudio Architectural Renderings
    HDRSource HDR & sIBL Libraries
    Lunarlog - LunarStudio and HDRSource Blog

  • #2
    Originally posted by jujubee View Post
    Our true vision also factors in the Z-axis - looking up and down in conjunction with side-by-side.
    Interesting thinking.
    Our eyes are side by side, though, so I can't see how a quad rig would help.

    I think nausea is caused by other factors: frame rate, left/right images not matching properly, and sometimes trying to focus on a subject that has blur while your eye tries to compensate.
    Too much interocular distance also plays with your brain as it tries to compensate.
    Where you sit plays a big part too. I try to get dead center in row 5/6, so that the screen is just out of my peripheral vision, and find this is the sweet spot, for me at least.



    • #3
      Well, depth does work largely with side-by-side parallax. But I have no doubt that looking slightly up and down shifts objects slightly and factors into balance. It's natural to think of depth as purely side-by-side, but we do use and process both eyes for up-and-down movement as well; otherwise we would be able to move one eye up and down independently of the other, sort of like some chameleons.

      You are definitely correct that there is a sweet spot at the center of where a 3D movie was filmed. Being off-center (left, right, above, or below) causes skewing that makes our brains compensate. I don't think my idea is a complete solution by any means, but it's a step in a direction.

      The other thing is that a new camera lens with a concave shape (like a backwards camera lens) would let you look around half of an object. The main application would be capturing that extra data for devices with gyroscopes and accelerometers, such as the iPhone, Droid, etc., so the view shifts as you tilt the device. Why limit ourselves to one axis?
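One way the tilt-to-view idea could be sketched (with an invented grid layout and angular step; nothing here comes from a real device API): snap the device's pitch/yaw, as reported by the gyroscope/accelerometer, to the nearest precaptured view on a hemisphere grid.

```python
def nearest_view(pitch_deg: float, yaw_deg: float, step_deg: int = 15) -> tuple:
    """Snap a device orientation to the nearest precaptured view on a
    hypothetical hemisphere grid of views spaced step_deg apart."""
    p = round(pitch_deg / step_deg) * step_deg
    y = round(yaw_deg / step_deg) * step_deg
    p = max(-90, min(90, p))   # clamp pitch to the captured hemisphere
    return p, y % 360          # yaw wraps around

print(nearest_view(23.0, -40.0))   # nearest stored view: (30, 315)
```

Each stored view would then just be an image indexed by that (pitch, yaw) pair, so tilting the phone swaps in the matching capture rather than re-rendering anything.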

      My apologies if this belongs in the wishlist section - I realize the idea is a bit much, but I recently wanted to create something "3D" for my phone that I could view from all angles, versus being limited to stereoscopy or having to employ a game engine. It would be great if the data could be stored in one simple place.

      It's like the difference between a bump map and a proper displacement map.
      Last edited by jujubee; 22-09-2010, 10:26 AM.



      • #4
        Adobe is already working on something in the same vein:

        http://www.crunchgear.com/2010/09/22...n-your-images/

        It would seem some sort of eye or head tracking would be involved, since there is no other way to tell when the viewer is looking up or down.
        Ben Steinert
        pb2ae.com



        • #5
          there is no other way to distinguish when the viewer is looking up or down.
          I would imagine there would have to be a way to tilt forwards and backwards, even if it's very subtle (small degrees of inclination). I wouldn't think our eyes are always level/horizontal; our brain just prefers to process them that way.

          Another way to think of it: if you lie down to sleep on your side (an extreme example in this case), do you lose your sense of depth perception?

          Adobe is already working on something in the same vein:
          Interesting. At least I'm not completely stupid or crazy, lol. Perhaps stupid for posting it here. Lots of little prisms in a lens did occur to me, but the resolution has to be insane to resolve the image clearly. The other thing is that I think it needs to be radial rather than checkered as they show, with larger rectangles (or circles) in the middle and smaller ones toward the edges to mimic the shape of the cornea. In fact, it wouldn't have to be rectangles at all; I'd think you could get away with concentric circles.
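The concentric-circle layout described above could be sketched as a generator of element centers: one large central element, then rings of progressively more (hence smaller) elements, with the ring spacing shrinking toward the rim. The counts and falloff factor are arbitrary illustration values, not a real optical design.

```python
import math

def ring_layout(n_rings: int, base_radius: float, falloff: float = 0.8):
    """Centers of hypothetical lens elements in concentric rings:
    coarse in the middle, progressively finer toward the edge."""
    points = [(0.0, 0.0)]                      # single central element
    r = base_radius
    for ring in range(1, n_rings + 1):
        n = 6 * ring                           # more elements per outer ring
        for k in range(n):
            theta = 2 * math.pi * k / n
            points.append((r * math.cos(theta), r * math.sin(theta)))
        r += base_radius * falloff ** ring     # rings pack tighter outward
    return points

pts = ring_layout(3, 1.0)
print(len(pts))   # 1 + 6 + 12 + 18 = 37 element centers
```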
          Last edited by jujubee; 24-09-2010, 02:24 AM.
